[Darshan-users] undefined reference to __wrap_printf

Phil Carns carns at mcs.anl.gov
Fri Jul 21 15:33:54 CDT 2017


Great news - I'm glad it's resolved!

On 07/21/2017 04:19 PM, Teng Wang wrote:
> Hi Phil,
>
> Thanks for the suggestion. I finally solved this issue
> by recompiling zlib.
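>
> In case it helps anyone else, a static zlib rebuild is roughly the
> sketch below (the version and install prefix are just placeholders,
> and darshan-runtime then has to be pointed at the resulting libz.a
> when it is reconfigured):
>
>     tar -xvzf zlib-1.2.11.tar.gz
>     cd zlib-1.2.11
>     ./configure --static --prefix=$HOME/software_install/zlib-static
>     make && make install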
>
> Teng
>
> On Fri, Jul 21, 2017 at 1:05 PM, Phil Carns <carns at mcs.anl.gov> wrote:
>
>     It might be helpful to compare the mpich configure/compile lines
>     between your two configurations too, in case there is some
>     difference in mpich libraries that is triggering an issue.
>
>     thanks,
>     -Phil
>
>
>     On 07/20/2017 05:35 PM, Shane Snyder wrote:
>>
>>     On 07/20/2017 04:19 PM, Shane Snyder wrote:
>>>     Hi Teng,
>>>
>>>     First of all, I actually just got darshan 3.1.4 to properly
>>>     generate a log when using static linking. I'm not sure what was
>>>     leading to the problem I had earlier, but after reconfiguring
>>>     and rebuilding the library, it works for me now. I also
>>>     confirmed the nightly tests were working correctly -- the log
>>>     files were going to another spot that I wasn't expecting, so I
>>>     missed them the first time around. You may want to try something
>>>     similar -- just rebuilding everything from scratch and trying
>>>     again. I've also run into this problem in the past whenever I
>>>     was accidentally pointing to an mpich build that defaults to
>>>     using shared libraries rather than the static ones, so that
>>>     could be another thing to check.
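>>>
>>>     One quick way to check which mpich build is being picked up
>>>     (the prefix path below is a placeholder):
>>>
>>>         which mpicc            # confirm it points at the intended install
>>>         mpicc -show            # print the underlying compile/link line
>>>         ls <mpich prefix>/lib  # check whether libmpi.a, libmpi.so, or both exist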
>>>
>>>     As for the error you are getting with Darshan 3.1.3, I've
>>>     actually never seen anyone hit that before. That error would be
>>>     due to the libz compression functions (deflate) returning some
>>>     sort of error. I'm not exactly sure what could cause that, but
>>>     is the application opening a really large number of files or
>>>     something like that? I suppose it's possible the compression
>>>     routines could run out of memory if there is too much log data
>>>     to compress, but I have never seen that happen before. If you
>>>     aren't already, it might be worth trying a test program that
>>>     does very basic I/O just to confirm that we can get Darshan
>>>     working. One that we frequently use is mpi-io-test -- there is
>>>     a version of this code in the darshan source @
>>>     darshan-test/regression/test-cases/src. You could try just
>>>     building that and running it (it doesn't take any command line
>>>     parameters) to see if that works.
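>>>
>>>     Roughly, assuming mpirun is on your PATH and using the wrapper
>>>     from your step 3:
>>>
>>>         mpicc.darshan darshan-test/regression/test-cases/src/mpi-io-test.c -o mpi-io-test
>>>         mpirun -np 4 ./mpi-io-test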
>>
>>     Oh, actually I just noticed in your emails that you were using
>>     mpi-io-test? That is pretty bizarre...
>>
>>     What system are you running this on? A cluster? Your own personal
>>     machine? If possible, you might try running somewhere else so we
>>     can confirm the issue is some problem with the environment on the
>>     system you have been using... I can't think of why the deflate
>>     routines would fail when running something simple like
>>     mpi-io-test. But if needed, we could probably get you a patch or
>>     something that tries to give us more details on the libz error
>>     you are seeing.
>>
>>     --Shane
>>
>>>
>>>     Thanks,
>>>     --Shane
>>>
>>>     On 07/20/2017 03:52 PM, Teng Wang wrote:
>>>>     Hi Shane,
>>>>
>>>>     Thanks for your help on this. I tried v3.1.3 and it compiled
>>>>     successfully. But when I ran the application, no *.log file was
>>>>     generated under the log directory, and I also got a warning from
>>>>     the application:
>>>>
>>>>     darshan library warning: unable to compress job data
>>>>
>>>>
>>>>     May I know how to fix this?
>>>>
>>>>     Thanks,
>>>>     Teng
>>>>
>>>>     On Thu, Jul 20, 2017 at 11:51 AM, Shane Snyder
>>>>     <ssnyder at mcs.anl.gov> wrote:
>>>>
>>>>         Hi Teng,
>>>>
>>>>         Thanks for reporting this. This is a really strange issue,
>>>>         but I can actually reproduce it, too. I'm going to keep
>>>>         digging and see if I can find the problem. Will keep you
>>>>         updated.
>>>>
>>>>         The strange thing is that I was able to use version 3.1.4
>>>>         successfully right before making the release available
>>>>         (i.e., the log @
>>>>         darshan-test/example-output/mpi-io-test-x86_64-3.1.4.darshan
>>>>         was generated using version 3.1.4 with static linking).
>>>>         Also strange is that our nightly testing is not catching
>>>>         this issue...
>>>>
>>>>         FYI, version 3.1.3 and earlier work fine for me if you
>>>>         really need a quick resolution -- version 3.1.4 just fixed
>>>>         a couple of non-critical bugs in some darshan log parsing
>>>>         utilities, so there shouldn't be much functional difference
>>>>         between the versions.
>>>>
>>>>         --Shane
>>>>
>>>>
>>>>         On 07/20/2017 12:43 PM, Teng Wang wrote:
>>>>>
>>>>>         Hi,
>>>>>
>>>>>
>>>>>         I'm having trouble statically linking Darshan with mpich 3.3
>>>>>         applications following the instructions here:
>>>>>
>>>>>         http://www.mcs.anl.gov/research/projects/darshan/docs/darshan-runtime.html
>>>>>
>>>>>         Could you give me any suggestions on how to fix it?
>>>>>
>>>>>
>>>>>         Here are my steps:
>>>>>
>>>>>         1. Compile and install darshan 3.1.4
>>>>>
>>>>>         tar -xvzf darshan-3.1.4.tar.gz
>>>>>
>>>>>         cd darshan-3.1.4/darshan-runtime
>>>>>
>>>>>         ./configure --with-mem-align=8 --with-log-path=<log
>>>>>         directory> --prefix=<install directory>
>>>>>         --with-jobid-env=PBS_JOBID --disable-cuserid CC=mpicc
>>>>>
>>>>>         make
>>>>>
>>>>>         make install
>>>>>
>>>>>
>>>>>         2. Create the log directory (this step succeeded)
>>>>>
>>>>>         darshan-mk-log-dirs.pl
>>>>>
>>>>>
>>>>>         3. Generate wrapper
>>>>>
>>>>>         darshan-gen-cc.pl `which mpicc` --output mpicc.darshan
>>>>>
>>>>>
>>>>>         4. Compile application (mpi-io-test.c) using mpicc.darshan
>>>>>
>>>>>         mpicc.darshan mpi-io-test.c -o mpi-io-test
>>>>>
>>>>>
>>>>>         After step 4, the following errors occurred:
>>>>>
>>>>>         initthread.c:(.text+0xc59d): undefined reference to `__wrap_fprintf'
>>>>>         initthread.c:(.text+0xc5bd): undefined reference to `__wrap_fputs'
>>>>>         /global/homes/t/user/software_install/mpich3.3a-static/lib/libmpi.a(lib_libmpi_la-finalize.o): In function `MPIR_Add_finalize':
>>>>>         finalize.c:(.text+0x226e): undefined reference to `__wrap_fputs'
>>>>>         finalize.c:(.text+0x227a): undefined reference to `__wrap_fflush'
>>>>>         finalize.c:(.text+0x2286): undefined reference to `__wrap_fflush'
>>>>>         /global/homes/t/user/software_install/mpich3.3a-static/lib/libmpi.a(lib_libmpi_la-util.o): In function `MPIDI_OFI_control_handler':
>>>>>         util.c:(.text+0x17bc4): undefined reference to `__wrap_fprintf'
>>>>>         /global/homes/t/user/software_install/mpich3.3a-static/lib/libmpi.a(lib_libmpi_la-ch4_globals.o): In function `MPID_Abort':
>>>>>         ch4_globals.c:(.text+0x38b): undefined reference to `__wrap_fputs'
>>>>>         ch4_globals.c:(.text+0x397): undefined reference to `__wrap_fflush'
>>>>>         ch4_globals.c:(.text+0x3a3): undefined reference to `__wrap_fflush'
>>>>>         /tmp/ccyntLoA.o: In function `main':
>>>>>         mpi-io-test.c:(.text+0x5ae): undefined reference to `__wrap_printf'
>>>>>         mpi-io-test.c:(.text+0x819): undefined reference to `__wrap_printf'
>>>>>         /global/homes/t/user/software_install/mpich3.3a-static/lib/libmpi.a(lib_libmpi_la-contextid.o): In function `MPIR_Free_contextid':
>>>>>         contextid.c:(.text+0x1c3f): undefined reference to `__wrap_fputs'
>>>>>         contextid.c:(.text+0x1c4b): undefined reference to `__wrap_fflush'
>>>>>         contextid.c:(.text+0x1c57): undefined reference to `__wrap_fflush'
>>>>>
>>>>>
>>>>>
>>>>>         mpicc.darshan -show mpi-io-test.c -o mpi-io-test gave:
>>>>>
>>>>>
>>>>>         gcc -L/usr/lib64/slurmpmi
>>>>>         -L/global/common/cori/software/libfabric/1.4.1/gnu/lib
>>>>>         mpi-io-test.c -o mpi-io-test
>>>>>         -L/global/homes/t/user/software_install/darshan/lib
>>>>>         -ldarshan -lz
>>>>>         -Wl,@/global/homes/t/user/software_install/darshan/share/ld-opts/darshan-base-ld-opts
>>>>>         -I/global/homes/t/user/software_install/mpich3.3a-static/include
>>>>>         -L/global/homes/t/user/software_install/mpich3.3a-static/lib
>>>>>         -lmpi -lpmi -lpmi -lpthread -lfabric -lrt -lpmi
>>>>>         -L/global/homes/t/user/software_install/darshan/lib
>>>>>         -Wl,--start-group -ldarshan -ldarshan-stubs
>>>>>         -Wl,--end-group -lz -lrt -lpthread
>>>>>
>>>>>
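>>>>>         (As I understand it, the __wrap_* names come from GNU ld's
>>>>>         --wrap option; the darshan-base-ld-opts file passed with -Wl,@
>>>>>         above is how that list of --wrap options reaches the linker.
>>>>>         A minimal sketch of the mechanism, not Darshan code:
>>>>>
>>>>>         /* wrap_demo.c -- hypothetical name, illustration only */
>>>>>         #include <stdio.h>
>>>>>
>>>>>         /* bound to the real puts by the linker when --wrap=puts is used */
>>>>>         int __real_puts(const char *s);
>>>>>
>>>>>         /* every call to puts elsewhere is redirected here */
>>>>>         int __wrap_puts(const char *s)
>>>>>         {
>>>>>             __real_puts("(intercepted)");
>>>>>             return __real_puts(s);
>>>>>         }
>>>>>
>>>>>         int main(void)
>>>>>         {
>>>>>             puts("hello");  /* becomes a call to __wrap_puts */
>>>>>             return 0;
>>>>>         }
>>>>>
>>>>>         Building with "gcc wrap_demo.c -Wl,--wrap=puts -o wrap_demo"
>>>>>         works; leaving the object that defines __wrap_puts out of the
>>>>>         link gives exactly this kind of "undefined reference to
>>>>>         `__wrap_puts'" error.)
>>>>>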
>>>>>         The strange thing is that when I run nm on libdarshan.a, it
>>>>>         shows the functions (e.g. __wrap_printf) are in fact defined
>>>>>         in libdarshan.a:
>>>>>
>>>>>
>>>>>         0000000000002bf0 T __wrap__IO_getc
>>>>>         0000000000002d80 T __wrap__IO_putc
>>>>>         00000000000030b0 T __wrap___isoc99_fscanf
>>>>>         0000000000001990 T __wrap_fclose
>>>>>         00000000000011d0 T __wrap_fdopen
>>>>>         0000000000001810 T __wrap_fflush
>>>>>         0000000000002a60 T __wrap_fgetc
>>>>>         0000000000003690 T __wrap_fgets
>>>>>         0000000000000db0 T __wrap_fopen
>>>>>         0000000000000fc0 T __wrap_fopen64
>>>>>         00000000000026b0 T __wrap_fprintf
>>>>>         0000000000001c80 T __wrap_fputc
>>>>>         0000000000001fa0 T __wrap_fputs
>>>>>         00000000000028d0 T __wrap_fread
>>>>>         00000000000013d0 T __wrap_freopen
>>>>>         00000000000015f0 T __wrap_freopen64
>>>>>         00000000000032d0 T __wrap_fscanf
>>>>>         0000000000003920 T __wrap_fseek
>>>>>         0000000000003a30 T __wrap_fseeko
>>>>>         0000000000003b40 T __wrap_fseeko64
>>>>>         0000000000003c50 T __wrap_fsetpos
>>>>>
>>>>>
>>>>>         Thanks,
>>>>>
>>>>>         Teng
