[Darshan-users] h5py python application error : Darshan failed to map symbol: H5get_libversion

Shane Snyder ssnyder at mcs.anl.gov
Wed Apr 19 18:14:11 CDT 2017


Hi Pramod,

The DARSHAN_EXCLUDE_DIRS mechanism you found is not currently part of 
the Darshan master branch -- the documentation there is for a fork of 
Darshan being managed independently at ECMWF. I have asked the developer 
who manages it to contribute it back upstream to Darshan master, so 
hopefully we can include it in our next release. Being able to tell 
Darshan to filter out files in certain directories at runtime seems 
pretty useful in general, so it will be nice to have that mechanism.
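
For reference, the ECMWF documentation describes DARSHAN_EXCLUDE_DIRS as an 
environment variable listing directory prefixes that the runtime should skip. 
A hypothetical invocation (the paths are placeholders, and the exact separator 
and semantics are defined by that fork rather than by upstream Darshan) would 
look roughly like:

    export DARSHAN_EXCLUDE_DIRS=/usr/lib/python3.5,/path/to/venv/lib
    LD_PRELOAD=/path/to/libdarshan.so python3 app.py

Until the feature is merged, upstream Darshan simply ignores that variable.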

In the meantime, your temporary workaround seems like the best bet. I'm 
glad it wasn't too much trouble for you to figure out how to manually 
hack Darshan to get this working.
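
For anyone else who needs the same stopgap, the hack amounts to extending the 
static exclusion list in darshan-runtime/lib/darshan-core.c and rebuilding. A 
minimal sketch is below; the array name and default entries vary between 
Darshan versions, and the Python path is only a placeholder:

    /* darshan-runtime/lib/darshan-core.c (sketch only) */
    char *darshan_path_exclusions[] = {
        "/etc/",
        "/dev/",
        "/usr/",
        "/proc/",
        "/sys/",
        /* added locally: skip the Python installation so .pyc reads do
         * not consume Darshan's per-module file-record budget */
        "/path/to/python/site-packages/",
        NULL
    };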

Thanks,
--Shane

On 04/18/2017 02:30 PM, pramod kumbhar wrote:
> Hi Shane,
>
> Thanks for the information. That was most likely the case, because Python
> was loading lots of modules (.pyc files) at runtime and I saw many
> read-only files listed by darshan-parser --file-list.
>
> I was trying to exclude the directories under PYTHONPATH and came
> across DARSHAN_EXCLUDE_DIRS, documented here 
> <https://software.ecmwf.int/wiki/display/UDOC/How+to+use+Darshan+to+profile+IO#HowtouseDarshantoprofileIO-excludedir>.
> But it doesn't work, and I don't see any reference to it in the source
> code either.
>
> As a quick workaround, in my local installation I hard-coded directory 
> paths in darshan-runtime/lib/darshan-core.c.
> I wonder what the proper option is to exclude or include specific
> directories.
>
> Regards,
> Pramod
>
>
> On Thu, Apr 13, 2017 at 7:31 PM, Shane Snyder <ssnyder at mcs.anl.gov> wrote:
>
>     Do you know how many files this application is creating?
>
>     By default, Darshan should stop tracking new file records on a
>     specific process after it has either tracked more than 1,024 files
>     for a given module or exhausted all of its allocated memory.
>     This is to keep its memory footprint bounded. The warning message
>     you get from darshan-job-summary seems to imply this is what's
>     going on, but if your application isn't opening that many files,
>     maybe something else is wrong?
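>
>     If the file count really is the limiting factor, the memory budget can
>     usually be raised when building darshan-runtime -- for example (the
>     option name may differ between versions, so check ./configure --help):
>
>         ./configure --with-mod-mem=8 ...
>
>     which, where supported, increases the per-process memory (in MiB) that
>     instrumentation modules can use for records.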
>
>     Thanks,
>     --Shane
>
>
>     On 04/09/2017 10:49 AM, pramod kumbhar wrote:
>>     It seems like the HDF5 library is not loaded into memory and hence
>>     dlsym() is failing?
>>     I tried LD_PRELOAD-ing both the Darshan and HDF5 libraries, and
>>     the above example works fine.
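>>
>>     For the record, the preload looked roughly like this (the library
>>     paths are placeholders for my installation):
>>
>>         LD_PRELOAD=/path/to/libdarshan.so:/path/to/libhdf5.so python3 test.py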
>>
>>     darshan-parser --file-list shows the HDF5 file, and then I use
>>     darshan-convert --file to get the profile of the specific HDF5 file
>>     that I am interested in.
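>>
>>     In other words, something like this (the log file name and record id
>>     are placeholders):
>>
>>         darshan-parser --file-list job.darshan      # find the record id of data.h5
>>         darshan-convert --file <record_id> job.darshan data_h5.darshan
>>         darshan-parser data_h5.darshan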
>>
>>     But while profiling the actual application, I have another issue. In
>>     the profile of the large application, darshan-parser --file-list
>>     doesn't show the HDF5 file created by the application. What could
>>     be wrong?
>>
>>     The application uses scipy, numpy, Image, etc. Initially I thought
>>     importing the scipy modules was somehow causing the issue, but I
>>     am not able to reproduce it with a small isolated test.
>>
>>     Could someone provide any hints? (Note that the job_name.darshan
>>     log file is generated, but the PDF from darshan-job-summary.pl
>>     says "This Darshan log contains incomplete data which...").
>>
>>     Thanks in advance!
>>
>>     -Pramod
>>
>>     On Sun, Apr 9, 2017 at 1:02 PM, pramod kumbhar
>>     <pramod.s.kumbhar at gmail.com> wrote:
>>
>>         Hello All,
>>
>>         I have used Darshan for parallel C/C++ applications without
>>         issue, but this is my first time trying to analyse a Python (3)
>>         application. This application uses h5py.
>>
>>         I compiled Darshan with --enable-HDF5-pre-1.10
>>         (or --enable-HDF5-post-1.10).
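>>
>>         The full configure line was roughly as follows (the paths are
>>         specific to my system, and the HDF5 flag has to match the HDF5
>>         version that h5py links against):
>>
>>         ./configure CC=mpicc --with-log-path=/path/to/darshan-logs \
>>             --with-jobid-env=NONE --enable-HDF5-post-1.10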
>>
>>         When I now run a simple h5py test serially, I get:
>>
>>         $ python3 test.py
>>         Darshan failed to map symbol: H5get_libversion
>>
>>         If I don't enable HDF5 during configure, then I see profiles
>>         being generated.
>>         What am I missing here? Do I need a parallel h5py application?
>>         Any hints would be helpful!
>>
>>         Regards,
>>         Pramod
>>
>>         P.S. The h5py test program:
>>
>>         import h5py
>>         a = [1,2,3,4,5]
>>         h5f = h5py.File('data.h5', 'w')
>>         h5f.create_dataset('dataset_1', data=a)
>>         h5f.close()
>>
>>
>>
>>