On Mon, Sep 1, 2014 at 6:18 AM, Åsmund Ervik <asmund.ervik@ntnu.no> wrote:
>
> On 01. sep. 2014 12:51, Matthew Knepley wrote:
> > On Mon, Sep 1, 2014 at 3:22 AM, Åsmund Ervik <asmund.ervik@ntnu.no> wrote:
> >
> >> Subsequently I installed HDF5 v. 1.8.12 using my OS package manager and
> >> then tried to configure with "--with-hdf5
> >> --with-hdf5-dir=/path/to/system/hdf5" using PETSc 3.5.1. This time both
> >> configure and make were successful, but "make test" failed with loads of
> >> undefined references to PETSc's HDF5 symbols (see below). configure.log
> >> and make.log are attached for this case as well (error-with-hdf5.tar.gz).
> >>
> > It is clear from the log that it did not build
> > src/sys/classes/viewer/impls/hdf5/, although it should have. We have been
> > trying to understand why make misbehaves after a configuration failure;
> > the problem should go away with a make clean followed by another make.
>
> Thanks Matt: make clean followed by make did indeed result in make test
> passing all tests.
>
> However, I am still unable to use the HDF5 functionality. When I compile
> and run e.g. vec/vec/examples/tutorials/ex10.c, it works fine with binary
> output, but with HDF5 I get the error message below.
>
> I'm guessing this is some incompatibility between the various MPI, HDF5
> and PETSc versions, i.e. that the HDF5 from my OS is not built against
> the same MPI as PETSc. Is it simple to edit something in PETSc so that
> "./configure --download-hdf5" fetches the most recent HDF5 library that
> compiles on my machine?

Yes, I cannot reproduce this, so it must be something like that. Can you
reconfigure using the --download-hdf5 version?
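
In case it helps to see what is going on, the HDF5 path in ex10 boils down
to a handful of viewer calls. Below is a stripped-down sketch (paraphrased
from memory, not the verbatim example source; error checking via CHKERRQ
omitted):

  #include <petscviewerhdf5.h>

  int main(int argc, char **argv)
  {
    Vec         x;
    PetscViewer viewer;

    PetscInitialize(&argc, &argv, NULL, NULL);
    VecCreateSeq(PETSC_COMM_SELF, 20, &x);            /* 20 entries, as in your output */
    PetscObjectSetName((PetscObject) x, "Test_Vec");  /* becomes the HDF5 dataset name */
    PetscViewerHDF5Open(PETSC_COMM_WORLD, "vector.dat", FILE_MODE_WRITE, &viewer);
    VecView(x, viewer);                               /* writes x into vector.dat */
    PetscViewerDestroy(&viewer);
    VecDestroy(&x);
    PetscFinalize();
    return 0;
  }

Your run dies inside PetscViewerHDF5Open() before any data is written, so
the example code and the Vec are fine; the viewer cannot even create the
file.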

  Thanks,

     Matt

> Error message:
> $ ./ex10 -hdf5
> Vec Object:Test_Vec 1 MPI processes
>   type: seq
> 0
> 1
> 2
> 3
> 4
> 5
> 6
> 7
> 8
> 9
> 10
> 11
> 12
> 13
> 14
> 15
> 16
> 17
> 18
> 19
> writing vector in hdf5 to vector.dat ...
> [0]PETSC ERROR: #1 PetscViewerFileSetName_HDF5() line 81 in
> /opt/petsc/optim_gfortran/src/sys/classes/viewer/impls/hdf5/hdf5v.c
> [0]PETSC ERROR: #2 PetscViewerFileSetName() line 624 in
> /opt/petsc/optim_gfortran/src/sys/classes/viewer/impls/ascii/filev.c
> [0]PETSC ERROR: #3 PetscViewerHDF5Open() line 163 in
> /opt/petsc/optim_gfortran/src/sys/classes/viewer/impls/hdf5/hdf5v.c
> [0]PETSC ERROR: #4 main() line 66 in
> /opt/petsc/optim_gfortran/src/vec/vec/examples/tutorials/ex10.c
> [0]PETSC ERROR: ----------------End of Error Message -------send entire
> error message to petsc-maint@mcs.anl.gov----------
> application called MPI_Abort(MPI_COMM_WORLD, -1) - process 0
> [unset]: aborting job:
> application called MPI_Abort(MPI_COMM_WORLD, -1) - process 0
> HDF5: infinite loop closing library
>
> D,T,AC,FD,P,FD,P,FD,P,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD
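
The trailing "D,T,AC,FD,..." line is HDF5's shutdown report of interfaces
it could not close, which again points at the library rather than at your
code. When PETSc is built against a parallel HDF5, setting the viewer's
file name boils down to something along these lines (an illustrative
sketch, not the verbatim hdf5v.c source; the helper name is hypothetical,
and comm and name stand for the viewer's communicator and file name):

  #include <mpi.h>
  #include <hdf5.h>

  /* Illustrative only: create an HDF5 file the way the viewer does. */
  static hid_t CreateParallelFile(const char *name, MPI_Comm comm)
  {
    hid_t plist_id, file_id;
    plist_id = H5Pcreate(H5P_FILE_ACCESS);            /* file-access property list */
    H5Pset_fapl_mpio(plist_id, comm, MPI_INFO_NULL);  /* hand PETSc's MPI_Comm to HDF5 */
    file_id  = H5Fcreate(name, H5F_ACC_TRUNC, H5P_DEFAULT, plist_id);
    H5Pclose(plist_id);
    return file_id;
  }

If your system libhdf5 was compiled against a different MPI than PETSc,
the MPI_Comm handle passed to H5Pset_fapl_mpio() is meaningless to that
MPI, and the create fails exactly where your trace shows. Configuring with
--download-hdf5 avoids this by building HDF5 with the same MPI as PETSc.
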
>
> Regards,
> Åsmund

-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which
their experiments lead.
   -- Norbert Wiener