On Fri, Aug 5, 2011 at 2:56 AM, huyaoyu <span dir="ltr"><<a href="mailto:huyaoyu1986@gmail.com">huyaoyu1986@gmail.com</a>></span> wrote:<br><div class="gmail_quote"><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex;">
Satish,<br>
<br>
Yes, PETSc outputs the Fortran warnings when I invoke "make test". As you<br>
said, I will just ignore them. But it also says "Error detected<br>
during compile or link".<br>
<br>
=========PETSc output while testing=======<br>
huyaoyu@ubuntu:~/Downloads/petsc-3.1-p8$ make<br>
PETSC_DIR=/home/huyaoyu/Downloads/petsc-3.1-p8<br>
PETSC_ARCH=linux-gnu-c-debug test<br>
Running test examples to verify correct installation<br>
C/C++ example src/snes/examples/tutorials/ex19 run successfully with 1<br>
MPI process<br>
C/C++ example src/snes/examples/tutorials/ex19 run successfully with 2<br>
MPI processes<br>
--------------Error detected during compile or<br>
link!-----------------------<br>
... OTHER STUFF AND FORTRAN WARNINGS...<br>
Fortran example src/snes/examples/tutorials/ex5f run successfully with 1<br>
MPI process<br>
Completed test examples<br>
=========End of PETSc output while testing==<br>
<br>
So I will just ignore this. Is that OK?<br></blockquote><div><br></div><div>Yes, it is just flagging that warning.</div><div><br></div><div> Matt</div><div> </div><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex;">
Thanks!<br>
<br>
HuYaoyu<br>
<br>
> > Thanks to Satish Balay and Jed Brown. Your advice was very helpful!<br>
> ><br>
> > I removed the petsc and deal.II directories and rebuilt PETSc and<br>
> > deal.II against the system-wide MPI installation. The configure lines<br>
> > are as follows:<br>
> ><br>
> > For PETSc:<br>
> > ./config/configure.py --with-cc=/usr/bin/mpicc --with-fc=/usr/bin/mpif90<br>
> > --download-f-blas-lapack=1 --with-shared<br>
> ><br>
> > For deal.II:<br>
> > ./configure --enable-shared --disable-threads --with-petsc=$PETSC_DIR<br>
> > --with-petsc-arch=$PETSC_ARCH --with-p4est=PATH-TO-P4EST --with-mpi<br>
> ><br>
> > I will use p4est for grid distribution, and I checked the configure<br>
> > output of deal.II. It says that deal.II will use /usr/bin/mpicc for the<br>
> > CC variable and /usr/bin/mpiCC for the CXX variable. deal.II also reports:<br>
> > =================deal.II configure output=========================<br>
> > checking for PETSc library<br>
> > directory... /home/huyaoyu/Downloads/petsc-3.1-p8<br>
> > checking for PETSc version... 3.1.0<br>
> > checking for PETSc library architecture... linux-gnu-c-debug<br>
> > checking for PETSc libmpiuni library... not found<br>
> > checking for consistency of PETSc and deal.II MPI settings... yes<br>
> > checking for PETSc scalar complex... no<br>
> > ================= end of deal.II configure output=================<br>
> ><br>
> > After compiling PETSc and deal.II, I compiled the deal.II example<br>
> > program that uses PETSc, and everything works well. No more<br>
> > segmentation faults! Great!<br>
> ><br>
> > However, the PETSc tests trigger the same error when running with 2<br>
> > MPI processes. Is it because I am using PETSc on a single machine<br>
> > rather than on a cluster?<br>
><br>
> You mean the compiler warning for the Fortran example in 'make test'?<br>
><br>
> You can ignore that.<br>
><br>
> Satish<br>
<br>
<br>
<br>
</blockquote></div><br><br clear="all"><br>-- <br>What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.<br>-- Norbert Wiener<br>