Hi,

I've tried to use the shared version of mpich2, which I installed separately (due to the problem above), with PETSc. The command is:

./config/configure.py --with-fc=/lsftmp/g0306332/inter/fc/bin/ifort --with-blas-lapack-dir=/lsftmp/g0306332/inter/mkl/ --with-mpi-dir=/lsftmp/g0306332/mpich2-l32 --with-x=0 --with-shared

During the test, this error message was shown:

gcc -c -fPIC -Wall -Wwrite-strings -g3 -I/nas/lsftmp/g0306332/petsc-2.3.2-p8 -I/nas/lsftmp/g0306332/petsc-2.3.2-p8/bmake/linux-mpich2 -I/nas/lsftmp/g0306332/petsc-2.3.2-p8/include -I/lsftmp/g0306332/mpich2-l32/include -D__SDIR__="src/snes/examples/tutorials/" ex19.c
gcc -fPIC -Wall -Wwrite-strings -g3 -o ex19 ex19.o -Wl,-rpath,/nas/lsftmp/g0306332/petsc-2.3.2-p8/lib/linux-mpich2 -L/nas/lsftmp/g0306332/petsc-2.3.2-p8/lib/linux-mpich2 -lpetscsnes -lpetscksp -lpetscdm -lpetscmat -lpetscvec -lpetsc -Wl,-rpath,/lsftmp/g0306332/mpich2-l32/lib -L/lsftmp/g0306332/mpich2-l32/lib -lmpich -lnsl -laio -lrt -Wl,-rpath,/lsftmp/g0306332/inter/mkl/lib/32 -L/lsftmp/g0306332/inter/mkl/lib/32 -lmkl_lapack -lmkl_def -lguide -lvml -lpthread -lm -Wl,-rpath,/usr/lib/gcc-lib/i386-pc-linux/3.2.3 -L/usr/lib/gcc-lib/i386-pc-linux/3.2.3 -Wl,-rpath,/usr/lib/gcc-lib/i386-pc-linux/3.2.3/../../.. -L/usr/lib/gcc-lib/i386-pc-linux/3.2.3/../../.. -ldl -lgcc_eh -Wl,-rpath,"/usr/lib/gcc-lib/i386-pc-linux/3.2.3" -Wl,-rpath,"/usr/lib/gcc-lib/i386-pc-linux/3.2.3/../../.." -Wl,-rpath,/lsftmp/g0306332/inter/fc/lib -L/lsftmp/g0306332/inter/fc/lib -Wl,-rpath,/usr/lib/gcc-lib/i386-pc-linux/3.2.3/ -L/usr/lib/gcc-lib/i386-pc-linux/3.2.3/ -Wl,-rpath,/usr/lib/gcc-lib/i386-pc-linux/3.2.3/../../../ -L/usr/lib/gcc-lib/i386-pc-linux/3.2.3/../../../ -lifport -lifcore -limf -lm -lipgo -lirc -lgcc_s -lirc_s -Wl,-rpath,/usr/lib/gcc-lib/i386-pc-linux/3.2.3 -Wl,-rpath,/usr/lib/gcc-lib/i386-pc-linux/3.2.3/../../.. -lm -Wl,-rpath,/usr/lib/gcc-lib/i386-pc-linux/3.2.3 -L/usr/lib/gcc-lib/i386-pc-linux/3.2.3 -Wl,-rpath,/usr/lib/gcc-lib/i386-pc-linux/3.2.3/../../.. -L/usr/lib/gcc-lib/i386-pc-linux/3.2.3/../../.. -ldl -lgcc_eh -ldl

/nas/lsftmp/g0306332/petsc-2.3.2-p8/lib/linux-mpich2/libpetsc.so: undefined reference to `mpi_conversion_fn_null_'
collect2: ld returned 1 exit status
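
From some searching, mpi_conversion_fn_null_ looks like a symbol that should come from MPICH2's Fortran bindings, so my guess (unverified) is that my separately installed mpich2-l32 was built without Fortran support, or that the Fortran binding library (libfmpich, if I have MPICH2's layout right) just isn't on the link line. Is something like this the right way to check whether any of the installed libraries actually defines it?

# list the dynamic symbols of every MPICH2 shared library and search
# for the Fortran binding symbol the linker reports as missing
nm -D /lsftmp/g0306332/mpich2-l32/lib/lib*.so | grep -i mpi_conversion_fn_null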

Then I tried my Fortran example.

I realised that if I compile using ifort (with a command similar to "make ex1f"), there is no problem, but the result showed that the 4 processors are running 4 *individual* codes.
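
To make the symptom concrete, a minimal rank-check program like the sketch below (plain MPI calls, nothing PETSc-specific; "rankcheck" is just my name for it) should tell the two cases apart: run under a working MPI with 4 processes it prints ranks 0 to 3 of size 4, while 4 individual serial copies would each print "process 0 of 1".

      program rankcheck
      implicit none
      include 'mpif.h'
      integer ierr, rank, nprocs
!     start MPI, then report this process's rank and the total count
      call MPI_Init(ierr)
      call MPI_Comm_rank(MPI_COMM_WORLD, rank, ierr)
      call MPI_Comm_size(MPI_COMM_WORLD, nprocs, ierr)
      print *, 'process', rank, 'of', nprocs
      call MPI_Finalize(ierr)
      end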

I then tried to use the mpif90 in the mpich2 directory. Compiling was OK, but during linking the error was:

/lsftmp/g0306332/mpich2-l32/bin/mpif90 -fPIC -g -o ex2f ex2f.o -Wl,-rpath,/nas/lsftmp/g0306332/petsc-2.3.2-p8/lib/linux-mpich2 -L/nas/lsftmp/g0306332/petsc-2.3.2-p8/lib/linux-mpich2 -lpetscksp -lpetscdm -lpetscmat -lpetscvec -lpetsc -Wl,-rpath,/lsftmp/g0306332/mpich2-l32/lib -L/lsftmp/g0306332/mpich2-l32/lib -lmpich -lnsl -laio -lrt -Wl,-rpath,/lsftmp/g0306332/inter/mkl/lib/32 -L/lsftmp/g0306332/inter/mkl/lib/32 -lmkl_lapack -lmkl_def -lguide -lvml -lpthread -lm -Wl,-rpath,/usr/lib/gcc-lib/i386-pc-linux/3.2.3 -L/usr/lib/gcc-lib/i386-pc-linux/3.2.3 -Wl,-rpath,/usr/lib/gcc-lib/i386-pc-linux/3.2.3/../../.. -L/usr/lib/gcc-lib/i386-pc-linux/3.2.3/../../.. -ldl -lgcc_eh -Wl,-rpath,"/usr/lib/gcc-lib/i386-pc-linux/3.2.3" -Wl,-rpath,"/usr/lib/gcc-lib/i386-pc-linux/3.2.3/../../.." -Wl,-rpath,/lsftmp/g0306332/inter/fc/lib -L/lsftmp/g0306332/inter/fc/lib -Wl,-rpath,/usr/lib/gcc-lib/i386-pc-linux/3.2.3/ -L/usr/lib/gcc-lib/i386-pc-linux/3.2.3/ -Wl,-rpath,/usr/lib/gcc-lib/i386-pc-linux/3.2.3/../../../ -L/usr/lib/gcc-lib/i386-pc-linux/3.2.3/../../../ -lifport -lifcore -limf -lm -lipgo -lirc -lgcc_s -lirc_s -Wl,-rpath,/usr/lib/gcc-lib/i386-pc-linux/3.2.3 -Wl,-rpath,/usr/lib/gcc-lib/i386-pc-linux/3.2.3/../../.. -lm -Wl,-rpath,/usr/lib/gcc-lib/i386-pc-linux/3.2.3 -L/usr/lib/gcc-lib/i386-pc-linux/3.2.3 -Wl,-rpath,/usr/lib/gcc-lib/i386-pc-linux/3.2.3/../../.. -L/usr/lib/gcc-lib/i386-pc-linux/3.2.3/../../.. -ldl -lgcc_eh -ldl
/lsftmp/g0306332/inter/fc/lib/libifcore.so.5: undefined reference to `__clog10q'
/lsftmp/g0306332/inter/fc/lib/libifcore.so.5: undefined reference to `__cexp10q'
/lsftmp/g0306332/inter/fc/lib/libifcore.so.5: undefined reference to `__csqrtq'
/lsftmp/g0306332/inter/fc/lib/libifcore.so.5: undefined reference to `__ccoshq'
/lsftmp/g0306332/inter/fc/lib/libifcore.so.5: undefined reference to `__ctanhq'
/lsftmp/g0306332/inter/fc/lib/libifcore.so.5: undefined reference to `__ccosq'
/lsftmp/g0306332/inter/fc/lib/libifcore.so.5: undefined reference to `__clogq'
/lsftmp/g0306332/inter/fc/lib/libifcore.so.5: undefined reference to `__csinhq'
/lsftmp/g0306332/inter/fc/lib/libifcore.so.5: undefined reference to `__ctanq'
/lsftmp/g0306332/inter/fc/lib/libifcore.so.5: undefined reference to `__cpowq'
/lsftmp/g0306332/inter/fc/lib/libifcore.so.5: undefined reference to `__exp10q'
/lsftmp/g0306332/inter/fc/lib/libifcore.so.5: undefined reference to `__cexpq'
/lsftmp/g0306332/inter/fc/lib/libifcore.so.5: undefined reference to `__cabsq'
/lsftmp/g0306332/inter/fc/lib/libifcore.so.5: undefined reference to `__fabsq'
/lsftmp/g0306332/inter/fc/lib/libifcore.so.5: undefined reference to `__csinq'
/lsftmp/g0306332/inter/fc/lib/libifcore.so.5: undefined reference to `__ldexpq'
/lsftmp/g0306332/inter/fc/lib/libifcore.so.5: undefined reference to `__frexpq'
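
Those __*q names look like quad-precision math entry points that this version of libifcore expects from Intel's own runtime libraries, so I suspect a mismatch between the compiler mpif90 was configured with and the ifort runtime in /lsftmp/g0306332/inter/fc that PETSc is linking. As far as I know MPICH2's wrappers support -show, so I plan to check what mpif90 actually invokes:

# print the underlying compile/link command without running it
/lsftmp/g0306332/mpich2-l32/bin/mpif90 -show

If that reveals a different Fortran compiler than my ifort, I suppose MPICH2 has to be rebuilt with the Intel compilers selected at configure time.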

So what is wrong? I am a novice in MPI, so I hope someone can give some advice. Thank you.

PS: Has the "make clean" on MPICH been updated in BuildSystem/config/packages/MPI.py?

On 1/13/07, Ben Tay <zonexo@gmail.com> wrote:
> Thanks Barry & Aron.
>
> I've tried to install mpich2 on a scratch directory and it finished in a
> short while.
>
> On 1/13/07, Barry Smith <bsmith@mcs.anl.gov> wrote:
> >
> > Ben,
> >
> > This is partially our fault. We never run "make clean" on MPICH after
> > the install so there are lots of .o and .a files lying around. I've
> > updated BuildSystem/config/packages/MPI.py to run make clean after the
> > install.
> >
> >    Barry
> >
> > On Sat, 13 Jan 2007, Aron Ahmadia wrote:
> >
> > > Hi Ben,
> > >
> > > My PETSc install on an OS X machine requires about 343 MB of space,
> > > about 209 MB of which is MPICH. Unfortunately this has the potential
> > > of exceeding 500 MB temporarily, I believe, as the make process
> > > generates a lot of object files during the software build.
> > >
> > > I think what you want to do is compile and install your own copy of
> > > MPICH (using a scratch directory or whatever tools you have at your
> > > disposal), then use the --with-mpi-dir=/location/to/mpich/install
> > > argument to configure.
> > >
> > > I've never staged a PETSc build on a machine with an extremely
> > > limited quota; the developers might have some suggestions on how to
> > > do this.
> > >
> > > ~A
> > >
> > > On 1/13/07, Ben Tay <zonexo@gmail.com> wrote:
> > > > Hi,
> > > >
> > > > I am trying to compile PETSc with MPI using --download-mpich=1 in
> > > > Linux. The command is
> > > >
> > > > ./config/configure.py --with-fc=/lsftmp/g0306332/inter/fc/bin/ifort
> > > > --with-blas-lapack-dir=/lsftmp/g0306332/inter/mkl/
> > > > --download-mpich=1 --with-x=0 --with-shared
> > > >
> > > > It displays:
> > > >
> > > > =================================================================
> > > > Running configure on MPICH; this may take several minutes
> > > > =================================================================
> > > > =================================================================
> > > > Running make on MPICH; this may take several minutes
> > > > =================================================================
> > > >
> > > > then it says disk quota exceeded. I've about 450 MB of free space,
> > > > which is all filled up when the error shows. May I know how much
> > > > disk space is required?
> > > >
> > > > Also, can I compile just mpich on a scratch directory and then move
> > > > it to the PETSc externalpackages directory? Or do I have to compile
> > > > everything (including PETSc) on a scratch directory and move it to
> > > > my directory?
> > > >
> > > > Thank you.