It's resolved.

Actually, an older version of libxml2 was installed on the compute nodes, even though I had installed the matching libxml2 on the node where I launched mpirun. After installing the same libxml2 library on the compute nodes, the error went away.
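For anyone hitting the same thing: a quick way to catch this kind of mismatch before a run is to compare the installed libxml2 package on each compute node against the launch node. The sketch below is illustrative, not part of my actual setup — the host names and the use of `rpm -q` assume a Fedora-style cluster, and the helper function name is made up:

```shell
#!/bin/sh
# Sketch: flag nodes whose libxml2 package differs from the launch node's.
# check_mismatch LOCAL_VER REMOTE_VER HOST -> prints one line on mismatch
check_mismatch() {
    if [ "$2" != "$1" ]; then
        echo "MISMATCH on $3: $2 (launch node has $1)"
    fi
}

# In practice the versions would be gathered roughly like this
# (hosts and rpm usage are assumptions for a Fedora-style cluster):
#   local_ver=$(rpm -q libxml2)
#   for host in compute-0-0 compute-0-2; do
#       check_mismatch "$local_ver" "$(ssh "$host" rpm -q libxml2)" "$host"
#   done

# Demonstration with made-up version strings:
check_mismatch "libxml2-2.7.6-2.fc11" "libxml2-2.6.32-1.fc11" "compute-0-0"
```

Any line of output means a node needs its libxml2 reinstalled to match the launch node.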
Thanks,
- Chansup

On Tue, Dec 20, 2011 at 11:39 AM, CB <cbalways@gmail.com> wrote:
I built the MPICH2 package on a VM running Fedora Core 11 and installed it on a cluster of Dell servers running the same OS.

Both the VM and the Dell servers have the same version of the libxml2 library.

Do you need any other information?

Thanks,
- Chansup

On Tue, Dec 20, 2011 at 11:21 AM, Darius Buntinas <buntinas@mcs.anl.gov> wrote:

I'm not sure.  There might be something wrong with your libxml2 library.  What kind of machine are you using?

-d

On Dec 20, 2011, at 9:58 AM, CB wrote:
>
> When I run an example "hello" program on a node with two MPI processes, it prints warnings about missing version information, although the program runs successfully.
>
> How can I suppress these warnings?
>
> $ mpirun -np 2 -machinefile ./hostsfile ./hello-mpich2-c.exe
>
> /usr/local/MPI/mpich2/bin/hydra_pmi_proxy: /usr/lib64/libxml2.so.2: no version information available (required by /usr/local/MPI/mpich2/bin/hydra_pmi_proxy)
> /usr/local/MPI/mpich2/bin/hydra_pmi_proxy: /usr/lib64/libxml2.so.2: no version information available (required by /usr/local/MPI/mpich2/bin/hydra_pmi_proxy)
> /usr/local/MPI/mpich2/bin/hydra_pmi_proxy: /usr/lib64/libxml2.so.2: no version information available (required by /usr/local/MPI/mpich2/bin/hydra_pmi_proxy)
> /usr/local/MPI/mpich2/bin/hydra_pmi_proxy: /usr/lib64/libxml2.so.2: no version information available (required by /usr/local/MPI/mpich2/bin/hydra_pmi_proxy)
> Processor 0 on compute-0-0 out of 2
> Processor 1 on compute-0-2 out of 2
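[Editorial note: this warning comes from the dynamic linker. It appears when a binary was linked against a libxml2 build that defines versioned symbols, but the library found at run time (here /usr/lib64/libxml2.so.2) carries no version definitions. The paths below are taken from the output above; `readelf -V` (binutils) and `ldd` are the standard tools for inspecting this. The existence guards make the commands safe to paste on a machine where those paths are absent:]

```shell
#!/bin/sh
lib=/usr/lib64/libxml2.so.2                      # library named in the warning
bin=/usr/local/MPI/mpich2/bin/hydra_pmi_proxy    # binary that emits it

# A library built without versioned symbols has no "Version definition
# section:" block in its ELF version info, which is exactly what provokes
# "no version information available" at load time.
[ -e "$lib" ] && readelf -V "$lib" | grep -A2 'Version definition'

# Confirm which libxml2 the binary actually resolves at run time:
[ -e "$bin" ] && ldd "$bin" | grep libxml2

true  # paths are cluster-specific; if absent locally this prints nothing
```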
>
> Here is my build information:
>
> $ mpirun -info
> HYDRA build details:
>     Version:                                 1.4.1p1
>     Release Date:                            Thu Sep  1 13:53:02 CDT 2011
>     CC:                                      gcc
>     CXX:                                     c++
>     F77:                                     gfortran
>     F90:                                     f95
>     Configure options:                       '--prefix=/usr/local/MPI/mpich2-1.4.1p1' '--enable-shared' '--enable-checking=release' '--disable-option-checking' 'CC=gcc' 'CFLAGS= -O2' 'LDFLAGS= ' 'LIBS=-lrt -lpthread ' 'CPPFLAGS= -I/home/x/Documents/MPICH2/mpich2-1.4.1p1/src/mpl/include -I/home/x/Documents/MPICH2/mpich2-1.4.1p1/src/mpl/include -I/home/x/Documents/MPICH2/mpich2-1.4.1p1/src/openpa/src -I/home/x/Documents/MPICH2/mpich2-1.4.1p1/src/openpa/src -I/home/x/Documents/MPICH2/mpich2-1.4.1p1/src/mpid/ch3/include -I/home/x/Documents/MPICH2/mpich2-1.4.1p1/src/mpid/ch3/include -I/home/x/Documents/MPICH2/mpich2-1.4.1p1/src/mpid/common/datatype -I/home/x/Documents/MPICH2/mpich2-1.4.1p1/src/mpid/common/datatype -I/home/x/Documents/MPICH2/mpich2-1.4.1p1/src/mpid/common/locks -I/home/x/Documents/MPICH2/mpich2-1.4.1p1/src/mpid/common/locks -I/home/x/Documents/MPICH2/mpich2-1.4.1p1/src/mpid/ch3/channels/nemesis/include -I/home/x/Documents/MPICH2/mpich2-1.4.1p1/src/mpid/ch3/channels/nemesis/include -I/home/x/Documents/MPICH2/mpich2-1.4.1p1/src/mpid/ch3/channels/nemesis/nemesis/include -I/home/x/Documents/MPICH2/mpich2-1.4.1p1/src/mpid/ch3/channels/nemesis/nemesis/include -I/home/x/Documents/MPICH2/mpich2-1.4.1p1/src/mpid/ch3/channels/nemesis/nemesis/utils/monitor -I/home/x/Documents/MPICH2/mpich2-1.4.1p1/src/mpid/ch3/channels/nemesis/nemesis/utils/monitor -I/home/x/Documents/MPICH2/mpich2-1.4.1p1/src/util/wrappers -I/home/x/Documents/MPICH2/mpich2-1.4.1p1/src/util/wrappers'
>     Process Manager:                         pmi
>     Launchers available:                     ssh rsh fork slurm ll lsf sge manual persist
>     Topology libraries available:            hwloc plpa
>     Resource management kernels available:   user slurm ll lsf sge pbs
>     Checkpointing libraries available:
>     Demux engines available:                 poll select
>
> Thanks,
> - Chansup
>
> _______________________________________________
> mpich-discuss mailing list     mpich-discuss@mcs.anl.gov
> To manage subscription options or unsubscribe:
> https://lists.mcs.anl.gov/mailman/listinfo/mpich-discuss