<div dir="ltr">I think I just found out what happened.<div>There is another mpi "openmpi" that already exists on my computer.</div><div>After I remove it then all back to normal.</div><div><br></div><div>Thanks for your assistance,</div><div><br></div><div>Tsung-Hsing Chen</div><div><br></div><br><div class="gmail_quote"><div class="gmail_attr" dir="ltr">Tsung-Hsing Chen <<a target="_blank" href="mailto:barrydog505@gmail.com">barrydog505@gmail.com</a>> 於 2020年2月26日 週三 下午6:21寫道:<br></div><blockquote style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex" class="gmail_quote"><div dir="ltr">Unfortunately, it still no work for me.<div>what I do is </div><div>first : ./configure --with-cc=gcc --with-cxx=g++ --with-fc=gfortran --download-mpich --download-fblaslapack</div><div>then make ......, and make check.</div><div>the output has shown that "C/C++ example src/snes/examples/tutorials/ex19 run successfully with 2 MPI processes".</div><div>last, I type "make -f gmakefile print VAR=MPIEXEC".</div><div><br></div><div>And I went running ex2, the problem still exists.</div><div>Is there needed to do anything else before I run ex2?</div><div>By the way, should I move to <a target="_blank" href="mailto:petsc-maint@mcs.anl.gov">petsc-maint@mcs.anl.gov</a> for the upcoming question?</div><div><br></div><br><div class="gmail_quote"><div class="gmail_attr" dir="ltr">Stefano Zampini <<a target="_blank" href="mailto:stefano.zampini@gmail.com">stefano.zampini@gmail.com</a>> 於 2020年2月26日 週三 下午4:50寫道:<br></div><blockquote style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex" class="gmail_quote"><div dir="ltr">First, make sure you compiled with support for MPI by running make check<div><br>[szampini@localhost petsc]$ make check<br>Running test examples to verify correct installation<br>Using PETSC_DIR=/home/szampini/Devel/jointinversion/pkgs/petsc and PETSC_ARCH=arch-joint<br>C/C++ example src/snes/examples/tutorials/ex19 run successfully with 1 MPI process<br>C/C++ example src/snes/examples/tutorials/ex19 run successfully with 2 MPI processes<br>C/C++ example src/snes/examples/tutorials/ex19 run successfully with hypre<br>C/C++ example src/snes/examples/tutorials/ex19 run successfully with mumps<br>Completed test examples<br><br>if you have the "2 MPI processes" output, then type</div><div><br>[szampini@localhost petsc]$ make -f gmakefile print VAR=MPIEXEC<br>mpiexec<br><br>For me, mpiexec is system-wide.<br></div></div><br><div class="gmail_quote"><div class="gmail_attr" dir="ltr">Il giorno mer 26 feb 2020 alle ore 11:38 Tsung-Hsing Chen <<a target="_blank" href="mailto:barrydog505@gmail.com">barrydog505@gmail.com</a>> ha scritto:<br></div><blockquote style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex" class="gmail_quote"><div dir="ltr">So, What should I do to use the correct mpiexec?<div>Am I configure petsc with the wrong way or something should be done?</div></div><br><div class="gmail_quote"><div class="gmail_attr" dir="ltr">Stefano Zampini <<a target="_blank" href="mailto:stefano.zampini@gmail.com">stefano.zampini@gmail.com</a>> 於 2020年2月26日 週三 下午4:26寫道:<br></div><blockquote style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex" class="gmail_quote"><div dir="ltr">This is what I get<div><br></div><div>[szampini@localhost tutorials]$ mpiexec -n 2 ./ex2 -ksp_monitor_short -m 5 -n 5 -ksp_gmres_cgs_refinement_type refine_always<br> 0 KSP Residual norm 2.73499 <br> 1 KSP 
On Wed, Feb 26, 2020 at 6:21 PM Tsung-Hsing Chen <barrydog505@gmail.com> wrote:

> Unfortunately, it still does not work for me. What I did:
>
> First: ./configure --with-cc=gcc --with-cxx=g++ --with-fc=gfortran --download-mpich --download-fblaslapack
> Then: make ..., followed by make check.
> The output showed "C/C++ example src/snes/examples/tutorials/ex19 run successfully with 2 MPI processes".
> Last, I typed "make -f gmakefile print VAR=MPIEXEC".
>
> Then I ran ex2 again, and the problem is still there.
> Is there anything else I need to do before I run ex2?
> By the way, should I move to petsc-maint@mcs.anl.gov for upcoming questions?
>
> On Wed, Feb 26, 2020 at 4:50 PM Stefano Zampini <stefano.zampini@gmail.com> wrote:
>
>> First, make sure you compiled with support for MPI by running make check:
>>
>> [szampini@localhost petsc]$ make check
>> Running test examples to verify correct installation
>> Using PETSC_DIR=/home/szampini/Devel/jointinversion/pkgs/petsc and PETSC_ARCH=arch-joint
>> C/C++ example src/snes/examples/tutorials/ex19 run successfully with 1 MPI process
>> C/C++ example src/snes/examples/tutorials/ex19 run successfully with 2 MPI processes
>> C/C++ example src/snes/examples/tutorials/ex19 run successfully with hypre
>> C/C++ example src/snes/examples/tutorials/ex19 run successfully with mumps
>> Completed test examples
>>
>> If you see the "2 MPI processes" output, then type:
>>
>> [szampini@localhost petsc]$ make -f gmakefile print VAR=MPIEXEC
>> mpiexec
>>
>> For me, mpiexec is system-wide.
>>
>> On Wed, Feb 26, 2020 at 11:38 AM Tsung-Hsing Chen <barrydog505@gmail.com> wrote:
>>
>>> So, what should I do to use the correct mpiexec?
>>> Did I configure PETSc the wrong way, or is there something else that should be done?
>>>
>>> On Wed, Feb 26, 2020 at 4:26 PM Stefano Zampini <stefano.zampini@gmail.com> wrote:
>>>
>>>> This is what I get:
>>>>
>>>> [szampini@localhost tutorials]$ mpiexec -n 2 ./ex2 -ksp_monitor_short -m 5 -n 5 -ksp_gmres_cgs_refinement_type refine_always
>>>>   0 KSP Residual norm 2.73499
>>>>   1 KSP Residual norm 0.795482
>>>>   2 KSP Residual norm 0.261984
>>>>   3 KSP Residual norm 0.0752998
>>>>   4 KSP Residual norm 0.0230031
>>>>   5 KSP Residual norm 0.00521255
>>>>   6 KSP Residual norm 0.00145783
>>>>   7 KSP Residual norm 0.000277319
>>>> Norm of error 0.000292349 iterations 7
>>>>
>>>> When I run it sequentially, I get (the same output as yours):
>>>>
>>>> [szampini@localhost tutorials]$ mpiexec -n 1 ./ex2 -ksp_monitor_short -m 5 -n 5 -ksp_gmres_cgs_refinement_type refine_always
>>>>   0 KSP Residual norm 3.21109
>>>>   1 KSP Residual norm 0.93268
>>>>   2 KSP Residual norm 0.103515
>>>>   3 KSP Residual norm 0.00787798
>>>>   4 KSP Residual norm 0.000387275
>>>> Norm of error 0.000392701 iterations 4
>>>>
>>>> This means you are using the wrong mpiexec.
>>>>
>>>> On Wed, Feb 26, 2020 at 11:17 AM Tsung-Hsing Chen <barrydog505@gmail.com> wrote:
>>>>
>>>>> Hi,
>>>>>
>>>>> I tried to run the example in ksp/examples/tutorials/ex2.
>>>>> I ran it with: mpiexec -n 2 ./ex2 -ksp_monitor_short -m 5 -n 5 -ksp_gmres_cgs_refinement_type refine_always
>>>>>
>>>>> The output is:
>>>>>   0 KSP Residual norm 3.21109
>>>>>   1 KSP Residual norm 0.93268
>>>>>   2 KSP Residual norm 0.103515
>>>>>   3 KSP Residual norm 0.00787798
>>>>>   4 KSP Residual norm 0.000387275
>>>>> Norm of error 0.000392701 iterations 4
>>>>>   0 KSP Residual norm 3.21109
>>>>>   1 KSP Residual norm 0.93268
>>>>>   2 KSP Residual norm 0.103515
>>>>>   3 KSP Residual norm 0.00787798
>>>>>   4 KSP Residual norm 0.000387275
>>>>> Norm of error 0.000392701 iterations 4
>>>>>
>>>>> My output (above) is the expected output in ksp/examples/tutorials/output/ex2_4.out printed twice.
>>>>> Is this the correct result?
>>>>>
>>>>> Thanks in advance,
>>>>>
>>>>> Tsung-Hsing Chen
>>>>
>>>> --
>>>> Stefano
>>
>> --
>> Stefano
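Following up on the "wrong mpiexec" diagnosis above, here is a minimal sketch of how one might run ex2 with the launcher that matches the PETSc build. It assumes the --download-mpich configuration used in this thread and that PETSC_DIR and PETSC_ARCH are exported; the paths are illustrative, not verified here.

    # --download-mpich installs a matching launcher inside the PETSc tree;
    # calling it by full path avoids picking up a system-wide OpenMPI on PATH.
    # Run from the directory where ex2 was built:
    $PETSC_DIR/$PETSC_ARCH/bin/mpiexec -n 2 ./ex2 -ksp_monitor_short -m 5 -n 5 \
        -ksp_gmres_cgs_refinement_type refine_always

Equivalently, the launcher PETSc expects can be queried with "make -f gmakefile print VAR=MPIEXEC", as shown above. Run this way, the two ranks cooperate on a single solve and print one 7-iteration history, instead of two independent 4-iteration histories.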