[petsc-users] The question of the output from ksp/ex2.c

Tsung-Hsing Chen barrydog505 at gmail.com
Wed Feb 26 04:59:31 CST 2020


I think I just found out what happened.
There was another MPI, OpenMPI, already installed on my computer.
After I removed it, everything is back to normal.
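
For anyone who hits the same symptom: a minimal, PETSc-independent rank
check (sketched below; the file name rankcheck.c is arbitrary) can confirm
whether an mpiexec matches the MPI library a binary was linked against.
With a matching launcher, "-n 2" prints ranks 0 and 1 of 2; with a
mismatched one, each process initializes as an independent singleton and
prints "rank 0 of 1" twice, the same duplication seen with ex2.

#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
  int rank, size;
  MPI_Init(&argc, &argv);
  MPI_Comm_rank(MPI_COMM_WORLD, &rank);  /* rank of this process */
  MPI_Comm_size(MPI_COMM_WORLD, &size);  /* number of processes launched together */
  printf("rank %d of %d\n", rank, size);
  MPI_Finalize();
  return 0;
}

Build and launch it with the MPICH that --download-mpich installed
(assuming PETSC_DIR and PETSC_ARCH are set):

  $PETSC_DIR/$PETSC_ARCH/bin/mpicc rankcheck.c -o rankcheck
  $PETSC_DIR/$PETSC_ARCH/bin/mpiexec -n 2 ./rankcheck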

Thanks for your assistance,

Tsung-Hsing Chen


On Wed, Feb 26, 2020 at 6:21 PM Tsung-Hsing Chen <barrydog505 at gmail.com> wrote:

> Unfortunately, it still does not work for me.
> What I did:
> first: ./configure --with-cc=gcc --with-cxx=g++ --with-fc=gfortran
> --download-mpich --download-fblaslapack
> then make, and make check.
> The output showed "C/C++ example src/snes/examples/tutorials/ex19
> run successfully with 2 MPI processes".
> Last, I typed "make -f gmakefile print VAR=MPIEXEC".
>
> Then I ran ex2 again, and the problem still exists.
> Do I need to do anything else before running ex2?
> By the way, should I move to petsc-maint at mcs.anl.gov for follow-up
> questions?
>
>
> On Wed, Feb 26, 2020 at 4:50 PM Stefano Zampini <stefano.zampini at gmail.com> wrote:
>
>> First, make sure PETSc was built with MPI support by running make check:
>>
>> [szampini at localhost petsc]$ make check
>> Running test examples to verify correct installation
>> Using PETSC_DIR=/home/szampini/Devel/jointinversion/pkgs/petsc and
>> PETSC_ARCH=arch-joint
>> C/C++ example src/snes/examples/tutorials/ex19 run successfully with 1
>> MPI process
>> C/C++ example src/snes/examples/tutorials/ex19 run successfully with 2
>> MPI processes
>> C/C++ example src/snes/examples/tutorials/ex19 run successfully with hypre
>> C/C++ example src/snes/examples/tutorials/ex19 run successfully with mumps
>> Completed test examples
>>
>> If you see the "2 MPI processes" output, then type
>>
>> [szampini at localhost petsc]$ make -f gmakefile print VAR=MPIEXEC
>> mpiexec
>>
>> For me, mpiexec is system-wide.
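>>
>> (If PETSc was configured with --download-mpich, the matching launcher is
>> typically the one PETSc built under the arch directory, e.g.
>>
>> $PETSC_DIR/$PETSC_ARCH/bin/mpiexec -n 2 ./ex2
>>
>> with PETSC_DIR and PETSC_ARCH set to the values shown by make check.)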
>>
>> On Wed, Feb 26, 2020 at 11:38 AM Tsung-Hsing Chen <
>> barrydog505 at gmail.com> wrote:
>>
>>> So, what should I do to use the correct mpiexec?
>>> Did I configure PETSc the wrong way, or is something else needed?
>>>
>>> On Wed, Feb 26, 2020 at 4:26 PM Stefano Zampini <stefano.zampini at gmail.com> wrote:
>>>
>>>> This is what I get
>>>>
>>>> [szampini at localhost tutorials]$ mpiexec -n 2 ./ex2 -ksp_monitor_short
>>>> -m 5 -n 5 -ksp_gmres_cgs_refinement_type refine_always
>>>>   0 KSP Residual norm 2.73499
>>>>   1 KSP Residual norm 0.795482
>>>>   2 KSP Residual norm 0.261984
>>>>   3 KSP Residual norm 0.0752998
>>>>   4 KSP Residual norm 0.0230031
>>>>   5 KSP Residual norm 0.00521255
>>>>   6 KSP Residual norm 0.00145783
>>>>   7 KSP Residual norm 0.000277319
>>>> Norm of error 0.000292349 iterations 7
>>>>
>>>> When I run it sequentially, I get the same output as yours:
>>>>
>>>> [szampini at localhost tutorials]$ mpiexec -n 1 ./ex2 -ksp_monitor_short
>>>> -m 5 -n 5 -ksp_gmres_cgs_refinement_type refine_always
>>>>   0 KSP Residual norm 3.21109
>>>>   1 KSP Residual norm 0.93268
>>>>   2 KSP Residual norm 0.103515
>>>>   3 KSP Residual norm 0.00787798
>>>>   4 KSP Residual norm 0.000387275
>>>> Norm of error 0.000392701 iterations 4
>>>>
>>>> This means you are using the wrong mpiexec: a launcher from a different
>>>> MPI implementation than the one PETSc was built against starts each
>>>> process as an independent singleton (rank 0 of 1), so every process
>>>> prints the full sequential output.
>>>>
>>>> On Wed, Feb 26, 2020 at 11:17 AM Tsung-Hsing Chen <
>>>> barrydog505 at gmail.com> wrote:
>>>>
>>>>> Hi,
>>>>>
>>>>> I tried to run the example ksp/examples/tutorials/ex2.
>>>>> I ran the code with: mpiexec -n 2 ./ex2 -ksp_monitor_short -m 5 -n 5
>>>>> -ksp_gmres_cgs_refinement_type refine_always
>>>>>
>>>>> The output is:
>>>>>   0 KSP Residual norm 3.21109
>>>>>   1 KSP Residual norm 0.93268
>>>>>   2 KSP Residual norm 0.103515
>>>>>   3 KSP Residual norm 0.00787798
>>>>>   4 KSP Residual norm 0.000387275
>>>>> Norm of error 0.000392701 iterations 4
>>>>>   0 KSP Residual norm 3.21109
>>>>>   1 KSP Residual norm 0.93268
>>>>>   2 KSP Residual norm 0.103515
>>>>>   3 KSP Residual norm 0.00787798
>>>>>   4 KSP Residual norm 0.000387275
>>>>> Norm of error 0.000392701 iterations 4
>>>>>
>>>>> My output (above) is the expected output from
>>>>> ksp/examples/tutorials/output/ex2_4.out, but printed twice.
>>>>> Is this the correct result?
>>>>>
>>>>> Thanks in advance,
>>>>>
>>>>> Tsung-Hsing Chen
>>>>>
>>>>
>>>>
>>>> --
>>>> Stefano
>>>>
>>>
>>
>> --
>> Stefano
>>
>

