[petsc-users] Different behavior of code on different machines

Zhang, Junchao jczhang at mcs.anl.gov
Sat Jul 20 11:47:31 CDT 2019


Did you use the same number of MPI ranks and the same build options on your PC and on the cluster? If not, try aligning the options on your PC with those on the cluster to see whether you can reproduce the error locally. You can also run under valgrind to check for memory errors, such as use of uninitialized variables.
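A minimal sketch of such a valgrind run, assuming 2 MPI ranks and a placeholder binary name `./ex1` (substitute your own executable and options):

```shell
# Run the program under valgrind on 2 ranks; each rank gets its own valgrind instance.
# --track-origins=yes reports where uninitialized values originated.
mpiexec -n 2 valgrind --leak-check=full --track-origins=yes ./ex1
```

Note that valgrind output from multiple ranks is interleaved on stdout; valgrind's `--log-file=valgrind.%p.log` option writes one log per process, which is easier to read.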

--Junchao Zhang


On Sat, Jul 20, 2019 at 11:35 AM Yuyun Yang <yyang85 at stanford.edu<mailto:yyang85 at stanford.edu>> wrote:
I already tested on my PC with multiple processors and it works fine. I used the command $PETSC_DIR/$PETSC_ARCH/bin/mpiexec -n 2 since I configured PETSc with MPICH, while my local computer has Open MPI.

Best,
Yuyun

From: Zhang, Junchao <jczhang at mcs.anl.gov<mailto:jczhang at mcs.anl.gov>>
Sent: Saturday, July 20, 2019 9:14 AM
To: Yuyun Yang <yyang85 at stanford.edu<mailto:yyang85 at stanford.edu>>
Cc: petsc-users at mcs.anl.gov<mailto:petsc-users at mcs.anl.gov>
Subject: Re: [petsc-users] Different behavior of code on different machines

You need to test on your personal computer with multiple MPI processes (e.g., mpirun -n 2 ...) before moving to big machines. You may also need to configure PETSc with --with-debugging=1 --COPTFLAGS="-O0 -g" etc. to ease debugging.
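A sketch of such a debug configure line, run from the top of the PETSc source tree (the MPI and optimization flags for other languages are illustrative; adjust to your compilers and site):

```shell
# Hypothetical debug-build configuration: no optimization, full debug symbols.
./configure --with-debugging=1 --COPTFLAGS="-O0 -g" --CXXOPTFLAGS="-O0 -g" --FOPTFLAGS="-O0 -g"
make all
```

A debug build disables optimizations that can reorder or elide floating-point operations, so stack traces and variable values in a debugger correspond directly to the source.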
--Junchao Zhang


On Sat, Jul 20, 2019 at 11:03 AM Yuyun Yang via petsc-users <petsc-users at mcs.anl.gov<mailto:petsc-users at mcs.anl.gov>> wrote:
Hello team,

I’m encountering a problem with my code’s behavior on multiple processors. When I run it on my personal computer it works just fine, but when I use it on our computing cluster it produces an error (in one of the root-finding functions, an assert statement is not satisfied) and aborts.

If I run on just one processor, both machines can run the code fine, but they give slightly different results (maybe due to roundoff errors).

I’m not sure how to proceed with debugging (since I usually debug on my own computer, which doesn’t seem to encounter the bug) and would appreciate your advice. Thank you!

Best regards,
Yuyun
