[petsc-users] Question about PETSc installs and MPI

John Yawney jyawney123 at gmail.com
Fri Jul 18 20:16:54 CDT 2014


Hello,

I had a question about PETSc installations. On my local computer I
configured PETSc (v 3.4.2) using the options:

./configure --with-cc=mpicc --with-cxx=mpic++ --download-f-blas-lapack
--download-mpich --download-hypre

I wrote a test program that defines a vector using DMDAs, computes a dot
product, exchanges halo elements, and computes a low-order FD derivative of
the vector. Under my installation of PETSc everything works fine. For some
reason, when my colleagues run the same program, they get segmentation
faults. If they change the y and z boundary types to GHOSTED as well, the
program runs to the end (though it still seg faults there), but the dot
product they get is only the local value rather than the global one. I've
attached the main.cpp file for this program.
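
For reference, the core of the test looks roughly like this (a simplified
sketch of the structure, not the attached file; grid sizes and boundary
types are illustrative, using the 3.4-era DMDA API):

#include <petscdmda.h>

int main(int argc, char **argv)
{
  PetscErrorCode ierr;
  DM             da;
  Vec            g, l;
  PetscScalar    dot;

  ierr = PetscInitialize(&argc, &argv, NULL, NULL); CHKERRQ(ierr);

  /* 3D distributed array, ghosted in x, stencil width 1 (layout illustrative) */
  ierr = DMDACreate3d(PETSC_COMM_WORLD,
                      DMDA_BOUNDARY_GHOSTED, DMDA_BOUNDARY_NONE, DMDA_BOUNDARY_NONE,
                      DMDA_STENCIL_STAR, 16, 16, 16,
                      PETSC_DECIDE, PETSC_DECIDE, PETSC_DECIDE,
                      1, 1, NULL, NULL, NULL, &da); CHKERRQ(ierr);

  ierr = DMCreateGlobalVector(da, &g); CHKERRQ(ierr);
  ierr = DMCreateLocalVector(da, &l);  CHKERRQ(ierr);
  ierr = VecSet(g, 1.0); CHKERRQ(ierr);

  /* Global dot product: should come out the same on every process */
  ierr = VecDot(g, g, &dot); CHKERRQ(ierr);
  ierr = PetscPrintf(PETSC_COMM_WORLD, "dot = %g\n", (double)PetscRealPart(dot)); CHKERRQ(ierr);

  /* Halo exchange: fill the ghost cells of the local vector from the global
     vector; the FD derivative is then computed from the local (ghosted) array */
  ierr = DMGlobalToLocalBegin(da, g, INSERT_VALUES, l); CHKERRQ(ierr);
  ierr = DMGlobalToLocalEnd(da, g, INSERT_VALUES, l);   CHKERRQ(ierr);

  ierr = VecDestroy(&g); CHKERRQ(ierr);
  ierr = VecDestroy(&l); CHKERRQ(ierr);
  ierr = DMDestroy(&da); CHKERRQ(ierr);
  ierr = PetscFinalize();
  return 0;
}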

When they installed their versions of PETSc, they didn't use the
--download-mpich option; instead they configured with either:
./configure --download-f-blas-lapack --with-scalar-type=complex
or with the option: --with-mpi-dir=/home/kim/anaconda/pkgs/mpich2-1.3-py27_0

Could this mismatch between MPI installations be causing a problem with the
parallelization under PETSc?
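
To help rule that out on their machines, one quick check (a minimal sketch,
not part of the attached code) is to print the rank and size that PETSc
actually sees. If the mpiexec used to launch the job comes from a different
MPI than the one PETSc was linked against, each process typically
initializes as rank 0 of size 1, which would explain both the local dot
product and the ghost-exchange seg faults:

#include <petscsys.h>

int main(int argc, char **argv)
{
  PetscErrorCode ierr;
  PetscMPIInt    rank, size;

  ierr = PetscInitialize(&argc, &argv, NULL, NULL); CHKERRQ(ierr);
  ierr = MPI_Comm_rank(PETSC_COMM_WORLD, &rank); CHKERRQ(ierr);
  ierr = MPI_Comm_size(PETSC_COMM_WORLD, &size); CHKERRQ(ierr);

  /* Each process reports its own view; with a mismatched launcher every
     process prints "rank 0 of 1" even under "mpiexec -n 4" */
  ierr = PetscPrintf(PETSC_COMM_SELF, "I am rank %d of %d\n", rank, size); CHKERRQ(ierr);

  ierr = PetscFinalize();
  return 0;
}

(On my machine, since I configured with --download-mpich, the matching
launcher is the mpiexec under $PETSC_DIR/$PETSC_ARCH/bin; on theirs it
would be whichever MPI they configured against.)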

Thanks for the help and sorry for the long question.

Best regards,
John
-------------- next part --------------
A non-text attachment was scrubbed...
Name: main.cpp
Type: text/x-c++src
Size: 7391 bytes
Desc: not available
URL: <http://lists.mcs.anl.gov/pipermail/petsc-users/attachments/20140718/f138bc95/attachment.cpp>

