[petsc-users] Question about PETSc installs and MPI

John Yawney jyawney123 at gmail.com
Sat Jul 19 10:35:42 CDT 2014


Hi Barry, Matt, and Satish,

Thank you for the quick responses. You're right, I forgot the return 0 at the
end; I had noticed it earlier and meant to add the line, but I sent the
original version of the code.

I had a working version of the MPI compilers on my computer, but I
understand what you meant, Satish. I'll definitely use those configure
options for other installs.

Thanks again,
John
On Jul 18, 2014 9:58 PM, "Matthew Knepley" <knepley at gmail.com> wrote:

> On Fri, Jul 18, 2014 at 8:16 PM, John Yawney <jyawney123 at gmail.com> wrote:
>
>> Hello,
>>
>> I had a question about PETSc installations. On my local computer I
>> configured PETSc (v 3.4.2) using the options:
>>
>> ./configure --with-cc=mpicc --with-cxx=mpic++ --download-f-blas-lapack
>> --download-mpich --download-hypre
>>
>> I wrote a test program that defines a vector using DMDAs, computes a dot
>> product, exchanges halo elements, and computes a low-order FD derivative of
>> the vector. Under my installation of PETSc everything works fine. For
>> some reason, when my colleagues run the program, they get segmentation
>> fault errors. If they also change the y and z boundary types to GHOSTED,
>> the program runs to the end (though it still seg faults there), but they
>> get only the local value of the dot product. I've attached the main.cpp
>> file for this test program.
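
For reference, a minimal sketch of the kind of DMDA program described above
(this is not the attached main.cpp; the 1d layout, grid size, and variable
names are illustrative, written against the PETSc 3.4 C API):

#include <petscdmda.h>

int main(int argc, char **argv)
{
  DM          da;
  Vec         x, xlocal;
  PetscScalar dot;
  PetscInt    nx = 128;   /* assumed global grid size */

  PetscInitialize(&argc, &argv, NULL, NULL);

  /* GHOSTED boundaries add ghost points at the physical boundary as well */
  DMDACreate1d(PETSC_COMM_WORLD, DMDA_BOUNDARY_GHOSTED, nx, 1, 1, NULL, &da);
  DMCreateGlobalVector(da, &x);
  DMCreateLocalVector(da, &xlocal);

  VecSet(x, 1.0);
  VecDot(x, x, &dot);     /* collective: every rank should see the global value */

  /* halo exchange: fill the ghost entries of the local vector */
  DMGlobalToLocalBegin(da, x, INSERT_VALUES, xlocal);
  DMGlobalToLocalEnd(da, x, INSERT_VALUES, xlocal);
  /* ...compute the low-order FD derivative from xlocal here... */

  VecDestroy(&x);
  VecDestroy(&xlocal);
  DMDestroy(&da);
  PetscFinalize();
  return 0;
}
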
>>
>> When they installed their versions of PETSc they didn't use the
>> --download-mpich option but instead used either:
>> ./configure --download-f-blas-lapack --with-scalar-type=complex
>> or with the
>> option: --with-mpi-dir=/home/kim/anaconda/pkgs/mpich2-1.3-py27_0
>>
>> Could this be causing a problem with the parallelization under PETSc?
>>
>
> I have run your code on my machine up to P=8 and used valgrind. No
> problems turned up other than the fact that a "return 0" is missing from
> main().
>
> If there are still problems, please have them send a stack trace.
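
For reference, one way to produce such a trace (the executable name "main"
and the process counts below are assumptions): run under a debugger using
PETSc's runtime option

    mpiexec -n 4 ./main -start_in_debugger noxterm

or run under valgrind and send the output:

    mpiexec -n 2 valgrind --leak-check=yes ./main
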
>
>   Thanks,
>
>      Matt
>
>
>> Thanks for the help and sorry for the long question.
>>
>> Best regards,
>> John
>>
>
>
>
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which their
> experiments lead.
> -- Norbert Wiener
>