[petsc-users] running error

paul zhang paulhuaizhang at gmail.com
Mon Dec 1 15:33:23 CST 2014


Hi Jed,

Now I see that PETSc compiled correctly. However, when I include
"petscksp.h" in my own program (quite a simple one), it fails. Attached
are two cases. The first tests only MPI, and it runs fine. The second
adds PETSc, and it gives a segmentation fault when it reaches

        MPI_Comm_rank(MPI_COMM_WORLD, &rank);        /* get current process id */

Can you shed some light? The MPI version is 1.8.3.
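
For context, my PETSc case is structured roughly like the minimal sketch
below (not the attached code verbatim); my understanding is that
PetscInitialize() calls MPI_Init() internally, so it must come before any
MPI call such as MPI_Comm_rank():

    #include <petscksp.h>

    int main(int argc, char **argv)
    {
        int            rank;
        PetscErrorCode ierr;

        /* PetscInitialize calls MPI_Init; MPI routines used before
           this point typically segfault or abort. */
        ierr = PetscInitialize(&argc, &argv, NULL, NULL);
        if (ierr) return ierr;

        MPI_Comm_rank(MPI_COMM_WORLD, &rank);   /* get current process id */
        PetscPrintf(PETSC_COMM_SELF, "Hello from rank %d\n", rank);

        ierr = PetscFinalize();
        return ierr;
    }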

Thanks,
Paul

Huaibao (Paul) Zhang
*Gas Surface Interactions Lab*
Department of Mechanical Engineering
University of Kentucky,
Lexington,
KY, 40506-0503
*Office*: 216 Ralph G. Anderson Building
*Web*: gsil.engineering.uky.edu

On Mon, Dec 1, 2014 at 4:20 PM, paul zhang <paulhuaizhang at gmail.com> wrote:

>
> Sorry, I should have replied to the list.
>
> [hzh225 at dlxlogin2-2 petsc-3.5.2]$ make
> PETSC_DIR=/home/hzh225/LIB_CFD/nP/petsc-3.5.2 PETSC_ARCH=linux-gnu-intel
> test
>
> Running test examples to verify correct installation
> Using PETSC_DIR=/home/hzh225/LIB_CFD/nP/petsc-3.5.2 and
> PETSC_ARCH=linux-gnu-intel
> C/C++ example src/snes/examples/tutorials/ex19 run successfully with 1 MPI
> process
> C/C++ example src/snes/examples/tutorials/ex19 run successfully with 2 MPI
> processes
> Fortran example src/snes/examples/tutorials/ex5f run successfully with 1
> MPI process
> Completed test examples
> =========================================
> Now to evaluate the computer systems you plan use - do:
> make PETSC_DIR=/home/hzh225/LIB_CFD/nP/petsc-3.5.2
> PETSC_ARCH=linux-gnu-intel streams NPMAX=<number of MPI processes you
> intend to use>
>
>
> Huaibao (Paul) Zhang
> *Gas Surface Interactions Lab*
> Department of Mechanical Engineering
> University of Kentucky,
> Lexington,
> KY, 40506-0503
> *Office*: 216 Ralph G. Anderson Building
> *Web*: gsil.engineering.uky.edu
>
> On Mon, Dec 1, 2014 at 4:18 PM, Jed Brown <jed at jedbrown.org> wrote:
>
>> paul zhang <paulhuaizhang at gmail.com> writes:
>>
>> > Hi Jed,
>> > Does this mean I've passed the default test?
>>
>> It's an MPI test.  Run this to see if PETSc solvers are running correctly:
>>
>>   make PETSC_DIR=/home/hzh225/LIB_CFD/nP/petsc-3.5.2
>> PETSC_ARCH=linux-gnu-intel test
>>
>> > Is the "open matplotlib" message an issue?
>>
>> No, it's just a Python library that would be used to create a nice
>> figure if you had it installed.
>>
>
>
[Attachments scrubbed by the list archive:
 OnlyMPI.tar (430080 bytes): <http://lists.mcs.anl.gov/pipermail/petsc-users/attachments/20141201/813cd0b4/attachment-0002.tar>
 Petsc-with-MPI.tar (10240 bytes): <http://lists.mcs.anl.gov/pipermail/petsc-users/attachments/20141201/813cd0b4/attachment-0003.tar>]

