[petsc-users] running error
paul zhang
paulhuaizhang at gmail.com
Mon Dec 1 15:40:03 CST 2014
And here is the combined MPI-and-PETSc test, which gives the segmentation fault.
This is the final goal. Many thanks to you Jed.
Paul
Huaibao (Paul) Zhang
*Gas Surface Interactions Lab*
Department of Mechanical Engineering
University of Kentucky,
Lexington,
KY, 40506-0503
*Office*: 216 Ralph G. Anderson Building
*Web*: gsil.engineering.uky.edu
On Mon, Dec 1, 2014 at 4:39 PM, paul zhang <paulhuaizhang at gmail.com> wrote:
> I had better send you the original files; the compressed files triggered
> some warnings, I guess.
> Attached is the MPI test that has been verified.
>
>
> On Mon, Dec 1, 2014 at 4:33 PM, paul zhang <paulhuaizhang at gmail.com>
> wrote:
>
>> Hi Jed,
>>
>> Now I see PETSc is compiled correctly. However, when I attempted to call
>> "petscksp.h" in my own program (quite simple one), it failed for some
>> reason. Attached you can see two cases. The first is just the test of MPI,
>> which is fine. The second is one added PETSc, which has segment fault as it
>> went to
>>
>> MPI_Comm_rank (MPI_COMM_WORLD, &rank); /* get current
>> process id */
>>
>> Can you shed some light? The MPI version is 1.8.3.
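>>
>> For reference, here is a minimal sketch of what I understand the program
>> should look like (illustrative only, not my actual main.cc): the key point
>> is that PetscInitialize() calls MPI_Init() internally, so MPI_Comm_rank()
>> must not run before it. Calling MPI routines before initialization is a
>> classic cause of exactly this crash.
>>
>> #include <petscksp.h>
>>
>> int main(int argc, char **argv) {
>>     PetscErrorCode ierr;
>>     int rank;
>>
>>     /* Initializes PETSc and, through it, MPI; must come first */
>>     ierr = PetscInitialize(&argc, &argv, NULL, NULL); CHKERRQ(ierr);
>>
>>     /* Safe only after PetscInitialize (or MPI_Init) has run */
>>     MPI_Comm_rank(MPI_COMM_WORLD, &rank);
>>
>>     ierr = PetscPrintf(PETSC_COMM_WORLD, "PETSc initialized\n"); CHKERRQ(ierr);
>>
>>     /* Matching finalize call before exit */
>>     ierr = PetscFinalize();
>>     return 0;
>> }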
>>
>> Thanks,
>> Paul
>>
>>
>>
>> On Mon, Dec 1, 2014 at 4:20 PM, paul zhang <paulhuaizhang at gmail.com>
>> wrote:
>>
>>>
>>> Sorry, I should have replied to the list.
>>>
>>> [hzh225 at dlxlogin2-2 petsc-3.5.2]$ make
>>> PETSC_DIR=/home/hzh225/LIB_CFD/nP/petsc-3.5.2 PETSC_ARCH=linux-gnu-intel
>>> test
>>>
>>> Running test examples to verify correct installation
>>> Using PETSC_DIR=/home/hzh225/LIB_CFD/nP/petsc-3.5.2 and
>>> PETSC_ARCH=linux-gnu-intel
>>> C/C++ example src/snes/examples/tutorials/ex19 run successfully with 1
>>> MPI process
>>> C/C++ example src/snes/examples/tutorials/ex19 run successfully with 2
>>> MPI processes
>>> Fortran example src/snes/examples/tutorials/ex5f run successfully with 1
>>> MPI process
>>> Completed test examples
>>> =========================================
>>> Now to evaluate the computer systems you plan use - do:
>>> make PETSC_DIR=/home/hzh225/LIB_CFD/nP/petsc-3.5.2
>>> PETSC_ARCH=linux-gnu-intel streams NPMAX=<number of MPI processes you
>>> intend to use>
>>>
>>>
>>>
>>> On Mon, Dec 1, 2014 at 4:18 PM, Jed Brown <jed at jedbrown.org> wrote:
>>>
>>>> paul zhang <paulhuaizhang at gmail.com> writes:
>>>>
>>>> > Hi Jed,
>>>> > Does this mean I've passed the default test?
>>>>
>>>> It's an MPI test. Run this to see if PETSc solvers are running
>>>> correctly:
>>>>
>>>> make PETSC_DIR=/home/hzh225/LIB_CFD/nP/petsc-3.5.2
>>>> PETSC_ARCH=linux-gnu-intel test
>>>>
>>>> > Is the "open matplotlib" an issue?
>>>>
>>>> No, it's just a Python library that would be used to create a nice
>>>> figure if you had it installed.
>>>>
>>>
>>>
>>
>
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://lists.mcs.anl.gov/pipermail/petsc-users/attachments/20141201/8e311b3d/attachment-0001.html>
-------------- next part --------------
cmake_minimum_required(VERSION 2.6)
# The compiler must be chosen before project() for CMake to honor it
set (CMAKE_CXX_COMPILER /home/hzh225/LIB_CFD/openmpi-1.8.3/bin/mpiCC)
set (CMAKE_CXX_FLAGS "-O3")
project(kats)
set (kats_VERSION_MAJOR 2)
set (kats_VERSION_MINOR 0)
set (PETSC_INCLUDE_DIRS1 /home/hzh225/LIB_CFD/nP/petsc-3.5.2/include)
set (PETSC_INCLUDE_DIRS2 /home/hzh225/LIB_CFD/nP/petsc-3.5.2/linux-gnu-intel/include)
set (PETSC_LIBRARY_DIRS /home/hzh225/LIB_CFD/nP/petsc-3.5.2/linux-gnu-intel/lib)
set (VALGRIND_INCLUDE_DIR /share/cluster/RHEL6.2/x86_64/apps/valgrind/3.9.0/include)
set (VALGRIND_LIBRARY_DIR /share/cluster/RHEL6.2/x86_64/apps/valgrind/3.9.0/lib)
list (APPEND CMAKE_MODULE_PATH "${kats_SOURCE_DIR}/CMake")
# Pass some CMake settings to the source code through a header file
configure_file (
  "${PROJECT_SOURCE_DIR}/cmake_vars.h.in"
  "${PROJECT_BINARY_DIR}/cmake_vars.h"
)
set (CMAKE_INSTALL_PREFIX ${PROJECT_SOURCE_DIR}/../)
# add to the include search path
include_directories("${PROJECT_SOURCE_DIR}")
include_directories(${PETSC_INCLUDE_DIRS1})
include_directories(${PETSC_INCLUDE_DIRS2})
include_directories(${VALGRIND_INCLUDE_DIR})
link_directories(${PETSC_LIBRARY_DIRS})
link_directories(${VALGRIND_LIBRARY_DIR})
set (EXTRA_LIBS petsc)
#add the executable
set (SOURCES
main.cc
cmake_vars.h
)
add_executable(kats ${SOURCES})
target_link_libraries (kats ${EXTRA_LIBS})
install (TARGETS kats RUNTIME DESTINATION bin)
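
A note on this build setup: a segmentation fault exactly at MPI_Comm_rank is
a common symptom of mixing MPI implementations. The CMakeLists above compiles
main.cc with the mpiCC from openmpi-1.8.3, but if the PETSc build
(PETSC_ARCH=linux-gnu-intel) was configured against a different MPI, the two
runtimes will clash at the first MPI call. A hedged sketch of the check and
fix (the paths are the ones used above; the petscvariables location is where
PETSc 3.5 records its configuration):

# Verify that this mpiCC wraps the same MPI that PETSc was built with;
# compare against the MPI paths recorded in
#   ${PETSC_DIR}/${PETSC_ARCH}/conf/petscvariables
set (CMAKE_CXX_COMPILER /home/hzh225/LIB_CFD/openmpi-1.8.3/bin/mpiCC)

# If they differ, reconfigure PETSc against this MPI instead, e.g.:
#   ./configure --with-mpi-dir=/home/hzh225/LIB_CFD/openmpi-1.8.3 ...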
-------------- next part --------------
A non-text attachment was scrubbed...
Name: main.cc
Type: text/x-c++src
Size: 1030 bytes
Desc: not available
URL: <http://lists.mcs.anl.gov/pipermail/petsc-users/attachments/20141201/8e311b3d/attachment-0001.cc>