[petsc-users] PETSc 3.2 gmres error
Matthew Knepley
knepley at gmail.com
Sat Sep 10 08:29:40 CDT 2011
On Sat, Sep 10, 2011 at 6:38 AM, Jed Brown <jedbrown at mcs.anl.gov> wrote:
> On Sat, Sep 10, 2011 at 13:09, Tabrez Ali <stali at geology.wisc.edu> wrote:
>
>> Hello
>>
>> I am running an application using PETSc 3.2 RC on a poor man's cluster at
>> my home (for testing only). It has two nodes running different versions of
>> Debian (and different versions of gcc/gfortran), but both have the same
>> MPICH2 1.4 and PETSc 3.2 installed.
>>
>> The nodes also do not share a file system, but I make sure that the input
>> file and executable paths are exactly the same on both machines. After
>> compiling the code separately on the two nodes, I launch the parallel
>> program from node 1 using mpiexec -f hosts -n 2 .... (hydra process manager).
>>
>> With PETSc 3.1 the application runs fine, both with CG and GMRES, and
>> correct output is generated on both nodes.
>>
>> With PETSc 3.2 the application runs fine with CG.
>>
>> But whenever I use GMRES in 3.2 I get an error (listed below) during
>> KSPSolve.
>
>
> Can you reproduce this with any of the examples? For example
>
> cd src/ksp/ksp/examples/tutorials
> make ex2
> mpiexec -f hosts -n 2 ./ex2 -ksp_type gmres
>
> or, to use your matrix, run (any version that works, including 3.1) with
> -ksp_view_binary and then
>
> cd src/ksp/ksp/examples/tutorials
> make ex10
> mpiexec -f hosts -n 2 ./ex10 -f binaryoutput -ksp_type gmres
>
> If these work, there might be memory corruption somewhere in your code
> causing this.
>
>
> You can also run with -start_in_debugger and check what is in the "alpha"
> array on each process.
>
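For the -ksp_view_binary step suggested above, the dump run would look roughly
like this (a sketch only; ./myapp stands in for your own executable, and the
option writes a file named binaryoutput in the working directory, which is what
ex10 -f binaryoutput then reads back):

  mpiexec -f hosts -n 2 ./myapp -ksp_view_binary
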
This can also happen if a NaN is produced. You can easily check by launching
the debugger and printing the value.
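
Concretely, that check might look like this (a sketch; ./myapp is again a
placeholder, and gdb is assumed as the debugger attached by -start_in_debugger):

  mpiexec -f hosts -n 2 ./myapp -ksp_type gmres -start_in_debugger
  # in each debugger window, continue until the error is raised, then
  (gdb) up               # move up to the frame that has the alpha array
  (gdb) print alpha[0]

Running with -fp_trap should also stop execution at the first floating point
exception, which makes a produced NaN easy to spot.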
Matt
--
What most experimenters take for granted before they begin their experiments
is infinitely more interesting than any results to which their experiments
lead.
-- Norbert Wiener