[petsc-dev] Bug in petsc-dev?
Thomas Witkowski
Thomas.Witkowski at tu-dresden.de
Tue Mar 22 12:50:40 CDT 2011
Quoting Barry Smith <bsmith at mcs.anl.gov>:
>
> On Mar 22, 2011, at 11:08 AM, Thomas Witkowski wrote:
>
>> Could some of you test the very small attached example? I use the
>> current petsc-dev, OpenMPI 1.4.1, and GCC 4.2.4. In this
>> environment, running on 4 nodes, I get the following output, which
>> is wrong:
>>
>> [3] BCAST-RESULT: 812855920
>> [2] BCAST-RESULT: 450385
>> [1] BCAST-RESULT: 450385
>> [0] BCAST-RESULT: 450385
>>
>> The problem occurs only when I run the code on different nodes.
>> When I start mpirun on only one node with four threads
>
> You mean 4 MPI processes?
Yes.
>
>
>> or use a four-core system, everything is fine. Valgrind and
>> Allinea DDT both say that everything is fine, so I'm really not
>> sure where the problem is. With PETSc 3.1-p8 there is no problem
>> with this example. It would be quite interesting to know whether
>> some of you can reproduce this problem. Thanks for giving it a try!
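(The attached test.c is not shown in the archive. A minimal sketch of the
kind of reproducer described above, assuming rank 0 broadcasts a single
integer with MPI_Bcast after PetscInitialize(), might look like the
following; the variable names and the broadcast value are illustrative only.)

  #include <petsc.h>
  #include <stdio.h>

  int main(int argc, char **argv)
  {
    int      rank;
    PetscInt value = 0;

    PetscInitialize(&argc, &argv, NULL, NULL);
    MPI_Comm_rank(PETSC_COMM_WORLD, &rank);

    /* only rank 0 knows the value before the broadcast */
    if (rank == 0) value = 450385;
    MPI_Bcast(&value, 1, MPIU_INT, 0, PETSC_COMM_WORLD);

    /* every rank should now print the same number */
    printf("[%d] BCAST-RESULT: %d\n", rank, (int)value);

    PetscFinalize();
    return 0;
  }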
>
> Replace the PetscInitialize() and PetscFinalize() with MPI_Init()
> and MPI_Finalize(), remove the include of petsc.h, then link against
> the old and new PETSc and run on the different systems.
>
> I'm thinking you'll still get the wrong result without the PETSc
> calls, indicating that it is an MPI issue.
No! I already did this test. In that case I get the correct results!
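For reference, the pure-MPI variant of the same sketch (again an
illustration under the assumptions above, not the actual attachment)
simply drops petsc.h and uses MPI_Init()/MPI_Finalize():

  #include <mpi.h>
  #include <stdio.h>

  int main(int argc, char **argv)
  {
    int rank, value = 0;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    /* same broadcast pattern, but without any PETSc calls */
    if (rank == 0) value = 450385;
    MPI_Bcast(&value, 1, MPI_INT, 0, MPI_COMM_WORLD);

    printf("[%d] BCAST-RESULT: %d\n", rank, value);

    MPI_Finalize();
    return 0;
  }

This version gives the correct result on all four ranks here, which is
why the problem does not look like a plain MPI issue.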
Thomas
>
> Barry
>
>>
>> Thomas
>>
>> <test.c>
>
>
>