MPICH error when using petsc-3.0.0-p4
Barry Smith
bsmith at mcs.anl.gov
Mon Apr 13 20:25:46 CDT 2009
The -r8 option is a dangerous beast; we highly recommend that PETSc
users (and everyone else) NOT use this option. If you use PetscScalar,
then you don't need the -r8 option.
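
For example, a minimal sketch (reusing the array names from this thread) of
what that looks like: declare the arrays with PETSc's scalar type and drop
-r8 entirely.

#include "finclude/petsc.h"
      ! Sketch: instead of "real(8), allocatable :: u_ast(:,:)" plus -r8,
      ! declare the arrays with whatever scalar type PETSc was built with.
      PetscScalar, allocatable :: u_ast(:,:), v_ast(:,:)
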
Barry
BTW: are you sure that an implicit none at the top of a module
applies to each subroutine inside the module?
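
(For reference, a minimal sketch of the pattern in question, with a
hypothetical module name; in standard Fortran an implicit none placed in a
module's specification part is inherited by the procedures contained in that
module, though repeating it in each subroutine removes any doubt.)

      module flow_arrays                      ! hypothetical module name
      implicit none                           ! given once, at module scope
      real(8), allocatable :: u_ast(:,:), v_ast(:,:)
      contains
         subroutine zero_v_ast                ! hypothetical subroutine
         ! no local implicit none; the module-level one still applies here
         v_ast = 0.0d0
         end subroutine zero_v_ast
      end module flow_arrays
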
On Apr 13, 2009, at 8:19 PM, Wee-Beng TAY wrote:
> Hi Satish,
>
> That's strange. This is because I initially declared u_ast as
>
> real(8), allocatable :: u_ast(:,:)
>
> I also have the -r8 option enabled. Any idea what is going on?
>
> Thank you very much and have a nice day!
>
> Yours sincerely,
>
> Wee-Beng Tay
>
>
>
> Satish Balay wrote:
>> (--with-scalar-type=[real,complex], --with-precision=[single,double])
>>
>> Depending upon how PETSc is configured, PetscScalar can be either
>> real4, real8, or complex8. With the most common usage [i.e. the default
>> build], PetscScalar is real8.
>>
>> Earlier you claimed: MPI_ISEND(v_ast) worked with MPI_REAL but not
>> MPI_REAL8. This suggested that v_ast was declared as real4 - not
>> real8.
>>
>> Satish
>>
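
As an aside, one quick way to see which kind PetscScalar resolves to in a
given build is to compare Fortran kinds at run time; a small hypothetical
helper, not from the thread:

      subroutine report_scalar_kind
      implicit none
#include "finclude/petsc.h"
      PetscScalar :: s
      ! With a default (real, double-precision) build these two kinds match.
      print *, 'kind(PetscScalar) =', kind(s), ' kind(1.0d0) =', kind(1.0d0)
      end subroutine report_scalar_kind
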
>> On Tue, 14 Apr 2009, Wee-Beng TAY wrote:
>>
>>
>>> Hi,
>>>
>>> So when must I use PetscScalar, and when can I just use real(8)? I ask
>>> because most of my variables are in real(8), except for the Mat and Vec
>>> required in the solving of the linear eqn. Is there a rule? I initially
>>> thought they were the same, and if I remember correctly it seemed to work
>>> fine in PETSc 2.3.3.
>>>
>>> My v_ast is now defined as:
>>>
>>> PetscScalar, allocatable :: u_ast(:,:), v_ast(:,:)
>>>
>>> allocate (u_ast(0:size_x+1,jsta_ext:jend_ext), STAT=status(1))
>>> allocate (v_ast(0:size_x+1,jsta_ext:jend_ext), STAT=status(2))
>>>
>>> if (status(1)/=0 .or. status(2)/=0) STOP "Cannot allocate memory"
>>>
>>> Thank you very much and have a nice day!
>>>
>>> Yours sincerely,
>>>
>>> Wee-Beng Tay
>>>
>>>
>>>
>>> Satish Balay wrote:
>>>
>>>> And if v_ast() is used with PETSc - then it should be declared as
>>>> 'PetscScalar',
>>>>
>>>> and then you use MPI_ISEND(....,MPIU_SCALAR,...)
>>>>
>>>> Satish
>>>>
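
For concreteness, a sketch of that suggestion applied to the exchange from
this thread (assuming v_ast is declared as PetscScalar where it lives):

      ! Sketch: the same halo exchange, but using PETSc's MPI datatype so
      ! the buffer type and the MPI type always agree with the build.
      CALL MPI_ISEND(v_ast(1,jend),size_x,MPIU_SCALAR,inext,1, &
                     MPI_COMM_WORLD,isend1,ierr)
      CALL MPI_IRECV(v_ast(1,jsta-1),size_x,MPIU_SCALAR,iprev,1, &
                     MPI_COMM_WORLD,irecv1,ierr)
      CALL MPI_WAIT(isend1,istatus,ierr)
      CALL MPI_WAIT(irecv1,istatus,ierr)
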
>>>>
>>>> On Mon, 13 Apr 2009, Barry Smith wrote:
>>>>
>>>>
>>>>> Where is your implicit none that all sane programmers begin their
>>>>> Fortran subroutines with?
>>>>>
>>>>>
>>>>> On Apr 13, 2009, at 11:11 AM, Wee-Beng TAY wrote:
>>>>>
>>>>>
>>>>>> Hi Satish,
>>>>>>
>>>>>> Compiling and building now worked without error. However, when I run,
>>>>>> I get the error:
>>>>>>
>>>>>> 0 - MPI_ISEND : Datatype is MPI_TYPE_NULL
>>>>>> [0] Aborting program !
>>>>>> [0] Aborting program!
>>>>>> Error 323, process 0, host GOTCHAMA-E73BB3:
>>>>>>
>>>>>> The problem lies in this subroutine:
>>>>>>
>>>>>> subroutine v_ast_row_copy
>>>>>>
>>>>>> #include "finclude/petsc.h"
>>>>>> #include "finclude/petscvec.h"
>>>>>> #include "finclude/petscmat.h"
>>>>>> #include "finclude/petscksp.h"
>>>>>> #include "finclude/petscpc.h"
>>>>>> #include "finclude/petscsys.h"
>>>>>>
>>>>>> !to copy data of jend row to others
>>>>>>
>>>>>> integer :: inext,iprev,istatus(MPI_STATUS_SIZE),irecv1,ierr,isend1
>>>>>>
>>>>>> inext = myid + 1
>>>>>> iprev = myid - 1
>>>>>>
>>>>>> if (myid == num_procs - 1) inext = MPI_PROC_NULL
>>>>>>
>>>>>> if (myid == 0) iprev = MPI_PROC_NULL
>>>>>>
>>>>>> CALL MPI_ISEND(v_ast(1,jend),size_x,MPI_REAL8,inext,1, &
>>>>>>      MPI_COMM_WORLD,isend1,ierr)
>>>>>> CALL MPI_IRECV(v_ast(1,jsta-1),size_x,MPI_REAL8,iprev,1, &
>>>>>>      MPI_COMM_WORLD,irecv1,ierr)
>>>>>> CALL MPI_WAIT(isend1, istatus, ierr)
>>>>>> CALL MPI_WAIT(irecv1, istatus, ierr)
>>>>>>
>>>>>> end subroutine v_ast_row_copy
>>>>>>
>>>>>>
>>>>>> I copied this subroutine from the RS6000 MPI manual and it used to
>>>>>> work. I wonder if this is an MPI or a PETSc problem? It is strange,
>>>>>> because I already specify the type to be MPI_REAL8; however, changing
>>>>>> it to MPI_REAL solves the problem.
>>>>>>
>>>>>> If this is an MPI problem, then you can just ignore it. I'll check it
>>>>>> in some MPI forum.
>>>>>>
>>>>>>
>>>>>> Thank you very much and have a nice day!
>>>>>>
>>>>>> Yours sincerely,
>>>>>>
>>>>>> Wee-Beng Tay
>>>>>>
>>
>>