[petsc-users] PETSc initialization error

Junchao Zhang junchao.zhang at gmail.com
Sat Jun 20 13:40:25 CDT 2020


Sam,
   There are more problems. I am working on a fix.  Please wait an hour.
 Thanks.

--Junchao Zhang


On Sat, Jun 20, 2020 at 1:12 PM Sam Guo <sam.guo at cd-adapco.com> wrote:

> Junchao,
>    I debugged: MPI_Finalize is not called for serial.
>
> Barry,
>    I tried your patch and it seems better but eventually got following
> error:
>
> [0]PETSC ERROR: #1 PetscCommDuplicate() line 160 in
> ../../../petsc/src/sys/objects/tagm.c
> [0]PETSC ERROR: #2 PetscHeaderCreate_Private() line 64 in
> ../../../petsc/src/sys/objects/inherit.c
> [0]PETSC ERROR: #3 MatCreate() line 91 in
> ../../../petsc/src/mat/utils/gcreate.c
> [0]PETSC ERROR: #4 MatCreateShell() line 787 in
> ../../../petsc/src/mat/impls/shell/shell.c
> [0]PETSC ERROR: --------------------- Error Message
> --------------------------------------------------------------
> [0]PETSC ERROR: Null argument, when expecting valid pointer
> [0]PETSC ERROR: Null Object: Parameter # 1
> [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html
> for trouble shooting.
> [0]PETSC ERROR: Petsc Release Version 3.11.3, Jun, 26, 2019
> [0]PETSC ERROR: #6 MatShellSetOperation() line 1052 in
> ../../../petsc/src/mat/impls/shell/shell.c
>
> On Sat, Jun 20, 2020 at 10:24 AM Barry Smith <bsmith at petsc.dev> wrote:
>
>>
>>    Junchao,
>>
>>      This is a good bug fix. It solves the problem when PetscInitialize
>> is called many times.
>>
>>      There is another fix you can do to limit PETSc mpiuni running out of
>> attributes inside a single PETSc run:
>>
>>
>> int MPI_Comm_create_keyval(MPI_Copy_function *copy_fn,MPI_Delete_function
>> *delete_fn,int *keyval,void *extra_state)
>> {
>>   int i;
>>
>>   if (num_attr >= MAX_ATTR) {
>>     for (i=0; i<num_attr; i++) {
>>       if (!attr_keyval[i].extra_state) {
>>         /* reuse this slot, freed by an earlier delete */
>>         attr_keyval[i].extra_state = extra_state;
>>         attr_keyval[i].del         = delete_fn;
>>         *keyval                    = i;
>>         return MPI_SUCCESS;
>>       }
>>     }
>>     /* no free slot left */
>>     return MPIUni_Abort(MPI_COMM_WORLD,1);
>>   }
>>   attr_keyval[num_attr].extra_state = extra_state;
>>   attr_keyval[num_attr].del         = delete_fn;
>>   *keyval                           = num_attr++;
>>   return MPI_SUCCESS;
>> }
>>
>>   This will work if the user creates tons of attributes but is constantly
>> deleting some as they create new ones, so long as the number outstanding at
>> any one time is less than MAX_ATTR.
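>>
>>   For the reuse loop above to ever find a free slot, MPI_Comm_free_keyval
>> would also have to clear extra_state when a keyval is released. A minimal
>> companion sketch (an illustration against the same attr_keyval array, with
>> the assumption that extra_state == 0 is what marks a slot as free):
>>
>> int MPI_Comm_free_keyval(int *keyval)
>> {
>>   /* mark the slot unused so MPI_Comm_create_keyval can hand it out again */
>>   attr_keyval[*keyval].extra_state = 0;
>>   attr_keyval[*keyval].del         = 0;
>>   *keyval = 0;  /* assumes 0 is never handed out as a valid keyval */
>>   return MPI_SUCCESS;
>> }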
>>
>> Barry
>>
>>
>>
>>
>>
>> On Jun 20, 2020, at 10:54 AM, Junchao Zhang <junchao.zhang at gmail.com>
>> wrote:
>>
>> I don't understand what you mean by "session". Let's try this patch:
>>
>> diff --git a/src/sys/mpiuni/mpi.c b/src/sys/mpiuni/mpi.c
>> index d559a513..c058265d 100644
>> --- a/src/sys/mpiuni/mpi.c
>> +++ b/src/sys/mpiuni/mpi.c
>> @@ -283,6 +283,7 @@ int MPI_Finalize(void)
>>    MPI_Comm_free(&comm);
>>    comm = MPI_COMM_SELF;
>>    MPI_Comm_free(&comm);
>> +  num_attr = 1; /* reset the counter */
>>    MPI_was_finalized = 1;
>>    return MPI_SUCCESS;
>>  }
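>>
>> For context, a sketch of the usage pattern the reset is meant to survive
>> (illustration only; the loop count of 200 is arbitrary):
>>
>> #include <petscsys.h>
>>
>> int main(void)
>> {
>>   int i;
>>   /* Each cycle re-creates communicator attribute keyvals through MPIUni's
>>      MPI_Comm_create_keyval; without resetting num_attr in MPI_Finalize the
>>      static counter accumulates across cycles until it reaches MAX_ATTR and
>>      PetscInitialize starts to fail. */
>>   for (i = 0; i < 200; i++) {
>>     PetscInitializeNoArguments();
>>     PetscFinalize();
>>   }
>>   return 0;
>> }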
>>
>>
>> --Junchao Zhang
>>
>>
>> On Sat, Jun 20, 2020 at 10:48 AM Sam Guo <sam.guo at cd-adapco.com> wrote:
>>
>>> Typo: I mean “Assuming initializer is only needed once for entire
>>> session”
>>>
>>> On Saturday, June 20, 2020, Sam Guo <sam.guo at cd-adapco.com> wrote:
>>>
>>>> Assuming finalizer is only needed once for entire session(?), I can put
>>>> the initializer into the static block to call it once, but where do I
>>>> call the finalizer?
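>>>>
>>>> One possible arrangement, sketched here as an illustration only (the
>>>> helper names ensure_initialized/finalize_session are made up; the
>>>> initialize calls are the same ones already used further down in this
>>>> thread): do the one-time setup lazily and hand the finalizer to atexit(),
>>>> so it runs once at process exit:
>>>>
>>>> #include <slepcsys.h>
>>>> #include <stdlib.h>
>>>>
>>>> static void finalize_session(void)
>>>> {
>>>>   /* runs once at process exit, in the same order as the per-call code */
>>>>   SlepcFinalize();
>>>>   PetscFinalize();
>>>> }
>>>>
>>>> void ensure_initialized(int argc, char **args, const char *help)
>>>> {
>>>>   static int started = 0;          /* process-wide one-time guard */
>>>>   if (started) return;
>>>>   PetscInitializeNoPointers(argc, args, NULL, NULL);
>>>>   SlepcInitialize(&argc, &args, NULL, help);
>>>>   atexit(finalize_session);        /* register the matching teardown */
>>>>   started = 1;
>>>> }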
>>>>
>>>>
>>>> On Saturday, June 20, 2020, Junchao Zhang <junchao.zhang at gmail.com>
>>>> wrote:
>>>>
>>>>> The counter num_attr should be recycled. But first, try calling
>>>>> PetscInitialize/PetscFinalize only once to see if that fixes the error.
>>>>> --Junchao Zhang
>>>>>
>>>>>
>>>>> On Sat, Jun 20, 2020 at 12:48 AM Sam Guo <sam.guo at cd-adapco.com>
>>>>> wrote:
>>>>>
>>>>>> To clarify, I call PETSc initialize and PETSc finalize every time I
>>>>>> call SLEPc:
>>>>>>
>>>>>>   PetscInitializeNoPointers(argc,args,nullptr,nullptr);
>>>>>>
>>>>>>   SlepcInitialize(&argc,&args,static_cast<char*>(nullptr),help);
>>>>>>
>>>>>>   //calling slepc
>>>>>>
>>>>>>   SlepcFinalize();
>>>>>>
>>>>>>    PetscFinalize();
>>>>>>
>>>>>>
>>>>>>
>>>>>> On Fri, Jun 19, 2020 at 10:32 PM Sam Guo <sam.guo at cd-adapco.com>
>>>>>> wrote:
>>>>>>
>>>>>>> Dear PETSc team,
>>>>>>>    When I called SLEPc multiple times, I eventually got the following
>>>>>>> error:
>>>>>>>
>>>>>>> MPI operation not supported by PETSc's sequential MPI wrappers
>>>>>>> [0]PETSC ERROR: #1 PetscInitialize() line 967 in
>>>>>>> ../../../petsc/src/sys/objects/pinit.c
>>>>>>> [0]PETSC ERROR: #2 SlepcInitialize() line 262 in
>>>>>>> ../../../slepc/src/sys/slepcinit.c
>>>>>>> [0]PETSC ERROR: #3 SlepcInitializeNoPointers() line 359 in
>>>>>>> ../../../slepc/src/sys/slepcinit.c
>>>>>>> PETSC ERROR: Logging has not been enabled.
>>>>>>> You might have forgotten to call PetscInitialize().
>>>>>>>
>>>>>>>   I debugged it: it is caused by the following check in
>>>>>>> petsc/src/sys/mpiuni/mpi.c
>>>>>>>
>>>>>>> if (num_attr >= MAX_ATTR)
>>>>>>>
>>>>>>> in function int MPI_Comm_create_keyval(MPI_Copy_function
>>>>>>> *copy_fn,MPI_Delete_function *delete_fn,int *keyval,void *extra_state)
>>>>>>>
>>>>>>> num_attr is declared static and keeps increasing every
>>>>>>> time MPI_Comm_create_keyval is called.
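>>>>>>>
>>>>>>> Put together, the unpatched logic being described looks roughly like
>>>>>>> this (a paraphrase for illustration, not an exact copy of mpi.c):
>>>>>>>
>>>>>>> static int num_attr = 1;   /* file-static, never reset or decremented */
>>>>>>>
>>>>>>> int MPI_Comm_create_keyval(MPI_Copy_function *copy_fn,
>>>>>>>                            MPI_Delete_function *delete_fn,
>>>>>>>                            int *keyval, void *extra_state)
>>>>>>> {
>>>>>>>   if (num_attr >= MAX_ATTR) return MPIUni_Abort(MPI_COMM_WORLD,1);
>>>>>>>   attr_keyval[num_attr].extra_state = extra_state;
>>>>>>>   attr_keyval[num_attr].del         = delete_fn;
>>>>>>>   *keyval                           = num_attr++;  /* only ever grows */
>>>>>>>   return MPI_SUCCESS;
>>>>>>> }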
>>>>>>>
>>>>>>> I am using PETSc 3.11.3 but found that 3.13.2 has the same logic.
>>>>>>>
>>>>>>> Is this a bug, or am I not using it correctly?
>>>>>>>
>>>>>>> Thanks,
>>>>>>> Sam
>>>>>>>
>>>>>>
>>