[petsc-users] PCMG with shell restrictions

Matteo Semplice matteo.semplice at uninsubria.it
Thu Dec 12 05:22:02 CST 2024


Hi. Indeed, creating a vec as in DMCreateInterpolationScale() does the job.
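
For the record, here is roughly the construction (a sketch, not our actual
code; Ashell stands for our shell interpolation matrix between the coarse
DM dmc and the fine DM dmf), following what DMCreateInterpolationScale()
does internally:

  /* sketch: build the rscale Vec the same way DMCreateInterpolationScale() does */
  Vec ones, rscale;
  PetscCall(DMCreateGlobalVector(dmf, &ones));   /* fine-grid work vector    */
  PetscCall(DMCreateGlobalVector(dmc, &rscale)); /* coarse-grid scale vector */
  PetscCall(VecSet(ones, 1.0));
  PetscCall(MatRestrict(Ashell, ones, rscale));  /* (shell) restriction applied to a vector of ones */
  PetscCall(VecReciprocal(rscale));              /* point-wise inverse       */
  PetscCall(VecDestroy(&ones));

The resulting rscale is what our DMSHELL interpolation hook now returns in
its Vec output, so PCSetUp_MG() no longer sees an uninitialized vector.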

Thank you!

Matteo & Samuele

On 10/12/2024 17:27, Barry Smith wrote:

>
>    Beats me :-) Just set it to whatever the DMDA sets it to.
>
>   Barry
>
>
>> On Dec 10, 2024, at 11:20 AM, Matteo Semplice 
>> <matteo.semplice at uninsubria.it> wrote:
>>
>> Thanks, Barry.
>>
>> Just to be sure, one should refer to the help of 
>> https://petsc.org/release/manualpages/DM/DMCreateInterpolationScale/ 
>> and the vec should be set to the point-wise inverse of 
>> (INTERPOLATION)*(vector of ones)?
>>
>> Matteo
>>
>> On 10/12/2024 16:38, Barry Smith wrote:
>>>
>>>    It appears you are completely ignoring the vec argument? Take a 
>>> look at, for example, DMCreateInterpolation_DA(); you will see you 
>>> need to provide an appropriate vec in the same way you need to (and 
>>> do) provide an appropriate mat.
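
(Annotating for the archive: this was indeed the problem. The DMSHELL
interpolation hook has the same calling sequence as DMCreateInterpolation(),
so the Vec output must be set whenever it is requested; leaving it untouched
is what produced the uninitialized value that valgrind flagged. A sketch,
where BuildShellInterpolation() is a placeholder for whatever assembles the
MatShell:)

  static PetscErrorCode CreateInterpolationShell(DM dmc, DM dmf, Mat *A, Vec *rscale)
  {
    Vec ones;

    PetscFunctionBeginUser;
    PetscCall(BuildShellInterpolation(dmc, dmf, A)); /* placeholder: create the shell interpolation */
    if (rscale) {                                    /* the caller may or may not ask for the scale Vec */
      PetscCall(DMCreateGlobalVector(dmf, &ones));
      PetscCall(DMCreateGlobalVector(dmc, rscale));
      PetscCall(VecSet(ones, 1.0));
      PetscCall(MatRestrict(*A, ones, *rscale));
      PetscCall(VecReciprocal(*rscale));
      PetscCall(VecDestroy(&ones));
    }
    PetscFunctionReturn(PETSC_SUCCESS);
  }

This hook is registered on our DMSHELLs with DMShellSetCreateInterpolation().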
>>>
>>>    Barry
>>>
>>>
>>>> On Dec 10, 2024, at 7:02 AM, Matteo Semplice via petsc-users 
>>>> <petsc-users at mcs.anl.gov> wrote:
>>>>
>>>> Dear petsc-users,
>>>>
>>>>     A student and I are trying to modify the MG example ex65 to use 
>>>> MatShells instead of assembled matrices. Our use case is a method 
>>>> that will use custom shell operators and shell 
>>>> interpolations/restrictions.
>>>>
>>>> To start, we modified ex65, replacing the standard 
>>>> restriction/interpolation with a shell matrix that performs the 
>>>> same operations on the DMDA grids.
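
(For later readers: the kind of shell operator meant here is, schematically,
something like the following; ProlongateShell()/RestrictShell() and the
context ctx are placeholders for the actual apply routines, and daf/dac are
the fine and coarse DMDAs:)

  Mat      Ashell;
  Vec      vf, vc;
  PetscInt mf, mc;

  PetscCall(DMCreateGlobalVector(daf, &vf)); /* fine vector, used only to size the shell   */
  PetscCall(DMCreateGlobalVector(dac, &vc)); /* coarse vector, used only to size the shell */
  PetscCall(VecGetLocalSize(vf, &mf));
  PetscCall(VecGetLocalSize(vc, &mc));
  PetscCall(MatCreateShell(PETSC_COMM_WORLD, mf, mc, PETSC_DETERMINE, PETSC_DETERMINE, ctx, &Ashell));
  PetscCall(MatShellSetOperation(Ashell, MATOP_MULT, (void (*)(void))ProlongateShell));         /* coarse -> fine */
  PetscCall(MatShellSetOperation(Ashell, MATOP_MULT_TRANSPOSE, (void (*)(void))RestrictShell)); /* fine -> coarse */
  PetscCall(VecDestroy(&vf));
  PetscCall(VecDestroy(&vc));

With both operations set, MatInterpolate()/MatRestrict(), and hence PCMG, can
apply the shell in both directions.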
>>>>
>>>> I attach our modifications to ex65.
>>>>
>>>> The problem is that the code sometimes completes execution and 
>>>> sometimes errors out like this:
>>>>
>>>> $ ./ex65shell -ksp_monitor -pc_type mg -da_refine 2 -ksp_rtol 1e-1
>>>> >> Created DMshell 0x55bbc83c91a0 (0x55bbc8390a80)
>>>> Calling KSPSolve from main
>>>> computeRHS on grid 513
>>>> computeMatrix on grid 513
>>>> Inside Coarsen
>>>> >> Created DMshell 0x55bbc84c5270 (0x55bbc84b2ff0)
>>>> Inside Coarsen
>>>> >> Created DMshell 0x55bbc84e0a30 (0x55bbc84bdb60)
>>>> >> Create interpolation from 0x55bbc84c5270(0x55bbc84b2ff0) to 
>>>> 0x55bbc83c91a0(0x55bbc8390a80)
>>>> [0]PETSC ERROR: --------------------- Error Message 
>>>> --------------------------------------------------------------
>>>> [0]PETSC ERROR: Invalid pointer
>>>> [0]PETSC ERROR: Invalid Pointer to PetscObject: Argument 'obj' 
>>>> (parameter # 1)
>>>> [0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble 
>>>> shooting.
>>>> [0]PETSC ERROR: Petsc Release Version 3.22.0, unknown
>>>> [0]PETSC ERROR: ./ex65shell with 1 MPI process(es) and PETSC_ARCH 
>>>>  on signalkuppe by matteo Tue Dec 10 12:55:12 2024
>>>> [0]PETSC ERROR: Configure options: 
>>>> --prefix=/home/matteo/software/petscsaved/3.22-opt/ 
>>>> PETSC_DIR=/home/matteo/software/petsc --PETSC_ARCH=dbg 
--with-debugging=1 --with-strict-petscerrorcode --download-hdf5 
>>>> --download-ml --with-metis 
>>>> --with-parmetis --with-gmsh --with-triangle --with-zlib 
>>>> --with-p4est-dir=~/software/p4est/local/
>>>> [0]PETSC ERROR: #1 PetscObjectReference() at 
>>>> /home/matteo/software/petsc/src/sys/objects/inherit.c:620
>>>> [0]PETSC ERROR: #2 PCMGSetRScale() at 
>>>> /home/matteo/software/petsc/src/ksp/pc/impls/mg/mgfunc.c:394
>>>> [0]PETSC ERROR: #3 PCSetUp_MG() at 
>>>> /home/matteo/software/petsc/src/ksp/pc/impls/mg/mg.c:998
>>>> [0]PETSC ERROR: #4 PCSetUp() at 
>>>> /home/matteo/software/petsc/src/ksp/pc/interface/precon.c:1071
>>>> [0]PETSC ERROR: #5 KSPSetUp() at 
>>>> /home/matteo/software/petsc/src/ksp/ksp/interface/itfunc.c:415
>>>> [0]PETSC ERROR: #6 KSPSolve_Private() at 
>>>> /home/matteo/software/petsc/src/ksp/ksp/interface/itfunc.c:826
>>>> [0]PETSC ERROR: #7 KSPSolve() at 
>>>> /home/matteo/software/petsc/src/ksp/ksp/interface/itfunc.c:1075
>>>> [0]PETSC ERROR: #8 main() at ../src/ex65shell.c:79
>>>> [0]PETSC ERROR: PETSc Option Table entries:
>>>> [0]PETSC ERROR: -da_refine 2 (source: command line)
>>>> [0]PETSC ERROR: -ksp_monitor (source: command line)
>>>> [0]PETSC ERROR: -ksp_rtol 1e-1 (source: command line)
>>>> [0]PETSC ERROR: -pc_type mg (source: command line)
>>>> [0]PETSC ERROR: ----------------End of Error Message -------send 
>>>> entire error message to petsc-maint at mcs.anl.gov----------
>>>>
>>>> In all cases, valgrind complains like this:
>>>>
>>>> $ valgrind ./ex65shell -ksp_monitor -pc_type mg -da_refine 2 
>>>> -ksp_rtol 1e-1 -int_view ascii::ascii_info
>>>> ==2130767== Memcheck, a memory error detector
>>>> ==2130767== Copyright (C) 2002-2022, and GNU GPL'd, by Julian 
>>>> Seward et al.
>>>> ==2130767== Using Valgrind-3.19.0 and LibVEX; rerun with -h for 
>>>> copyright info
>>>> ==2130767== Command: ./ex65shell -ksp_monitor -pc_type mg 
>>>> -da_refine 2 -ksp_rtol 1e-1 -int_view ascii::ascii_info
>>>> ==2130767==
>>>> hwloc x86 backend cannot work under Valgrind, disabling.
>>>> May be reenabled by dumping CPUIDs with hwloc-gather-cpuid
>>>> and reloading them under Valgrind with HWLOC_CPUID_PATH.
>>>> >> Created DMshell 0xfbf50f0 (0xfb49480)
>>>> Calling KSPSolve from main
>>>> computeRHS on grid 513
>>>> computeMatrix on grid 513
>>>> Inside Coarsen
>>>> >> Created DMshell 0xff404a0 (0xff062d0)
>>>> Inside Coarsen
>>>> >> Created DMshell 0xff78ca0 (0xff4a530)
>>>> >> Create interpolation from 0xff404a0(0xff062d0) to 
>>>> 0xfbf50f0(0xfb49480)
>>>> Mat Object: 1 MPI process
>>>>  type: shell
>>>>  rows=513, cols=257
>>>> ==2130767== Conditional jump or move depends on uninitialised value(s)
>>>> ==2130767==    at 0x83FDAAE: PCSetUp_MG (mg.c:998)
>>>> ==2130767==    by 0x8620C04: PCSetUp (precon.c:1071)
>>>> ==2130767==    by 0x7DA78D5: KSPSetUp (itfunc.c:415)
>>>> ==2130767==    by 0x7DB2093: KSPSolve_Private (itfunc.c:826)
>>>> ==2130767==    by 0x7DB7A53: KSPSolve (itfunc.c:1075)
>>>> ==2130767==    by 0x10D2F7: main (ex65shell.c:79)
>>>> ==2130767==
>>>> ==2130767== Conditional jump or move depends on uninitialised value(s)
>>>> ==2130767==    at 0x5794061: VecDestroy (vector.c:570)
>>>> ==2130767==    by 0x83FDC36: PCSetUp_MG (mg.c:999)
>>>> ==2130767==    by 0x8620C04: PCSetUp (precon.c:1071)
>>>> ==2130767==    by 0x7DA78D5: KSPSetUp (itfunc.c:415)
>>>> ==2130767==    by 0x7DB2093: KSPSolve_Private (itfunc.c:826)
>>>> ==2130767==    by 0x7DB7A53: KSPSolve (itfunc.c:1075)
>>>> ==2130767==    by 0x10D2F7: main (ex65shell.c:79)
>>>> ==2130767==
>>>>
>>>> We are clearly doing something wrong, since PCSetUp_MG (mg.c:998) 
>>>> is an area of the code that I wouldn't expect us to enter.
>>>>
>>>> Can you advise on the proper way to achieve our goal?
>>>>
>>>> Best regards
>>>>
>>>>     Matteo
>>>>
>>>> <ex65.patch>
>>>
>> -- 
>> Prof. Matteo Semplice
>> Università degli Studi dell’Insubria
>> Dipartimento di Scienza e Alta Tecnologia – DiSAT
>> Professore Associato
>> Via Valleggio, 11 – 22100 Como (CO) – Italia
>> tel.: +39 031 2386316
>
-- 
Prof. Matteo Semplice
Università degli Studi dell’Insubria
Dipartimento di Scienza e Alta Tecnologia – DiSAT
Professore Associato
Via Valleggio, 11 – 22100 Como (CO) – Italia
tel.: +39 031 2386316