[petsc-users] snes matrix-free jacobian fails with preconditioner shape mismatch

Barry Smith bsmith at petsc.dev
Sun Mar 10 12:44:39 CDT 2024



> On Mar 10, 2024, at 10:16 AM, Yi Hu <y.hu at mpie.de> wrote:
> 
> Dear Mark,
> 
> Thanks for your reply. I see this mismatch. In fact my global DoF is 324. It seems I always get local size = global DoF / np^2, where np is the number of processors. By the way, I used DMDASNESsetFunctionLocal() to set my form function. Is it valid to mix DMDASNESsetFunctionLocal() and a native SNESSetJacobian()?
> 

Yes
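
For example, a minimal sketch (all identifiers here are illustrative placeholders, not taken from your code). The two mechanisms are independent: the DM owns the residual callback, while SNESSetJacobian() attaches whatever Mat you provide.

   call SNESSetDM(snes,dm,ierr)
   ! residual is evaluated through the DMDA local callback
   call DMDASNESSetFunctionLocal(dm,INSERT_VALUES,formResidualLocal,PETSC_NULL_SNES,ierr)
   ! Jacobian is supplied separately as a user shell matrix
   call SNESSetJacobian(snes,J_shell,J_shell,PETSC_NULL_FUNCTION,0,ierr)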

> Best,
> 
> Yi
> 
> On 3/10/24 13:55, Mark Adams wrote:
>> It looks like your input vector is the global vector, size 162, and the local matrix size is 81.
>> Mark
>> [1]PETSC ERROR: Preconditioner number of local rows 81 does not equal 
>> input vector size 162
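>> 
>> If the shell matrix is created with PETSC_DECIDE local sizes against a size
>> computed per rank, its layout need not match the DMDA's. A hedged sketch of
>> sizing it from the DM instead (v, nloc, ctx, J are illustrative names):
>> 
>>     call DMCreateGlobalVector(dm,v,ierr)
>>     call VecGetLocalSize(v,nloc,ierr)
>>     ! local sizes come from the DM; let PETSc derive the global sizes
>>     call MatCreateShell(PETSC_COMM_WORLD,nloc,nloc,PETSC_DETERMINE,PETSC_DETERMINE,&
>>                         ctx,J,ierr)
>>     call VecDestroy(v,ierr)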
>> 
>> On Sun, Mar 10, 2024 at 7:21 AM Yi Hu <y.hu at mpie.de> wrote:
>>>  
>>> Dear petsc team,
>>> 
>>> I implemented a matrix-free Jacobian, and it runs fine sequentially. But 
>>> when running in parallel I get the PC error below (run with mpirun -np 2; 
>>> only the error from rank 1 is shown here):
>>> 
>>> [1]PETSC ERROR: --------------------- Error Message 
>>> --------------------------------------------------------------
>>> [1]PETSC ERROR: Nonconforming object sizes
>>> [1]PETSC ERROR: Preconditioner number of local rows 81 does not equal 
>>> input vector size 162
>>> [1]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
>>> [1]PETSC ERROR: Petsc Release Version 3.17.3, Jun 29, 2022
>>> [1]PETSC ERROR: /home/yi/workspace/DAMASK_yi/bin/DAMASK_grid on a 
>>> arch-linux-c-opt named carbon-x1 by yi Sun Mar 10 12:01:46 2024
>>> [1]PETSC ERROR: Configure options --download-fftw --download-hdf5 
>>> --with-hdf5-fortran-bindings --download-fblaslapack --download-chaco 
>>> --download-hypre --download-metis --download-mumps --download-parmetis 
>>> --download-scalapack --download-suitesparse --download-superlu 
>>> --download-superlu_dist --download-triangle --download-zlib 
>>> --download-cmake --with-cxx-dialect=C++11 --with-c2html=0 
>>> --with-debugging=0 --with-ssl=0 --with-x=0 COPTFLAGS=-O3 CXXOPTFLAGS=-O3 
>>> FOPTFLAGS=-O3
>>> [1]PETSC ERROR: #1 PCApply() at 
>>> /home/yi/App/petsc-3.17.3/src/ksp/pc/interface/precon.c:424
>>> [1]PETSC ERROR: #2 KSP_PCApply() at 
>>> /home/yi/App/petsc-3.17.3/include/petsc/private/kspimpl.h:376
>>> [1]PETSC ERROR: #3 KSPInitialResidual() at 
>>> /home/yi/App/petsc-3.17.3/src/ksp/ksp/interface/itres.c:64
>>> [1]PETSC ERROR: #4 KSPSolve_GMRES() at 
>>> /home/yi/App/petsc-3.17.3/src/ksp/ksp/impls/gmres/gmres.c:242
>>> [1]PETSC ERROR: #5 KSPSolve_Private() at 
>>> /home/yi/App/petsc-3.17.3/src/ksp/ksp/interface/itfunc.c:902
>>> [1]PETSC ERROR: #6 KSPSolve() at 
>>> /home/yi/App/petsc-3.17.3/src/ksp/ksp/interface/itfunc.c:1078
>>> [1]PETSC ERROR: #7 SNESSolve_NEWTONLS() at 
>>> /home/yi/App/petsc-3.17.3/src/snes/impls/ls/ls.c:222
>>> [1]PETSC ERROR: #8 SNESSolve() at 
>>> /home/yi/App/petsc-3.17.3/src/snes/interface/snes.c:4756
>>> [1]PETSC ERROR: #9 User provided function() at User file:0
>>> 
>>> However, the SNES matrix-free documentation 
>>> (https://petsc.org/release/manual/snes/#matrix-free-methods) says that 
>>> matrix-free is used with PCNONE. So I assumed no preconditioner would be 
>>> applied, but preconditioning did happen, probably using my matrix-free 
>>> shell matrix itself. Here is how I initialize my shell matrix and the 
>>> corresponding customized multiplication:
>>> 
>>>    call MatCreateShell(PETSC_COMM_WORLD,PETSC_DECIDE,PETSC_DECIDE,&
>>>                        int(9*product(cells(1:2))*cells3,pPETSCINT),&
>>>                        int(9*product(cells(1:2))*cells3,pPETSCINT),&
>>>                        F_PETSc,Jac_PETSc,err_PETSc)
>>>    call MatShellSetOperation(Jac_PETSc,MATOP_MULT,GK_op,err_PETSc)
>>>    call SNESSetDM(SNES_mech,DM_mech,err_PETSc)
>>>    call SNESSetJacobian(SNES_mech,Jac_PETSc,Jac_PETSc,PETSC_NULL_FUNCTION,0,err_PETSc)
>>>    call SNESGetKSP(SNES_mech,ksp,err_PETSc)
>>>    call KSPGetPC(ksp,pc,err_PETSc)   ! pc must be obtained from the KSP before PCSetType
>>>    call PCSetType(pc,PCNONE,err_PETSc)
>>> 
>>> And my GK_op is like
>>> 
>>> subroutine GK_op(Jac,dF_global,output_local,err_PETSc)
>>> 
>>>    DM                                   :: dm_local
>>>    Vec                                  :: dF_global, dF_local, output_local
>>>    Mat                                  :: Jac
>>>    PetscErrorCode                       :: err_PETSc
>>> 
>>>    real(pREAL), pointer,dimension(:,:,:,:) :: dF_scal, output_scal
>>> 
>>>    real(pREAL), dimension(3,3,cells(1),cells(2),cells3) :: dF
>>>    real(pREAL), dimension(3,3,cells(1),cells(2),cells3) :: output
>>> 
>>>    call SNESGetDM(SNES_mech,dm_local,err_PETSc)
>>> 
>>>    call DMGetLocalVector(dm_local,dF_local,err_PETSc)
>>>    call DMGlobalToLocalBegin(dm_local,dF_global,INSERT_VALUES,dF_local,err_PETSc)
>>>    call DMGlobalToLocalEnd(dm_local,dF_global,INSERT_VALUES,dF_local,err_PETSc)
>>> 
>>>    call DMDAVecGetArrayReadF90(dm_local,dF_local,dF_scal,err_PETSc)
>>>    dF = reshape(dF_scal, [3,3,cells(1),cells(2),cells3])
>>> 
>>> .......
>>> 
>>> 
>>>    call DMDAVecRestoreArrayF90(dm_local,output_local,output_scal,err_PETSc)
>>>    CHKERRQ(err_PETSc)
>>> 
>>>    call DMDAVecRestoreArrayF90(dm_local,dF_local,dF_scal,err_PETSc)
>>>    CHKERRQ(err_PETSc)
>>> 
>>> end subroutine GK_op
>>> 
>>> I checked my cells3; it corresponds to my local size, and the local size 
>>> of dF_local seems to be correct.
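>>> 
>>> A hedged diagnostic sketch to compare the two layouts (m_loc, n_loc, 
>>> v_tmp, v_loc are illustrative names, not from my code):
>>> 
>>>    call MatGetLocalSize(Jac_PETSc,m_loc,n_loc,err_PETSc)
>>>    call DMGetGlobalVector(dm_local,v_tmp,err_PETSc)
>>>    call VecGetLocalSize(v_tmp,v_loc,err_PETSc)
>>>    print *, 'mat local rows:', m_loc, ' vec local size:', v_loc
>>>    call DMRestoreGlobalVector(dm_local,v_tmp,err_PETSc)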
>>> 
>>> I am at a loss to find the reason for this preconditioner size mismatch. 
>>> Could you help me with this? Thanks.
>>> 
>>> Best regards,
>>> 
>>> Yi
>>> 
