[petsc-dev] Fwd: [petsc-maint] [Ext] Re: Question on PC usage with PETSc-3.7
Barry Smith
bsmith at mcs.anl.gov
Thu Apr 6 17:57:16 CDT 2017
Should we try to support a wildcard -*_option xxx? Of course we can't use * itself.
Barry
> Begin forwarded message:
>
> From: Federico Golfre' Andreasi <FAndreasi at slb.com>
> Subject: Re: [petsc-maint] [Ext] Re: Question on PC usage with PETSc-3.7
> Date: April 6, 2017 at 5:09:54 AM CDT
> To: Matthew Knepley <knepley at gmail.com>, "petsc-maint at mcs.anl.gov" <petsc-maint at mcs.anl.gov>
>
> Hi Matt,
>
> Thank you very much for your reply and sorry for my delay in getting back to you.
>
> The suggestion of using the option “-pc_factor_shift_type nonzero” has solved the problem.
> But I ran into another issue; I don't know if it is intended:
>
> If I run with a single process, then "-pc_factor_shift_type nonzero" works as expected. Below you can see the output of PCView:
> PC Object: 1 MPI processes
>   type: ilu
>     ILU: out-of-place factorization
>     0 levels of fill
>     tolerance for zero pivot 2.22045e-14
>     using diagonal shift to prevent zero pivot [NONZERO]
>     matrix ordering: natural
>     factor fill ratio given 1., needed 1.
>       Factored matrix follows:
>         Mat Object: 1 MPI processes
>           type: seqaij
>           rows=72900, cols=72900
>           package used to perform factorization: petsc
>           total: nonzeros=355028, allocated nonzeros=355028
>           total number of mallocs used during MatSetValues calls =0
>             not using I-node routines
>   linear system matrix = precond matrix:
>   Mat Object: 1 MPI processes
>     type: seqaij
>     rows=72900, cols=72900
>     total: nonzeros=355028, allocated nonzeros=355028
>     total number of mallocs used during MatSetValues calls =0
>       not using I-node routines
>
> But when I run with multiple processes (e.g. 4), I have to give the option with the "sub_" prefix; otherwise I get:
> PC Object: 4 MPI processes
>   type: bjacobi
>     block Jacobi: number of blocks = 4
>     Local solve is same for all blocks, in the following KSP and PC objects:
>     KSP Object: (sub_) 1 MPI processes
>       type: preonly
>       maximum iterations=10000, initial guess is zero
>       tolerances: relative=1e-05, absolute=1e-50, divergence=10000.
>       left preconditioning
>       using DEFAULT norm type for convergence test
>     PC Object: (sub_) 1 MPI processes
>       type: ilu
>       PC has not been set up so information may be incomplete
>         ILU: out-of-place factorization
>         0 levels of fill
>         tolerance for zero pivot 2.22045e-14
>         matrix ordering: natural
>       linear system matrix = precond matrix:
>       Mat Object: 1 MPI processes
>         type: seqaij
>         rows=18225, cols=18225
>         total: nonzeros=18225, allocated nonzeros=18225
>         total number of mallocs used during MatSetValues calls =0
>           not using I-node routines
>   linear system matrix = precond matrix:
>   Mat Object: 4 MPI processes
>     type: mpiaij
>     rows=72900, cols=72900
>     total: nonzeros=355028, allocated nonzeros=355028
>     total number of mallocs used during MatSetValues calls =0
>       not using I-node (on process 0) routines
>
> Is there a way to set that shift for both situations (single and multi-process) with the same option?
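>
> (A minimal command-line sketch, not from the original mail: since PETSc silently ignores options that no object queries, passing both spellings on the same command line covers both runs. "./solver" is a hypothetical executable name, and if an options prefix such as the "ppc_mt_" from the snippet later in this thread is set on the PC, it has to be prepended to these option names.)
>
>   ./solver -pc_factor_shift_type nonzero                # 1 process: read by the ILU PC
>   mpiexec -n 4 ./solver -pc_factor_shift_type nonzero -sub_pc_factor_shift_type nonzero
>                                                         # 4 processes: the sub_ form is read by the ILU blocks inside bjacobi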
>
> Or, alternatively, I could call PCFactorSetShiftType(PPC, MAT_SHIFT_NONZERO, ierr),
> but I didn't manage to get a reference to the correct PC object in the multi-process case.
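>
> A minimal Fortran sketch of one way to do that (not from the original mail), assuming the outer PC ends up as bjacobi with one ILU block per process, as in the output above. It assumes the usual PETSc Fortran includes, and it must run after PCSetUp(PPC,...) but before PCApply, since the blocks are factored at the first apply:
>
>       KSP      subksp(1)
>       PC       subpc
>       PetscInt nlocal, first
>
>       ! retrieve the local block solver(s) owned by this process
>       call PCBJacobiGetSubKSP(PPC, nlocal, first, subksp, ierr); CHKERRQ(ierr)
>       ! set the zero-pivot shift on the block's factorization PC
>       call KSPGetPC(subksp(1), subpc, ierr); CHKERRQ(ierr)
>       call PCFactorSetShiftType(subpc, MAT_SHIFT_NONZERO, ierr); CHKERRQ(ierr)
>
> In the single-process run the PC is ILU directly, so PCFactorSetShiftType(PPC, MAT_SHIFT_NONZERO, ierr) on PPC itself is enough.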
>
> I have attached the code snippet if you would like to have a look.
> If you need it, I have a dataset (less than 10 MB) I can send.
>
> Thank you for your support!
> Federico
>
>
>
>
> From: Matthew Knepley [mailto:knepley at gmail.com]
> Sent: Monday, March 27, 2017 3:42 PM
> To: Federico Golfre' Andreasi <FAndreasi at slb.com>
> Cc: petsc-maint at mcs.anl.gov
> Subject: [Ext] Re: [petsc-maint] Question on PC usage with PETSc-3.7
>
> On Mon, Mar 27, 2017 at 6:02 AM, Federico Golfre' Andreasi <FAndreasi at slb.com> wrote:
> Dear PETSc-dev,
>
> We recently moved some of our code from PETSc-3.4 to PETSc-3.7.3 and we are facing an issue when using the PC object.
>
> We have a piece of FORTRAN code that does the following:
>
> call PCCreate(PETSC_COMM_WORLD, PPC, ierr); CHKERRQ(ierr)
> call PCSetOperators(PPC, H, H,ierr); CHKERRQ(ierr)
> call PCSetOptionsPrefix(PPC, 'ppc_mt_', ierr); CHKERRQ(ierr)
> call PCSetFromOptions(PPC, ierr); CHKERRQ(ierr)
> call PCSetUp(PPC, ierr); CHKERRQ(ierr)
> call PCApply(PPC, gNwork1, gNwork3, ierr); CHKERRQ(ierr)
> call VecNorm(gNwork3,NORM_2,tmpnorm,ierr); CHKERRQ(ierr)
>
> The code is run with the following options:
> -ppc_mt_sub_pc_type ilu
> -ppc_mt_sub_pc_factor_levels 1
> -ppc_mt_sub_pc_factor_fill 2
>
> Now that the code is built against PETSc-3.7.3, the norm of the output vector of PCApply is NaN.
> Has something changed in the operations required to set up the PC?
>
> 1) It's always possible this is memory corruption, so run under valgrind (a command-line sketch follows after these points).
>
> 2) The only thing I can think of that might give a NaN is a zero pivot. I believe we used to default to shifting,
> but now do not (although of course I cannot find this in the Changes file). You can try
>
> -pc_factor_shift_type nonzero
>
> and see if the NaN disappears.
>
> 3) If that does not work, is there a chance you could reproduce with a code we can run?
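>
> A command-line sketch for the valgrind run in point 1 (hypothetical executable name ./solver; --log-file=...%p gives one log per MPI process):
>
>   mpiexec -n 4 valgrind --tool=memcheck -q --num-callers=20 --log-file=valgrind.log.%p ./solver <usual options>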
>
> Thanks,
>
> Matt
>
> I have attached the log file of the configure and make.
>
> Thank you very much for your support,
> Federico
>
>
>
> Federico Golfre’ Andreasi
> Software Engineer Integrated EM Center of Excellence
> Schlumberger Geosolutions
>
> Postal address:
> Schlumberger Italiana Spa
> via Celeste Clericetti 42/A
> 20133 Milano - Italy
>
> +39 02 . 266 . 279 . 249 (direct)
> +39 02 . 266 . 279 . 279 (fax)
> fandreasi at slb.com
>
>
>
>
>
> --
> What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
> -- Norbert Wiener
-------------- next part --------------
A non-text attachment was scrubbed...
Name: main.F90
Type: application/octet-stream
Size: 2704 bytes
Desc: not available
URL: <http://lists.mcs.anl.gov/pipermail/petsc-dev/attachments/20170406/c7cbfc48/attachment.obj>