[petsc-users] null space problem with -pc_type lu on single proc
Klaij, Christiaan
C.Klaij at marin.nl
Thu Nov 7 08:42:27 CST 2024
Thanks for explaining, Stefano. I'm using SuperLU_DIST, so -pc_factor_mat_solver_type superlu_dist did the trick.
Chris
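[Editor's note: for the archives, the working sequential invocation implied above would look roughly like the following; the executable name is taken from the error log below and the remaining options from this thread.]

```shell
# Force the sequential LU factorization to use SuperLU_DIST instead of
# PETSc's built-in LU (which cannot handle the singular system); -ksp_view
# prints the solver configuration so the package choice can be verified.
./refresco -pc_type lu -pc_factor_mat_solver_type superlu_dist -ksp_view
```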
________________________________________
From: Stefano Zampini <stefano.zampini at gmail.com>
Sent: Thursday, November 7, 2024 2:20 PM
To: Klaij, Christiaan
Cc: petsc-users at mcs.anl.gov
Subject: Re: [petsc-users] null space problem with -pc_type lu on single proc
The default LU solver in sequential is the PETSc one, which does not support pivoting or singular problems. In parallel, it is either MUMPS or SUPERLU_DIST, depending on your configuration.
MUMPS, for example, can handle singular problems; I am not sure about SuperLU_DIST. You can run the parallel version with -ksp_view to see which solver package is used.
Supposing it is MUMPS, you can run the sequential code with -pc_factor_mat_solver_type mumps
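[Editor's note: the diagnosis steps above amount to the following two commands; "./app" is a placeholder for the actual executable.]

```shell
# Step 1: inspect which factorization package the parallel run picked up.
mpiexec -n 4 ./app -pc_type lu -ksp_view
# In the KSP view output, look for a line such as:
#   package used to perform factorization: mumps

# Step 2: if it is MUMPS, request the same package on a single process,
# bypassing PETSc's default sequential LU.
./app -pc_type lu -pc_factor_mat_solver_type mumps
```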
On Thu, Nov 7, 2024 at 13:20 Klaij, Christiaan via petsc-users <petsc-users at mcs.anl.gov> wrote:
I'm trying to solve a system with a single, non-constant null-space vector, confirmed by passing MatNullSpaceTest. Solving the system works fine with -pc_type ilu on one or multiple procs. It also works fine with -pc_type lu on multiple procs, but fails on a single proc. Any idea what could be wrong?
Chris
[0]PETSC ERROR: *** unknown floating point error occurred ***
[0]PETSC ERROR: The specific exception can be determined by running in a debugger. When the
[0]PETSC ERROR: debugger traps the signal, the exception can be found with fetestexcept(0x3f)
[0]PETSC ERROR: where the result is a bitwise OR of the following flags:
[0]PETSC ERROR: FE_INVALID=0x1 FE_DIVBYZERO=0x4 FE_OVERFLOW=0x8 FE_UNDERFLOW=0x10 FE_INEXACT=0x20
[0]PETSC ERROR: Try option -start_in_debugger
[0]PETSC ERROR: likely location of problem given in stack below
[0]PETSC ERROR: --------------------- Stack Frames ------------------------------------
[0]PETSC ERROR: The line numbers in the error traceback are not always exact.
[0]PETSC ERROR: #1 PetscDefaultFPTrap() at /cm/shared/apps/petsc/oneapi/build/src/src/sys/error/fp.c:487
[0]PETSC ERROR: #2 VecMDot_Seq() at /cm/shared/apps/petsc/oneapi/build/src/src/vec/vec/impls/seq/dvec2.c:189
[0]PETSC ERROR: #3 VecMXDot_MPI_Default() at /cm/shared/apps/petsc/oneapi/build/src/include/../src/vec/vec/impls/mpi/pvecimpl.h:96
[0]PETSC ERROR: #4 VecMDot_MPI() at /cm/shared/apps/petsc/oneapi/build/src/src/vec/vec/impls/mpi/pvec2.c:25
[0]PETSC ERROR: #5 VecMXDot_Private() at /cm/shared/apps/petsc/oneapi/build/src/src/vec/vec/interface/rvector.c:1112
[0]PETSC ERROR: #6 VecMDot() at /cm/shared/apps/petsc/oneapi/build/src/src/vec/vec/interface/rvector.c:1184
[0]PETSC ERROR: #7 MatNullSpaceRemove() at /cm/shared/apps/petsc/oneapi/build/src/src/mat/interface/matnull.c:359
[0]PETSC ERROR: #8 KSP_RemoveNullSpace() at /cm/shared/apps/petsc/oneapi/build/src/include/petsc/private/kspimpl.h:322
[0]PETSC ERROR: #9 KSP_PCApply() at /cm/shared/apps/petsc/oneapi/build/src/include/petsc/private/kspimpl.h:382
[0]PETSC ERROR: #10 KSPInitialResidual() at /cm/shared/apps/petsc/oneapi/build/src/src/ksp/ksp/interface/itres.c:64
[0]PETSC ERROR: #11 KSPSolve_GMRES() at /cm/shared/apps/petsc/oneapi/build/src/src/ksp/ksp/impls/gmres/gmres.c:226
[0]PETSC ERROR: #12 KSPSolve_Private() at /cm/shared/apps/petsc/oneapi/build/src/src/ksp/ksp/interface/itfunc.c:898
[0]PETSC ERROR: #13 KSPSolve() at /cm/shared/apps/petsc/oneapi/build/src/src/ksp/ksp/interface/itfunc.c:1070
[0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[0]PETSC ERROR: Floating point exception
[0]PETSC ERROR: trapped floating point error
[0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
[0]PETSC ERROR: Petsc Release Version 3.19.4, Jul 31, 2023
[0]PETSC ERROR: ./refresco on a named marclus3login2 by cklaij Thu Nov 7 11:03:22 2024
[0]PETSC ERROR: Configure options --prefix=/cm/shared/apps/petsc/oneapi/3.19.4-dbg --with-mpi-dir=/cm/shared/apps/intel/oneapi/mpi/2021.4.0 --with-x=0 --with-mpe=0 --with-debugging=1 --download-superlu_dist=../superlu_dist-8.1.2.tar.gz --with-blaslapack-dir=/cm/shared/apps/intel/oneapi/mkl/2021.4.0 --download-parmetis=../parmetis-4.0.3-p9.tar.gz --download-metis=../metis-5.1.0-p11.tar.gz --with-packages-build-dir=/cm/shared/apps/petsc/oneapi/build --with-ssl=0 --with-shared-libraries=1 CFLAGS="-std=gnu11 -Wall -funroll-all-loops -O3 -DNDEBUG" CXXFLAGS="-std=gnu++14 -Wall -funroll-all-loops -O3 -DNDEBUG" COPTFLAGS="-std=gnu11 -Wall -funroll-all-loops -O3 -DNDEBUG" CXXOPTFLAGS="-std=gnu++14 -Wall -funroll-all-loops -O3 -DNDEBUG" FCFLAGS="-funroll-all-loops -O3 -DNDEBUG" F90FLAGS="-funroll-all-loops -O3 -DNDEBUG" FOPTFLAGS="-funroll-all-loops -O3 -DNDEBUG"
Abort(72) on node 0 (rank 0 in comm 0): application called MPI_Abort(MPI_COMM_WORLD, 72) - process 0
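[Editor's note: a minimal sketch of the null-space setup described above, using the standard PETSc calls; the matrix A, the vector nullvec, and the surrounding error handling are assumed, not taken from the poster's code.]

```c
/* Sketch (untested): attach a single, non-constant null-space vector to A.
 * The vectors passed to MatNullSpaceCreate() must be orthonormal. */
Vec          nullvec;   /* assumed: the known null-space vector of A */
MatNullSpace nsp;
PetscBool    isnull;

PetscCall(VecNormalize(nullvec, NULL));  /* ensure unit norm */
PetscCall(MatNullSpaceCreate(PETSC_COMM_WORLD, PETSC_FALSE /* no constant */,
                             1, &nullvec, &nsp));
PetscCall(MatNullSpaceTest(nsp, A, &isnull));  /* checks A * nullvec ~ 0 */
PetscCall(MatSetNullSpace(A, nsp));  /* KSP removes the null space each iteration */
PetscCall(MatNullSpaceDestroy(&nsp));
```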
dr. ir. Christiaan Klaij
| Senior Researcher | Research & Development
T +31 317 49 33 44 | C.Klaij at marin.nl | http://www.marin.nl
--
Stefano