[petsc-users] null space problem with -pc_type lu on single proc
Stefano Zampini
stefano.zampini at gmail.com
Thu Nov 7 07:20:27 CST 2024
The default sequential LU solver is PETSc's own, which supports neither
pivoting nor singular problems. In parallel it is either MUMPS or
SuperLU_DIST, depending on your configuration.
MUMPS, for example, can handle singular problems; I am not sure about SuperLU_DIST.
You can run the parallel version with -ksp_view to see which solver
package is used.
Supposing it is MUMPS, you can run the sequential code with
-pc_factor_mat_solver_type mumps
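
Put together, the diagnosis and the workaround would look roughly like this (a sketch using the `./refresco` executable from the error log below; the last line additionally enables MUMPS's null-pivot detection via ICNTL(24), which may or may not be needed for this matrix):

```shell
# Check which factorization package the parallel run selects
mpiexec -n 4 ./refresco -pc_type lu -ksp_view

# If it reports MUMPS, force the same package on a single proc
./refresco -pc_type lu -pc_factor_mat_solver_type mumps

# Optionally ask MUMPS to detect and handle null pivots (ICNTL(24)=1)
./refresco -pc_type lu -pc_factor_mat_solver_type mumps -mat_mumps_icntl_24 1
```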
On Thu, Nov 7, 2024 at 13:20 Klaij, Christiaan via petsc-users <
petsc-users at mcs.anl.gov> wrote:
> I'm trying to solve a system with a single, non-constant, null space
> vector, confirmed by passing the MatNullSpaceTest. Solving the system works
> fine with -pc_type ilu on one or multiple procs. It also works fine with
> -pc_type lu on multiple procs but fails on a single proc. Any idea what
> could be wrong?
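>
> [Editor's note: for reference, attaching a single non-constant null space
> vector as described above is typically done as follows. This is a minimal
> C sketch, not the poster's actual code; `A` and `nsvec` are placeholder
> names, and `nsvec` is assumed to be normalized.]

```c
#include <petscksp.h>

/* Attach a single, non-constant null space vector to the operator A.
   nsvec is assumed to exist and be normalized to unit length. */
PetscErrorCode attach_nullspace(Mat A, Vec nsvec)
{
  MatNullSpace nsp;
  PetscBool    isnull;

  PetscFunctionBeginUser;
  /* PETSC_FALSE: the null space does not contain the constant vector */
  PetscCall(MatNullSpaceCreate(PetscObjectComm((PetscObject)A), PETSC_FALSE,
                               1, &nsvec, &nsp));
  PetscCall(MatNullSpaceTest(nsp, A, &isnull)); /* the check the poster ran */
  PetscCall(MatSetNullSpace(A, nsp));   /* KSP then removes it from residuals */
  PetscCall(MatNullSpaceDestroy(&nsp)); /* A keeps its own reference */
  PetscFunctionReturn(PETSC_SUCCESS);
}
```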
>
> Chris
>
> [0]PETSC ERROR: *** unknown floating point error occurred ***
> [0]PETSC ERROR: The specific exception can be determined by running in a
> debugger. When the
> [0]PETSC ERROR: debugger traps the signal, the exception can be found with
> fetestexcept(0x3f)
> [0]PETSC ERROR: where the result is a bitwise OR of the following flags:
> [0]PETSC ERROR: FE_INVALID=0x1 FE_DIVBYZERO=0x4 FE_OVERFLOW=0x8
> FE_UNDERFLOW=0x10 FE_INEXACT=0x20
> [0]PETSC ERROR: Try option -start_in_debugger
> [0]PETSC ERROR: likely location of problem given in stack below
> [0]PETSC ERROR: --------------------- Stack Frames
> ------------------------------------
> [0]PETSC ERROR: The line numbers in the error traceback are not always
> exact.
> [0]PETSC ERROR: #1 PetscDefaultFPTrap() at
> /cm/shared/apps/petsc/oneapi/build/src/src/sys/error/fp.c:487
> [0]PETSC ERROR: #2 VecMDot_Seq() at
> /cm/shared/apps/petsc/oneapi/build/src/src/vec/vec/impls/seq/dvec2.c:189
> [0]PETSC ERROR: #3 VecMXDot_MPI_Default() at
> /cm/shared/apps/petsc/oneapi/build/src/include/../src/vec/vec/impls/mpi/pvecimpl.h:96
> [0]PETSC ERROR: #4 VecMDot_MPI() at
> /cm/shared/apps/petsc/oneapi/build/src/src/vec/vec/impls/mpi/pvec2.c:25
> [0]PETSC ERROR: #5 VecMXDot_Private() at
> /cm/shared/apps/petsc/oneapi/build/src/src/vec/vec/interface/rvector.c:1112
> [0]PETSC ERROR: #6 VecMDot() at
> /cm/shared/apps/petsc/oneapi/build/src/src/vec/vec/interface/rvector.c:1184
> [0]PETSC ERROR: #7 MatNullSpaceRemove() at
> /cm/shared/apps/petsc/oneapi/build/src/src/mat/interface/matnull.c:359
> [0]PETSC ERROR: #8 KSP_RemoveNullSpace() at
> /cm/shared/apps/petsc/oneapi/build/src/include/petsc/private/kspimpl.h:322
> [0]PETSC ERROR: #9 KSP_PCApply() at
> /cm/shared/apps/petsc/oneapi/build/src/include/petsc/private/kspimpl.h:382
> [0]PETSC ERROR: #10 KSPInitialResidual() at
> /cm/shared/apps/petsc/oneapi/build/src/src/ksp/ksp/interface/itres.c:64
> [0]PETSC ERROR: #11 KSPSolve_GMRES() at
> /cm/shared/apps/petsc/oneapi/build/src/src/ksp/ksp/impls/gmres/gmres.c:226
> [0]PETSC ERROR: #12 KSPSolve_Private() at
> /cm/shared/apps/petsc/oneapi/build/src/src/ksp/ksp/interface/itfunc.c:898
> [0]PETSC ERROR: #13 KSPSolve() at
> /cm/shared/apps/petsc/oneapi/build/src/src/ksp/ksp/interface/itfunc.c:1070
> [0]PETSC ERROR: --------------------- Error Message
> --------------------------------------------------------------
> [0]PETSC ERROR: Floating point exception
> [0]PETSC ERROR: trapped floating point error
> [0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
> [0]PETSC ERROR: Petsc Release Version 3.19.4, Jul 31, 2023
> [0]PETSC ERROR: ./refresco on a named marclus3login2 by cklaij Thu Nov 7
> 11:03:22 2024
> [0]PETSC ERROR: Configure options
> --prefix=/cm/shared/apps/petsc/oneapi/3.19.4-dbg
> --with-mpi-dir=/cm/shared/apps/intel/oneapi/mpi/2021.4.0 --with-x=0
> --with-mpe=0 --with-debugging=1
> --download-superlu_dist=../superlu_dist-8.1.2.tar.gz
> --with-blaslapack-dir=/cm/shared/apps/intel/oneapi/mkl/2021.4.0
> --download-parmetis=../parmetis-4.0.3-p9.tar.gz
> --download-metis=../metis-5.1.0-p11.tar.gz
> --with-packages-build-dir=/cm/shared/apps/petsc/oneapi/build --with-ssl=0
> --with-shared-libraries=1 CFLAGS="-std=gnu11 -Wall -funroll-all-loops -O3
> -DNDEBUG" CXXFLAGS="-std=gnu++14 -Wall -funroll-all-loops -O3 -DNDEBUG"
> COPTFLAGS="-std=gnu11 -Wall -funroll-all-loops -O3 -DNDEBUG"
> CXXOPTFLAGS="-std=gnu++14 -Wall -funroll-all-loops -O3 -DNDEBUG"
> FCFLAGS="-funroll-all-loops -O3 -DNDEBUG" F90FLAGS="-funroll-all-loops -O3
> -DNDEBUG" FOPTFLAGS="-funroll-all-loops -O3 -DNDEBUG"
> Abort(72) on node 0 (rank 0 in comm 0): application called
> MPI_Abort(MPI_COMM_WORLD, 72) - process 0
> dr. ir. Christiaan Klaij
> | Senior Researcher | Research & Development
> T +31 317 49 33 44 | C.Klaij at marin.nl | https://www.marin.nl
>
--
Stefano