[petsc-users] A bad commit affects MOOSE

Kong, Fande fande.kong at inl.gov
Mon Apr 2 17:19:33 CDT 2018


Nope.

Here is a backtrace:

* thread #1: tid = 0x3b477b4, 0x00007fffb306cd42 libsystem_kernel.dylib`__pthread_kill + 10, queue = 'com.apple.main-thread', stop reason = signal SIGABRT
  * frame #0: 0x00007fffb306cd42 libsystem_kernel.dylib`__pthread_kill + 10
    frame #1: 0x00007fffb315a457 libsystem_pthread.dylib`pthread_kill + 90
    frame #2: 0x00007fffb2fd2420 libsystem_c.dylib`abort + 129
    frame #3: 0x00000001057ff30a libpetsc.3.07.dylib`Petsc_MPI_AbortOnError(comm=<unavailable>, flag=<unavailable>) + 26 at init.c:185 [opt]
    frame #4: 0x0000000106bd3245 libpmpi.12.dylib`MPIR_Err_return_comm + 533
    frame #5: 0x00000001068defd4 libmpi.12.dylib`MPI_Comm_create + 3492
    frame #6: 0x00000001061345d9 libpetsc.3.07.dylib`hypre_GenerateSubComm(comm=-1006627852, participate=<unavailable>, new_comm_ptr=<unavailable>) + 409 at gen_redcs_mat.c:531 [opt]
    frame #7: 0x000000010618f8ba libpetsc.3.07.dylib`hypre_GaussElimSetup(amg_data=0x00007fe7ff857a00, level=<unavailable>, relax_type=9) + 74 at par_relax.c:4209 [opt]
    frame #8: 0x0000000106140e93 libpetsc.3.07.dylib`hypre_BoomerAMGSetup(amg_vdata=<unavailable>, A=0x00007fe80842aff0, f=0x00007fe80842a980, u=0x00007fe80842a510) + 17699 at par_amg_setup.c:2108 [opt]
    frame #9: 0x0000000105ec773c libpetsc.3.07.dylib`PCSetUp_HYPRE(pc=<unavailable>) + 2540 at hypre.c:226 [opt]
    frame #10: 0x0000000105eea68d libpetsc.3.07.dylib`PCSetUp(pc=0x00007fe805553f50) + 797 at precon.c:968 [opt]
    frame #11: 0x0000000105ee9fe5 libpetsc.3.07.dylib`PCApply(pc=0x00007fe805553f50, x=0x00007fe80052d420, y=0x00007fe800522c20) + 181 at precon.c:478 [opt]
    frame #12: 0x00000001015cf218 libmesh_opt.0.dylib`libMesh::PetscPreconditioner<double>::apply(libMesh::NumericVector<double> const&, libMesh::NumericVector<double>&) + 24
    frame #13: 0x00000001009c7998 libmoose-opt.0.dylib`PhysicsBasedPreconditioner::apply(libMesh::NumericVector<double> const&, libMesh::NumericVector<double>&) + 520
    frame #14: 0x00000001016ad701 libmesh_opt.0.dylib`libmesh_petsc_preconditioner_apply + 129
    frame #15: 0x0000000105e7e715 libpetsc.3.07.dylib`PCApply_Shell(pc=0x00007fe8052623f0, x=0x00007fe806805a20, y=0x00007fe806805420) + 117 at shellpc.c:123 [opt]
    frame #16: 0x0000000105eea079 libpetsc.3.07.dylib`PCApply(pc=0x00007fe8052623f0, x=0x00007fe806805a20, y=0x00007fe806805420) + 329 at precon.c:482 [opt]
    frame #17: 0x0000000105eeb611 libpetsc.3.07.dylib`PCApplyBAorAB(pc=0x00007fe8052623f0, side=PC_RIGHT, x=0x00007fe806805a20, y=0x00007fe806806020, work=0x00007fe806805420) + 945 at precon.c:714 [opt]
    frame #18: 0x0000000105f31658 libpetsc.3.07.dylib`KSPGMRESCycle [inlined] KSP_PCApplyBAorAB(ksp=0x00007fe806022220, x=<unavailable>, y=0x00007fe806806020, w=<unavailable>) + 191 at kspimpl.h:295 [opt]
    frame #19: 0x0000000105f31599 libpetsc.3.07.dylib`KSPGMRESCycle(itcount=<unavailable>, ksp=<unavailable>) + 553 at gmres.c:156 [opt]
    frame #20: 0x0000000105f326bd libpetsc.3.07.dylib`KSPSolve_GMRES(ksp=<unavailable>) + 221 at gmres.c:240 [opt]
    frame #21: 0x0000000105f5f671 libpetsc.3.07.dylib`KSPSolve(ksp=0x00007fe806022220, b=0x00007fe7fd946220, x=<unavailable>) + 1345 at itfunc.c:677 [opt]
    frame #22: 0x0000000105fd0251 libpetsc.3.07.dylib`SNESSolve_NEWTONLS(snes=<unavailable>) + 1425 at ls.c:230 [opt]
    frame #23: 0x0000000105fa10ca libpetsc.3.07.dylib`SNESSolve(snes=<unavailable>, b=<unavailable>, x=0x00007fe7fd865e20) + 858 at snes.c:4128 [opt]
    frame #24: 0x00000001016b63c3 libmesh_opt.0.dylib`libMesh::PetscNonlinearSolver<double>::solve(libMesh::SparseMatrix<double>&, libMesh::NumericVector<double>&, libMesh::NumericVector<double>&, double, unsigned int) + 835
    frame #25: 0x00000001016fc244 libmesh_opt.0.dylib`libMesh::NonlinearImplicitSystem::solve() + 324
    frame #26: 0x0000000100a71dc8 libmoose-opt.0.dylib`NonlinearSystem::solve() + 472
    frame #27: 0x00000001009fe815 libmoose-opt.0.dylib`FEProblemBase::solve() + 117
    frame #28: 0x0000000100761fba libmoose-opt.0.dylib`Steady::execute() + 266
    frame #29: 0x0000000100b78ac3 libmoose-opt.0.dylib`MooseApp::run() + 259
    frame #30: 0x00000001003843aa moose_test-opt`main + 122
    frame #31: 0x00007fffb2f3e235 libdyld.dylib`start + 1

Fande,


On Mon, Apr 2, 2018 at 4:02 PM, Stefano Zampini <stefano.zampini at gmail.com>
wrote:

> Maybe this will fix it?
>
>
> diff --git a/src/ksp/pc/impls/hypre/hypre.c b/src/ksp/pc/impls/hypre/hypre.c
> index 28addcf533..6a756d4c57 100644
> --- a/src/ksp/pc/impls/hypre/hypre.c
> +++ b/src/ksp/pc/impls/hypre/hypre.c
> @@ -142,8 +142,7 @@ static PetscErrorCode PCSetUp_HYPRE(PC pc)
>
>    ierr = PetscObjectTypeCompare((PetscObject)pc->pmat,MATHYPRE,&ishypre);CHKERRQ(ierr);
>    if (!ishypre) {
> -    ierr = MatDestroy(&jac->hpmat);CHKERRQ(ierr);
> -    ierr = MatConvert(pc->pmat,MATHYPRE,MAT_INITIAL_MATRIX,&jac->hpmat);CHKERRQ(ierr);
> +    ierr = MatConvert(pc->pmat,MATHYPRE,jac->hpmat ? MAT_REUSE_MATRIX : MAT_INITIAL_MATRIX,&jac->hpmat);CHKERRQ(ierr);
>    } else {
>      ierr = PetscObjectReference((PetscObject)pc->pmat);CHKERRQ(ierr);
>      ierr = MatDestroy(&jac->hpmat);CHKERRQ(ierr);
>
>
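For reference, here is a minimal sketch of the convert-with-reuse pattern the diff above applies. The wrapper and its name are hypothetical, not PETSc source: the point is that passing MAT_REUSE_MATRIX when a converted MATHYPRE matrix already exists refills that matrix instead of destroying and recreating it, which presumably also avoids building a fresh hypre-side communicator on every PCSetUp.

#include <petscmat.h>

/* Hypothetical helper sketching the reuse pattern from the patch above:
 * convert the preconditioning matrix to MATHYPRE once, then refill the
 * already-converted matrix on later setups instead of destroying and
 * recreating it each time. The caller starts with *hpmat == NULL. */
static PetscErrorCode ConvertPmatToHypre(Mat pmat, Mat *hpmat)
{
  PetscErrorCode ierr;

  PetscFunctionBeginUser;
  ierr = MatConvert(pmat, MATHYPRE,
                    *hpmat ? MAT_REUSE_MATRIX : MAT_INITIAL_MATRIX,
                    hpmat);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}

Whether reusing the converted matrix is enough to stop the communicator growth is exactly what the suggested patch is meant to test.
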
> 2018-04-02 23:46 GMT+02:00 Kong, Fande <fande.kong at inl.gov>:
>
>> Hi All,
>>
>> I am trying to upgrade PETSc from 3.7.6 to 3.8.3 for MOOSE and its
>> applications. I get an error message for a standard test:
>>
>> preconditioners/pbp.lots_of_variables: MPI had an error
>> preconditioners/pbp.lots_of_variables: ------------------------------------------------
>> preconditioners/pbp.lots_of_variables: Other MPI error, error stack:
>> preconditioners/pbp.lots_of_variables: PMPI_Comm_dup(177)..................: MPI_Comm_dup(comm=0x84000001, new_comm=0x97d1068) failed
>> preconditioners/pbp.lots_of_variables: PMPI_Comm_dup(162)..................:
>> preconditioners/pbp.lots_of_variables: MPIR_Comm_dup_impl(57)..............:
>> preconditioners/pbp.lots_of_variables: MPIR_Comm_copy(739).................:
>> preconditioners/pbp.lots_of_variables: MPIR_Get_contextid_sparse_group(614): Too many communicators (0/2048 free on this process; ignore_id=0)
>>
>>
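As an aside (an illustration added here, not part of the original report): the "Too many communicators (0/2048 free on this process)" message is MPICH running out of communicator context ids. The same failure can be reproduced by a plain MPI program that keeps duplicating a communicator without ever freeing the duplicates; a minimal sketch, assuming MPICH's default pool of 2048 context ids per process:

#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
  MPI_Init(&argc, &argv);
  /* Ask for error codes instead of the default abort-on-error behaviour. */
  MPI_Comm_set_errhandler(MPI_COMM_WORLD, MPI_ERRORS_RETURN);
  for (int i = 0; i < 100000; i++) {
    MPI_Comm dup;
    if (MPI_Comm_dup(MPI_COMM_WORLD, &dup) != MPI_SUCCESS) {
      fprintf(stderr, "MPI_Comm_dup failed after %d leaked duplicates\n", i);
      break;
    }
    /* Deliberately no MPI_Comm_free(&dup): the leak is the point. */
  }
  MPI_Finalize();
  return 0;
}

In the report above the duplicates are not in user code; they appear to accumulate inside the library as each setup of the hypre preconditioner creates a new internal communicator.
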
>> I did "git bisect", and the following commit introduced this issue:
>>
>>
>> commit 49a781f5cee36db85e8d5b951eec29f10ac13593
>> Author: Stefano Zampini <stefano.zampini at gmail.com>
>> Date:   Sat Nov 5 20:15:19 2016 +0300
>>
>>     PCHYPRE: use internal Mat of type MatHYPRE
>>
>>     hpmat already stores two HYPRE vectors
>>
>> Before I debug line by line, does anyone have a clue about this?
>>
>>
>> Fande,
>>
>
>
>
> --
> Stefano
>