[petsc-users] (no subject)

Matthew Knepley knepley at gmail.com
Thu Jun 2 11:15:08 CDT 2016


On Thu, Jun 2, 2016 at 11:10 AM, neok m4700 <neok.m4700 at gmail.com> wrote:

> Hi Satish,
>
> Thanks for the correction.
>
> The error message is now slightly different, but the result is the same
> (serial runs fine; parallel with mpirun fails with the following error):
>

Now the error is correct: you are asking to run ICC in parallel, which we
do not support. The message points you to the table of available solvers.
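
For example, the usual parallel workaround is to wrap ICC in block Jacobi,
so each rank factors only its own diagonal block. A minimal petsc4py sketch
(the assembled matrix A and right-hand side b are assumptions here, not
taken from your script):

    from petsc4py import PETSc

    # Assumption: A is an assembled (MPI)AIJ matrix and b a matching Vec.
    ksp = PETSc.KSP().create(PETSc.COMM_WORLD)
    ksp.setOperators(A)
    pc = ksp.getPC()
    pc.setType(PETSc.PC.Type.BJACOBI)  # parallel outer preconditioner
    ksp.setFromOptions()               # picks up e.g. -sub_pc_type icc
    x = b.duplicate()
    ksp.solve(b, x)

Equivalently, without changing the script (assuming it calls
setFromOptions):

$ mpiexec -np 2 python test_mat_ksp.py -pc_type bjacobi -sub_pc_type icc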

  Thanks,

    Matt


> [0] KSPSolve() line 599 in <...>/src/ksp/ksp/interface/itfunc.c
> [0] KSPSetUp() line 390 in <...>/src/ksp/ksp/interface/itfunc.c
> [0] PCSetUp() line 968 in <...>/src/ksp/pc/interface/precon.c
> [0] PCSetUp_ICC() line 21 in <...>/src/ksp/pc/impls/factor/icc/icc.c
> [0] MatGetFactor() line 4291 in <...>/src/mat/interface/matrix.c
> [0] See http://www.mcs.anl.gov/petsc/documentation/linearsolvertable.html
> for possible LU and Cholesky solvers
> [0] Could not locate a solver package. Perhaps you must ./configure with
> --download-<package>
>
>
>
>
> 2016-06-02 17:58 GMT+02:00 Satish Balay <balay at mcs.anl.gov>:
>
>> with petsc-master - you would have to use petsc4py-master.
>>
>> i.e. try petsc-eab7b92 with petsc4py-6e8e093
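>>
>> For example (a sketch; the petsc4py repository URL is an assumption, by
>> analogy with the petsc Bitbucket link quoted below):
>>
>>   $ git clone https://bitbucket.org/petsc/petsc && git -C petsc checkout eab7b92
>>   $ git clone https://bitbucket.org/petsc/petsc4py && git -C petsc4py checkout 6e8e093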
>>
>> Satish
>>
>>
>> On Thu, 2 Jun 2016, neok m4700 wrote:
>>
>> > Hi Matthew,
>> >
>> > I've rebuilt petsc // petsc4py with the following versions:
>> >
>> > 3.7.0 // 3.7.0 => same runtime error
>> > 00c67f3 // 3.7.1 => fails to build petsc4py (error below)
>> > 00c67f3 // 6e8e093 => same as above
>> > f1b0812 (latest commit) // 6e8e093 (latest commit) => same as above
>> >
>> > In file included from src/PETSc.c:3:0:
>> > src/petsc4py.PETSc.c: In function
>> > ‘__pyx_pf_8petsc4py_5PETSc_6DMPlex_4createBoxMesh’:
>> > src/petsc4py.PETSc.c:214629:112: error: incompatible type for argument 4 of
>> > ‘DMPlexCreateBoxMesh’
>> >    __pyx_t_4 = __pyx_f_8petsc4py_5PETSc_CHKERR(DMPlexCreateBoxMesh(__pyx_v_ccomm,
>> >        __pyx_v_cdim, __pyx_v_interp, (&__pyx_v_newdm)));
>> >    if (unlikely(__pyx_t_4 == -1)) __PYX_ERR(42, 49, __pyx_L1_error)
>> >
>> > using
>> > - numpy 1.11.0
>> > - openblas 0.2.18
>> > - openmpi 1.10.2
>> >
>> > Thanks
>> >
>> > 2016-06-02 16:39 GMT+02:00 Matthew Knepley <knepley at gmail.com>:
>> >
>> > > On Thu, Jun 2, 2016 at 9:12 AM, neok m4700 <neok.m4700 at gmail.com> wrote:
>> > >
>> > >> Hi,
>> > >>
>> > >> I built petsc 3.7.1 and petsc4py 3.7.0 (with openmpi 1.10.2) and ran the
>> > >> examples in the demo directory.
>> > >>
>> > >
>> > > I believe this was fixed in 'master':
>> > >
>> > > https://bitbucket.org/petsc/petsc/commits/00c67f3b09c0bcda06af5ed306d845d9138e5003
>> > >
>> > > Is it possible to try this?
>> > >
>> > >   Thanks,
>> > >
>> > >     Matt
>> > >
>> > >
>> > >> $ python test_mat_ksp.py
>> > >> => runs as expected (serial)
>> > >>
>> > >> $ mpiexec -np 2 python test_mat_ksp.py
>> > >> => fails with the following output:
>> > >>
>> > >> Traceback (most recent call last):
>> > >>   File "<...>/demo/kspsolve/test_mat_ksp.py", line 15, in <module>
>> > >>     execfile('petsc-ksp.py')
>> > >>   File "<...>/demo/kspsolve/test_mat_ksp.py", line 6, in execfile
>> > >>     try: exec(fh.read()+"\n", globals, locals)
>> > >>   File "<string>", line 15, in <module>
>> > >>   File "PETSc/KSP.pyx", line 384, in petsc4py.PETSc.KSP.solve
>> > >> (src/petsc4py.PETSc.c:153555)
>> > >> petsc4py.PETSc.Error: error code 92
>> > >> [0] KSPSolve() line 599 in <...>/src/ksp/ksp/interface/itfunc.c
>> > >> [0] KSPSetUp() line 390 in <...>/src/ksp/ksp/interface/itfunc.c
>> > >> [0] PCSetUp() line 968 in <...>/src/ksp/pc/interface/precon.c
>> > >> [0] PCSetUp_ICC() line 21 in <...>/src/ksp/pc/impls/factor/icc/icc.c
>> > >> [0] MatGetFactor() line 4240 in <...>/src/mat/interface/matrix.c
>> > >> [0] You cannot overwrite this option since that will conflict with
>> > >> other previously set options
>> > >> [0] Could not locate solver package (null). Perhaps you must
>> > >> ./configure with --download-(null)
>> > >>
>> > >> <...>
>> > >> -------------------------------------------------------
>> > >> Primary job  terminated normally, but 1 process returned
>> > >> a non-zero exit code.. Per user-direction, the job has been aborted.
>> > >> -------------------------------------------------------
>> > >>
>> > >> --------------------------------------------------------------------------
>> > >> mpirun detected that one or more processes exited with non-zero status,
>> > >> thus causing the job to be terminated. The first process to do so was:
>> > >>
>> > >>   Process name: [[23110,1],0]
>> > >>   Exit code:    1
>> > >>
>> > >> --------------------------------------------------------------------------
>> > >>
>> > >>
>> > >> What have I done wrong?
>> > >>
>> > >>
>> > >>
>> > >>
>> > >
>> > >
>> > > --
>> > > What most experimenters take for granted before they begin their
>> > > experiments is infinitely more interesting than any results to which
>> > > their experiments lead.
>> > > -- Norbert Wiener
>> > >
>> >
>>
>
>


-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

