[petsc-users] CG+GAMG convergence issues in GHEP Krylov-Schur for some MPI runs
Mark Adams
mfadams at lbl.gov
Wed Nov 11 07:36:46 CST 2015
On Tue, Nov 10, 2015 at 11:15 AM, Barry Smith <bsmith at mcs.anl.gov> wrote:
>
> Please send me the full output. This is nuts and should be reported to
> NERSC as something to be fixed once we understand it better. When I pay $60
> million in taxes to a computing center, I expect something that works fine
> for free on my laptop to work there as well.
>
> Barry
>
> > On Nov 10, 2015, at 7:51 AM, Mark Adams <mfadams at lbl.gov> wrote:
> >
> > > I ran an 8-processor job on Edison of a small code for a short run (just
> > > a linear solve) and got 37 MB of output!
> >
> > > Here is a grep for 'Petsc'.
> >
> > > Perhaps we should build an ignore file for things that we believe are
> > > false positives.
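> > >
> > > Valgrind supports exactly that via suppression files. A sketch of what an
> > > entry might look like (the frame names here are only illustrative, not
> > > taken from a real log):
> > >
> > >   {
> > >      mpi_init_false_positive
> > >      Memcheck:Leak
> > >      fun:malloc
> > >      ...
> > >      fun:PMPI_Init
> > >   }
> > >
> > > Valgrind can emit ready-made entries with --gen-suppressions=all, and the
> > > file (name arbitrary) is loaded on later runs with --suppressions=petsc.supp.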
> >
> > On Tue, Nov 3, 2015 at 11:55 AM, Barry Smith <bsmith at mcs.anl.gov> wrote:
> >
> > I am more optimistic about valgrind than Mark. I first try valgrind,
> > and if that fails to be helpful, then I use the debugger. Valgrind has the
> > advantage that it finds the FIRST place that something goes wrong, while
> > the debugger only gets you in late, at the crash itself.
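> >
> > For an MPI run, valgrind goes between mpiexec and the executable; a typical
> > invocation (./ex1 is just a placeholder) is
> >
> >   mpiexec -n 8 valgrind -q --tool=memcheck --num-callers=20 \
> >     --log-file=valgrind.log.%p ./ex1 -malloc off
> >
> > so each process writes its own log file; -malloc off turns off PETSc's own
> > malloc wrapper so that valgrind sees the raw allocations.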
> >
> > Valgrind should not be noisy; if it is, the applications/libraries
> > should be cleaned up so that they are valgrind-clean, and then valgrind
> > becomes useful.
> >
> > Barry
> >
> >
> >
> > > On Nov 3, 2015, at 7:47 AM, Mark Adams <mfadams at lbl.gov> wrote:
> > >
> > > BTW, I think that our advice for a segv should be to use a debugger. DDT or
> > > Totalview, and gdb if need be, will get you right to the source code and
> > > will get 90% of bugs diagnosed. Valgrind is noisy and cumbersome to use,
> > > but can diagnose 90% of the remaining 10%.
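> > >
> > > PETSc can also launch or attach the debugger itself. Assuming a standard
> > > build (./ex1 is a placeholder), either of
> > >
> > >   mpiexec -n 8 ./ex1 -start_in_debugger
> > >   mpiexec -n 8 ./ex1 -on_error_attach_debugger
> > >
> > > should work; the second form only attaches once an error such as a segv
> > > is trapped.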
> > >
> > > On Tue, Nov 3, 2015 at 7:32 AM, Denis Davydov <davydden at gmail.com> wrote:
> > > Hi Jose,
> > >
> > > > On 3 Nov 2015, at 12:20, Jose E. Roman <jroman at dsic.upv.es> wrote:
> > > >
> > > > I am answering the SLEPc-related questions:
> > > > - Having a different number of iterations when changing the number of
> > > > processes is normal: most parallel preconditioners depend on the
> > > > partitioning, so the preconditioned operator itself changes.
> > > The change in iterations I mentioned is for different
> > > preconditioners, but with the same number of MPI processes.
> > >
> > >
> > > > - Yes, if you do not destroy the EPS solver, then the preconditioner
> > > > would be reused.
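> > > > A minimal sketch of what I mean (assuming #include <slepceps.h>, an
> > > > initialized SLEPc, and assembled Mats A and B; error checking omitted):
> > > >
> > > >   EPS eps;
> > > >   EPSCreate(PETSC_COMM_WORLD, &eps);
> > > >   EPSSetOperators(eps, A, B);
> > > >   EPSSetProblemType(eps, EPS_GHEP);
> > > >   EPSSetFromOptions(eps);
> > > >   EPSSolve(eps);      /* first solve builds the ST/KSP/PC */
> > > >   EPSSolve(eps);      /* solving again reuses the setup,
> > > >                          including the preconditioner */
> > > >   EPSDestroy(&eps);   /* only now is the solver freed */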
> > > >
> > > > Regarding the segmentation fault, I have no clue. Not sure if this
> > > > is related to GAMG or not. Maybe running under valgrind could provide
> > > > more information.
> > > will try that.
> > >
> > > Denis.
> > >
> >
> >
> > <petsc_val.gz>
>
>