Problems using external packages Spooles and Hypre with PETSc v2.3.1

Stephen R Ball Stephen.R.Ball at awe.co.uk
Tue Jun 13 09:44:54 CDT 2006



Thanks Hong, that appears to have fixed it.

At the very beginning of this post, I also mentioned that I was having some
problems using Hypre with PETSc v2.3.1.
I'll repeat my message below:


I am using Hypre v1.10.0b downloaded from the LLNL site and built separately
from PETSc. When I have been using Pilut and Parasails as either global
preconditioners or as sub-preconditioners for ASM or BJACOBI, I have
been getting errors indicating that there could be a memory access out of
range. I suspect that there may be a memory overwrite or mismatch occurring
somewhere. Note that BOOMERAMG and EUCLID seem to work OK. Again, these
problems were not apparent when I was using PETSc v2.3.0.
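
For reference, I select these preconditioners through the usual runtime
options; the invocations look roughly like the following (the executable name
and KSP choice are just placeholders, and the option names are the standard
ones as far as I know):

mpirun -np 4 ./my_app -ksp_type gmres -pc_type hypre -pc_hypre_type pilut
mpirun -np 4 ./my_app -ksp_type gmres -pc_type asm -sub_pc_type hypre -sub_pc_hypre_type parasails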

Have there been any significant changes in the PETSc source that handles
Hypre between versions 2.3.0 and 2.3.1 that could be causing these problems?
I don't really want to go back to using PETSc v2.3.0 if I can help it. I
don't know if it makes any difference, but I am using the Fortran interface
to PETSc.


Regards

Stephen



-----Original Message-----
From: Hong Zhang [mailto:hzhang at mcs.anl.gov] 
Sent: 12 June 2006 22:57
To: Stephen R Ball
Cc: petsc-maint at mcs.anl.gov
Subject: EXTERNAL: Re: Problems using external packages Spooles and Hypre
with PETSc v2.3.1


Stephen,

The crash is caused by a bug in spooles (an int overflow).

You can fix your spooles by editing

1) spooles-2.2/I2Ohash/src/util.c:

replace lines 42, 161 and 232
loc  = (loc1*loc2) % hashtable->nlist ;
with
loc  = (int)(((long long)loc1*loc2) % hashtable->nlist) ;
(a small stand-alone sketch of why the cast is needed follows this list)

2) reconfigure your petsc
with '--download-spooles=ifneeded'
and the options you have previously used
(an example invocation is sketched further below)

3) rebuild petsc libraries
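
To see why the cast matters: loc1 and loc2 are ints, so loc1*loc2 is
evaluated in int (32-bit here) arithmetic and can overflow before the modulo
is taken. The key values in the crash backtrace (key1 = key2 = 46355) are
already large enough: 46355*46355 = 2148786025 exceeds INT_MAX (2147483647).
Casting one operand to long long makes the product 64-bit. A minimal
stand-alone illustration (not spooles code; nlist here is just a made-up
table size):

#include <stdio.h>

int main(void)
{
  int nlist = 1009;                 /* stand-in for hashtable->nlist        */
  int loc1  = 46355, loc2 = 46355;  /* values of the size seen in the crash */

  /* old expression: the 32-bit product overflows, and on typical
     compilers it wraps to a negative value, giving a bogus hash slot */
  int bad  = (loc1 * loc2) % nlist;

  /* fixed expression: promote to long long so the product is exact */
  int good = (int)(((long long)loc1 * loc2) % nlist);

  printf("bad slot = %d, good slot = %d\n", bad, good);
  return 0;
}

With the overflowed (typically negative) slot, I2Ohash_insert would index
outside its table, which would explain the crash at util.c:96.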

I've put an updated spooles tarball at the petsc ftp site
(spooles-2.2-June_2006.tar.gz). petsc-dev will enable its automatic download
in a few days.
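
For completeness, the reconfigure in step 2) would be an invocation along the
following lines; the MPI path shown is only a placeholder for whatever options
you configured with before:

./config/configure.py --with-mpi-dir=/path/to/your/mpi --download-spooles=ifneeded <your other previous options>

followed by rebuilding the libraries with make as usual.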

Let us know if you still have trouble.

Thanks for reporting the bug,

Hong

On Mon, 12 Jun 2006, Stephen R Ball wrote:

>
>
> Hong
>
> The copy of Spooles I have was downloaded on or around 26-09-05. I 
> have checked it again and it still works for me with PETSc v2.3.0.
>
> Stephen
>
>
> -----Original Message-----
> From: Hong Zhang [mailto:hzhang at mcs.anl.gov]
> Sent: 10 June 2006 04:21
> To: Stephen Ball
> Cc: petsc-maint at mcs.anl.gov
> Subject: Re: FW: Problems using external packages Spooles and Hypre
> with PETSc v2.3.1
>
>
>
> Stephen,
>
> I tested 'mat' and 'rhs' with PETSc v2.3.0 and spooles installed from
> spooles-2.2-Sept_2005-orig.tar.gz (the oldest spooles tarball for
> petsc download) and the updated tarball.
>
> They all crash within the spooles function:
> #0  0x085dae4b in I2Ohash_insert (hashtable=0xad820d8, key1=46355,
> key2=46355,       ^^^^^^^^^^^^^^
>     value=0xa44fc88) at util.c:96
> #1  0x085d15d8 in FrontMtx_splitUpperMatrices (frontmtx=0x8d6df38, msglvl=0,
>     msgFile=0x0) at split.c:79
> #2  0x085cd94d in FrontMtx_postProcess (frontmtx=0x8d6df38, msglvl=0,
>     msgFile=0x0) at postProcess.c:141
> #3  0x081e901d in MatFactorNumeric_SeqAIJSpooles (A=0x8921010, info=0x8939204,
>     F=0x89391e0) at spooles.c:486
>
> Which version of spooles gives you a successful run on this problem with
> petsc-2.3.0?
>
> Hong
>
>
>
> On Thu, 8 Jun 2006, Stephen Ball wrote:
>
> >
> >
> >
> > Hong
> >
> > I have uploaded the problem to "ftp://ftp.mcs.anl.gov/incoming". The 
> > filename is "spooles_fail_problem.tar.gz". Let me know if you 
> > receive it OK.
> >
> > Stephen
> >
> >
> >
> > -----Original Message-----
> > From: Satish Balay [mailto:balay at mcs.anl.gov]
> > Sent: 06 June 2006 17:34
> > To: petsc-users at mcs.anl.gov
> > Subject: EXTERNAL: RE: EXTERNAL: RE: EXTERNAL: Re: Problems using 
> > external packages Spooles and Hypre with PETSc v2.3.1
> >
> >
> > I think Barry meant to e-mail petsc-maint or hzhang [@mcs.anl.gov]
> >
> > How big is the file? If it is very big [and not suitable for e-mail]
> > then you can upload it to ftp://ftp.mcs.anl.gov/incoming [and send
> > us e-mail with the name of the file]
> >
> > Satish
> >
> > On Tue, 6 Jun 2006, Stephen R Ball wrote:
> >
> > >
> > >
> > > Hi
> > >
> > > Yes, I will post you the problem on CD. Can you give me the full
> > > postal address I should mail it to?
> > >
> > > Regards
> > >
> > > Stephen
> > >
> > >
> > >
> > > -----Original Message-----
> > > From: Barry Smith [mailto:bsmith at mcs.anl.gov]
> > > Sent: 05 June 2006 17:21
> > > To: petsc-users at mcs.anl.gov
> > > Subject: EXTERNAL: RE: EXTERNAL: Re: Problems using external 
> > > packages Spooles and Hypre with PETSc v2.3.1
> > >
> > >
> > >    Stephen,
> > >
> > >     Could you send us or post for us the matrix that causes the 
> > > crash? Send directly to Hong.
> > >
> > >     Thanks,
> > >
> > >      Barry
> > >
> > >
> > > On Mon, 5 Jun 2006, Stephen R Ball wrote:
> > >
> > > >
> > > >
> > > >
> > > >
> > > > Hi
> > > >
> > > > Your example runs OK, as do some of my other problems. But with some
> > > > problems PETSc v2.3.1 fails while PETSc v2.3.0 works. I have
> > > > attached the output from both PETSc versions for one of these
> > > > problems. If you are interested, perhaps we could make
> > > > arrangements for me to send you one of these problems.
> > > >
> > > > Regards
> > > >
> > > > Stephen
> > > >
> > > >
> > > >
> > > >
> > > > -----Original Message-----
> > > > From: Hong Zhang [mailto:hzhang at mcs.anl.gov]
> > > > Sent: 02 June 2006 14:47
> > > > To: petsc-users at mcs.anl.gov
> > > > Subject: EXTERNAL: Re: Problems using external packages Spooles 
> > > > and Hypre with PETSc v2.3.1
> > > >
> > > >
> > > > Stephen,
> > > >
> > > >>
> > > >> I am using Spooles downloaded from the PETSc ftp site and built
> > > >> into PETSC_DIR/externalpackages. When solving a particular
> > > >> problem I am getting the error:
> > > >>
> > > >> MatFactorNumeric_MPIAIJSpooles line 199 src/mat/impls/aij/mpi/spooles/mpispooles.c
> > > >> MatLUFactorNumeric line 2200 src/mat/interface/matrix.c
> > > >> KSPSetUp line 183 src/ksp/ksp/interface/itfunc.c
> > > >> KSPSolve line 305 src/ksp/ksp/interface/itfunc.c
> > > >
> > > > May I have the complete error message after the crash? Can you
> > > > run ~petsc/src/ksp/ksp/examples/tutorials/ex5.c with the following cmd?
> > > >
> > > > mpirun -np 2 ./ex5 -ksp_type preonly -pc_type lu -mat_type aijspooles
> > > >
> > > > Hong
> > > >
> > > >>
> > > >> The problem also occurs in serial.
> > > >>
> > > >> I am using Hypre v1.10.0b downloaded from the LLNL site and built
> > > >> separately from PETSc. When I have been using Pilut and Parasails
> > > >> as either global preconditioners or as sub-preconditioners for
> > > >> ASM or BJACOBI, I have been getting errors indicating that
> > > >> there could be a memory access out of range. I suspect that
> > > >> there may be a memory overwrite or mismatch occurring somewhere.
> > > >> Note that BOOMERAMG and EUCLID seem to work OK. Again these
> > > >> problems were not apparent when I was using PETSc v2.3.0.
> > > >>
> > > >> Have there been any significant changes in the PETSc source 
> > > >> that handles Spooles and Hypre between versions 2.3.0 and 2.3.1 
> > > >> that could be causing these problems? I don't really want to go 
> > > >> back to using PETSc v2.3.0 if I can help it. I don't know if it 
> > > >> makes any difference, but I am using the Fortran interface to 
> > > >> PETSc.
> > > >>
> > > >> Hope you can help.
> > > >>
> > > >> Regards
> > > >>
> > > >> Stephen R. Ball



