[petsc-users] petsc externalpackage directory

Satish Balay balay at mcs.anl.gov
Tue Feb 2 22:30:59 CST 2016


On Tue, 2 Feb 2016, Barry Smith wrote:

> 
> > On Feb 2, 2016, at 10:01 PM, Jed Brown <jed at jedbrown.org> wrote:
> > 
> > Barry Smith <bsmith at mcs.anl.gov> writes:
> > 
> >>> On Feb 2, 2016, at 7:00 PM, Jed Brown <jed at jedbrown.org> wrote:
> >>> 
> >>> Satish Balay <balay at mcs.anl.gov> writes:
> >>>> Or you can copy the whole '$PETSC_ARCH/externalpackages' dir over to
> >>>> this other machine [and place in the same location - for any
> >>>> PETSC_ARCH you plan to use..]
> >>> 
> >>> This sounds unreliable.
> >> 
> >>  In what way? 
> > 
> > You'll be building in a dirty directory (one that was used for a
> > different PETSC_ARCH).  If the project doesn't manage their dependencies
> > perfectly, it could use a build artifact from a different PETSC_ARCH,
> > which would result in linking errors (for example).
> 
>   You are right, this could be an issue

Well, my recommendation was for this use case of "same install on a different machine".

And currently we do kind of support repeated in-place installs of the same
package.

[Otherwise configure would have to delete all sources after each build and
re-download them every time; we don't do that.]

And even if PETSC_ARCH1/externalpackages is copied over to
PETSC_ARCH2/externalpackages, builds should work. The primary reason
we moved externalpackages from PETSC_DIR/externalpackages to
PETSC_DIR/PETSC_ARCH/externalpackages was to support concurrent builds
in these locations.

The issues you raise come about when using incompatible versions of
packages. That can happen when switching git branches, or rebuilding
after a 'git pull' that changed package dependencies. But that's not
the use case, as I understood it.

Sure, keeping track of all the tarballs for a given PETSc version is
a better approach.

> 
> >  I'd rather cache
> > the tarballs (with checksum) or use git repositories (which can be
> > reliably cleaned)

I sure would like a common cache [for any build/arch/version] for both
tarballs and git repos, so network traffic can be cut down [say, in
~/.local/petsc/cache].
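
To make the idea concrete, here is a minimal sketch of what such a cache
lookup could look like. This is purely hypothetical, not anything in
configure today; the cache location, the fetch_tarball() helper, and its
arguments are all made up for illustration:

    # Hypothetical sketch only - no such code exists in PETSc configure.
    # The cache location and helper names are invented for illustration.
    import hashlib
    import os
    import shutil
    import urllib.request   # Python 3 naming

    CACHE_DIR = os.path.expanduser('~/.local/petsc/cache')   # assumed location

    def _sha256(path):
        h = hashlib.sha256()
        with open(path, 'rb') as f:
            for chunk in iter(lambda: f.read(65536), b''):
                h.update(chunk)
        return h.hexdigest()

    def fetch_tarball(url, sha256, destdir):
        """Return a path to the tarball in destdir, downloading only on a cache miss."""
        os.makedirs(CACHE_DIR, exist_ok=True)
        cached = os.path.join(CACHE_DIR, os.path.basename(url))
        if not os.path.isfile(cached) or _sha256(cached) != sha256:
            # cache miss (or stale/corrupt entry): download into the shared cache
            urllib.request.urlretrieve(url, cached)
            if _sha256(cached) != sha256:
                raise RuntimeError('checksum mismatch for ' + url)
        # hand a copy to the per-PETSC_ARCH externalpackages build directory
        os.makedirs(destdir, exist_ok=True)
        return shutil.copy(cached, destdir)

With something like that, each tarball would hit the network only once
across builds/arches/versions, and the checksum catches a corrupt cache entry.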

> 
>   Ok then you write the code to do this. It doesn't currently exist (and I am not sure we want to be in the business of writing a full package management system :-).
> 
>    Anybody who thinks that a computer system that doesn't allow direct downloads of open source tarballs or git repositories (because of security considerations) but allows people to indirectly download those exact same tarballs or git repositories (as if that indirect download somehow magically cleanses the tarballs or repositories) is any safer is a damn fool anyway.

This is an easy way to cut down sysadmin overhead [or system
complexity/downtime] for special-use systems. For example, JLSE has
various types of hardware. It's not easy to make sure all of them are
secure enough for unrestricted network traffic, so all of them [except
the front-end general-use Linux boxes] are firewalled off.

Other systems provide proxies [to monitor all such traffic]. This is
not easy to use via configure [even though Python's urllib supports
proxies].
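
For reference, a minimal sketch of what urllib proxy support looks like
(Python 3 naming here; the proxy address and download URL are placeholders):

    # Sketch of downloading through an HTTP proxy with urllib (Python 3 API).
    # 'proxy.example.org:3128' and the tarball URL are placeholders.
    import urllib.request

    proxy = urllib.request.ProxyHandler({
        'http':  'http://proxy.example.org:3128',
        'https': 'http://proxy.example.org:3128',
    })
    urllib.request.install_opener(urllib.request.build_opener(proxy))

    # subsequent downloads are routed through the proxy
    urllib.request.urlretrieve('http://example.org/somepackage.tar.gz',
                               'somepackage.tar.gz')

urllib also picks up the http_proxy/https_proxy environment variables by
default, so in principle configure could just inherit the site proxy
settings from the environment.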

Satish

> 
>   Barry
> 
> > than simply build in dirty directories that cannot be
> > cleaned.
> 
> 


