[petsc-users] PETSc and AMPI

Barry Smith bsmith at mcs.anl.gov
Sun Feb 1 12:07:41 CST 2015


   It is inside the Charm++ download.

   From README.ampi

Porting to AMPI
---------------
Global and static variables are unusable in virtualized AMPI programs, because
a separate copy would be needed for each VP. Therefore, to run with more than
1 VP per processor, all globals and statics must be modified to use local 
storage.
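
A minimal sketch (not from README.ampi) of the kind of change this requires:
a file-scope global is moved into a per-VP struct that is passed explicitly,
so each virtual processor works on its own copy. The names SolverState and
do_iteration are hypothetical, for illustration only.

    /* Before: a file-scope global, shared by every VP on a processor. */
    /* int iteration_count;   -- unusable with more than 1 VP           */

    /* After: former globals live in a struct allocated per VP. */
    #include <stdlib.h>

    typedef struct {
      int iteration_count;          /* was a global */
    } SolverState;

    static void do_iteration(SolverState *s) {
      s->iteration_count++;         /* touches only this VP's copy */
    }

    int main(int argc, char **argv) {
      SolverState *s = (SolverState *)calloc(1, sizeof(*s));
      do_iteration(s);
      free(s);
      return 0;
    }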

   Barry


> On Feb 1, 2015, at 10:57 AM, Matthew Knepley <knepley at gmail.com> wrote:
> 
> On Sat, Jan 31, 2015 at 10:04 PM, Jed Brown <jed at jedbrown.org> wrote:
> Matthew Knepley <knepley at gmail.com> writes:
> 
> > On Sat, Jan 31, 2015 at 4:19 PM, Satish Balay <balay at mcs.anl.gov> wrote:
> >
> >> Dan,
> >>
> >> I'm forwarding this e-mail to petsc-users. I'm not familiar with
> >> Charm++ or AMPI - but others might have suggestions.
> >>
> >> Also - if you can send us the configure.log from the failure with AMPI - we
> >> can look at it - and see if there is an issue on the PETSc side.
> >>
> >
> > Also, I cannot find the download for AMPI. Can you mail it so we can try it
> > here?
> 
> http://charm.cs.uiuc.edu/research/ampi/
> 
> Barry experimented with this a while back.  It is not currently
> supported and my understanding is that PETSc would need public API
> changes to support AMPI.  This might be possible as part of the
> thread-safety work.
> 
> I went there before, but there is no download link.
> 
> I know Barry did this before, but now they are telling everyone that it is an MPI
> implementation.
> 
>    Matt
> 
> -- 
> What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
> -- Norbert Wiener
