[Petsc-trilinos-discussion] Scope and requirements

Jed Brown jedbrown at mcs.anl.gov
Fri Nov 22 11:34:18 CST 2013


"Bartlett, Roscoe A." <bartlettra at ornl.gov> writes:

> In Trilinos, the basic concept is that all objects should print to an
> arbitrary Teuchos::FancyOStream object (which takes any arbitrary
> std::ostream object).  The underlying std::ostream object has an
> abstract stream buffer object that can be overridden to send output
> anywhere.  The class Teuchos::FancyOStream contains some utilities to
> improve the formatting of the output (like adding indentation in a
> nice way, adding prefixes for the procID, etc.) and has other little
> features (but has no real dependence on MPI or anything else).  

Where does it get the procID without MPI?  How does it collate output
from multiple ranks without MPI?

> Does PETSc allow the user to provide a polymorphic object
> (i.e. a struct with void* and a set of function pointers) to allow the
> user to send output anywhere?  

That's what PetscViewer objects are for.
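
For instance, a caller can open an ASCII viewer on a file and have any
PETSc object print through it.  A minimal sketch (function and file
names are illustrative):

  #include <petscksp.h>

  /* Route a solver's output through a caller-chosen PetscViewer, here an
     ASCII viewer opened on a file. */
  PetscErrorCode ViewKSPToFile(KSP ksp,MPI_Comm comm)
  {
    PetscViewer    viewer;
    PetscErrorCode ierr;

    ierr = PetscViewerASCIIOpen(comm,"solver.log",&viewer);CHKERRQ(ierr);
    ierr = KSPView(ksp,viewer);CHKERRQ(ierr); /* object prints via the viewer */
    ierr = PetscViewerDestroy(&viewer);CHKERRQ(ierr);
    return 0;
  }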

> You would provide a virtual version of printf(...), basically for
> PETSc objects to use.  In the standard use case it would just be a
> fall-through call to printf(...).

Users can also override PetscVFPrintf(), but that is lower level and
discards the object association.  It is common for users to call
PetscPrintf() directly, but PETSc library code never does.
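
A hedged sketch of such an override; the buffer size and the delivery
step are placeholders for whatever sink the application owns:

  #include <petscsys.h>

  /* Every string PETSc prints is formatted here first and can then be
     delivered anywhere; this version just forwards to the original FILE. */
  static PetscErrorCode MyVFPrintf(FILE *fd,const char format[],va_list argp)
  {
    char           buf[8*1024];
    PetscErrorCode ierr;

    ierr = PetscVSNPrintf(buf,sizeof(buf),format,NULL,argp);CHKERRQ(ierr);
    fputs(buf,fd); /* replace with delivery to a user-defined sink */
    return 0;
  }

  /* Installed once, e.g. just after PetscInitialize():
       PetscVFPrintf = MyVFPrintf;  */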

> The hard part is getting the std::ostream object into the objects that
> you want and telling them how verbose you want them to be.  

Instead of a verbosity "level", we use named monitors for each type of
diagnostic.  Those diagnostics can go to the same place or to different
places.  Depending on how the user wants to multiplex, they might need
to add a monitor from code (i.e., make a function call) rather than
relying on run-time options alone, but most monitors can be configured
entirely with run-time options.
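
Adding a monitor from code might look like the sketch below; the monitor
body is illustrative, and the context is a PetscViewer so the diagnostic
can be pointed anywhere:

  #include <petscksp.h>

  /* Called once per iteration with the residual norm; the void* context
     was supplied at registration time. */
  static PetscErrorCode MyMonitor(KSP ksp,PetscInt it,PetscReal rnorm,void *ctx)
  {
    PetscViewer    viewer = (PetscViewer)ctx;
    PetscErrorCode ierr;

    ierr = PetscViewerASCIIPrintf(viewer,"iter %D: rnorm %g\n",it,(double)rnorm);CHKERRQ(ierr);
    return 0;
  }

  /* Registration: KSPMonitorSet(ksp,MyMonitor,(void*)viewer,NULL); */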

> For a little more info on this approach, see GCG 16 "Always send
> output to some general std::ostream object; Never send output directly
> to std::cout or std::cerr; Never print output with print(...) or
> printf(...)" in:
>
>     http://web.ornl.gov/~8vt/TrilinosCodingDocGuidelines.pdf

All output from the library is performed via PetscViewer objects.  Most
diagnostics are provided by monitors, each of which has a
(reference-counted, possibly-shared) PetscViewer.  Monitors are not part
of the solver objects themselves, and you can register any number of
monitors.

> It would be a huge amount of work to find a way to create useful,
> comprehensible output from a multi-physics code sent directly to one
> stream (i.e. STDOUT).  In the case of CASL VERA Tiamat, the different
> physics codes actually run in parallel with each other in a block
> Jacobi black-box solve, so even the output from the root rank of each
> physics would be (and currently is) jumbled together.

Each physics component should set a different prefix.  For linear
solvers, this might look like:

  KSPSetOptionsPrefix(ksp_phys1,"phys1_");
  KSPSetOptionsPrefix(ksp_phys2,"phys2_");

(the prefix is given without a leading hyphen; the hyphen is added
automatically to the run-time options)

The monitors can then be directed to separate files, e.g.,

  -phys1_ksp_monitor phys1.log
  -phys2_ksp_monitor_true_residual phys2.log

I wrote this in my earlier email, so maybe something wasn't clear there?
If you want to do something much more flexible, the caller can register
their own monitors.

> I suspect that for complex multi-physics APPs like CASL VERA Tiamat,
> the best overall approach would be to get each physics module to send
> all of its output to independent files and then just print a nice
> short summary/table to STDOUT.  That is, separate files for COBRA-TF,
> MPACT, Insilico, and MOOSE/Peregrine (and in the future MAMBA) would
> be used that will be written to on the root process of the cluster for
> each of these modules.  Even this will be hard to implement because,
> for example, one would need to set up MOOSE/Peregrine to redirect all
> of its objects and code that it calls to output to a single
> std::ostream object which is given to it by Tiamat (which in this case
> will actually be an std::ofstream object).  This std::ostream object
> needs to be created by the driver Tiamat and passed into
> MOOSE/Peregrine, and then MOOSE/Peregrine needs to find a way to make
> *all* output created on its behalf go to that std::ostream object,
> including output from all the PETSc objects it creates and calls.

If you want this coarser-grained approach, the easiest way is to use
PetscViewerASCIIGetStdout(comm_physicsX,&viewer) and configure the
viewer to send output wherever you like.  This viewer is stored as an
attribute on the communicator and serves as the default PetscViewer
when one is not provided explicitly.
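
A sketch of that coarser-grained setup; re-targeting the default viewer
with PetscViewerFileSetName() is an assumption about the simplest
configuration, not the only way to do it:

  #include <petscviewer.h>

  /* Fetch the default viewer attached to a physics module's communicator
     and point it at a file instead of stdout. */
  PetscErrorCode RedirectDefaultOutput(MPI_Comm comm_physicsX)
  {
    PetscViewer    viewer;
    PetscErrorCode ierr;

    ierr = PetscViewerASCIIGetStdout(comm_physicsX,&viewer);CHKERRQ(ierr);
    ierr = PetscViewerFileSetName(viewer,"physicsX.log");CHKERRQ(ierr);
    return 0;
  }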

> That means that PETSc needs to allow users to provide an arbitrary
> output object that could in turn send its output to the right
> std::ostream object.  The same goes for Fortran code creating output
> (but would be harder to set up obviously).  

Fortran callers can use the same functions, though the Fortran binding
of PetscViewerASCIIPrintf() is not variadic, so the caller must build
the formatted string first.

> Just getting these Fortran codes to redirect their output to a general
> function that we could override to send output to the underlying
> std::ostream object for their physics code will be a challenge in
> itself.  But since we control a big part of the funding for all the
> Fortran codes, that would be possible.  MOOSE would be a different
> story because we have no control over MOOSE.  Therefore, MOOSE might
> be a lost cause and we might just have to turn off all of its output
> if we want a clean STDOUT.

MOOSE and libMesh are developed by responsible people, so there is no
need to control their funding; just explain the situation rationally.

> 1) A C++ driver code sets up std::ostream objects and all Trilinos and
> PETSc objects that are created redirect output to those objects.

Create a PetscViewer that behaves in this way or (more crudely) override
PetscVFPrintf(), as sketched earlier.

> 2) A C driver code that calls Trilinos and PETSc (perhaps indirectly)
> redirects all output through printf(...), sprintf(...), fprintf(...)
> or some other function defined on the C side.

Does Trilinos have a C interface for this?

> 3) A Fortran code that calls Trilinos and PETSc (perhaps indirectly)
> redirects all output to some arbitrary Fortran print facilities.

This currently requires a C stub that calls back into Fortran, though a
native Fortran interface could be added if someone seriously wants to
manage this from Fortran.
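
A hypothetical sketch of such a stub, building on the PetscVFPrintf()
override above; the Fortran symbol name and its hidden string-length
argument are assumptions about a typical compiler's mangling:

  #include <petscsys.h>
  #include <string.h>

  /* Assumed Fortran subroutine, e.g. "subroutine fortranwrite(str)". */
  extern void fortranwrite_(const char *str,int len);

  /* C stub: format the string on the C side, then hand it to Fortran. */
  static PetscErrorCode FortranVFPrintf(FILE *fd,const char format[],va_list argp)
  {
    char           buf[8*1024];
    PetscErrorCode ierr;

    ierr = PetscVSNPrintf(buf,sizeof(buf),format,NULL,argp);CHKERRQ(ierr);
    fortranwrite_(buf,(int)strlen(buf));
    return 0;
  }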

> However, unless someone is willing to set up and support the
> infrastructure to maintain the above examples with at least the
> main-line development versions of PETSc and Trilinos, there is no
> point in doing anything because it will be lost.  That is the first
> issue to address.  DOE program managers say that they may want better
> coordination but are they willing to pay for it?  CASL can get by as
> described above with the status quo (which is a mess) and my guess is
> that they would not want to invest a lot of money in Trilinos-PETSc
> compatibility, even just the output clean-up.  Who is going to
> actually pay for this which includes the infrastructure to maintain
> it?  Again, if it is not worth maintaining and providing a reasonable
> delivery mechanism to users, it is not worth implementing in the first
> place.  

It sounds like everything, or nearly everything, is already in place,
so output may not need dedicated funding.  Testing is a concern, but
that is a matter of having a --download-teuchos option for one of our
build machines and/or mocking the other interface.

I agree that any larger interoperability effort will need funding, since
it is clearly not "basic research".

> That is where the lifecycle issues including regulated backward
> compatibility become critical.  

I don't think the relevant interfaces have changed significantly since
the 1990s.