[petsc-users] DMPlex: a Mapping Array between Natural and Distributed Cell Index

Matthew Knepley knepley at gmail.com
Wed Jul 20 15:24:42 CDT 2022


On Wed, Jul 20, 2022 at 2:27 PM Bora Jeong <boraj1021 at gmail.com> wrote:

> I am not seeing DMPlexDistributeSetDefault(dm,PETSC_FALSE) after you
> create your DM from the GMSH file. Your DM is then automatically
> distributed and the natural ordering won't be a priori the one from your
> file. I don't know how Petsc distributes a DM during the load step from a
> file.
> *-> PetscSFView() gives me the same Natural SF whether or not I have
> DMPlexDistributeSetDefault(dm,PETSC_FALSE) after DMPlexCreateGmshFromFile().
> In addition, in my opinion, if the Natural SF were not defined properly
> due to the distribution, I would see Null from PetscSFView(). This was
> checked previously with Matt's help. *
>
> a) You have here two different orderings and distributions: the one given
> by Petsc after creating your DM from your file (which is distributed), i.e.
> before DMPlexDistribute and which is considered natural, and the one after
> DMPlexDistribute.
>
> *-> This ordering you mentioned (i.e., the ordering made after creating the
> DM from the mesh file but before distribution) is exactly what I want to
> have. The Natural SF can show me its roots and leaves (i.e., graphs), but
> not the actual index written in the mesh file (i.e., running from 1 to the
> total number of vertices, for example). Adding
> DMPlexDistributeSetDefault(dm,PETSC_FALSE) does not really change my
> Natural SF. When you mention "sequential" ordering, does it mean the same
> order written in the mesh file, instead of the graphs (i.e., roots and
> leaves)?*
>
> b) I am not familiar with the overlap. From my understanding, even when
> you pass overlap = 0 to DMPlexDistribute, there is still an "overlap", i.e.
> points shared between procs. For your run on 2 procs, some dofs (5 dofs in
> your case) are on both procs when it comes to the local Vec, but only one
> proc owns these dofs in the global Vec. That's why your natural SF shows 5
> dofs less on proc 0.
>
> *-> Thanks, makes sense. The number of stratum points I get from
> DMPlexGetDepthStratum() for vertices is the size of the local Vec with 1
> dof on each vertex. *
> Are you on the 'release' branch? On 'main', the creation of the natural SF
> raises an error (duplicate leaf entries) when the section includes
> constraint dofs.
>
> *-> I am on the main branch of 3.17.0. Your comment is unclear to me,
> but I cannot see any duplicate leaf entries in this case. There is 1 dof
> on each vertex. *
>
> Based on your comment, I might need to:
> *1) Convert the Natural SF's roots & leaves into the actual vertex index
> written in the mesh file (i.e., from 1 to 25 in my case).*
>

If you define a scalar P1 field of one component, then the dof index will
be the vertex index.
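
For reference, here is a minimal sketch in Fortran of a section carrying one
scalar dof per vertex (a P1-like layout); the variable names are mine, not
from the attached code:

  PetscSection :: section
  PetscInt :: pStart, pEnd, vStart, vEnd, p

  call DMPlexGetChart(dm, pStart, pEnd, ierr);CHKERRA(ierr)
  ! Depth 0 is the vertex stratum
  call DMPlexGetDepthStratum(dm, 0, vStart, vEnd, ierr);CHKERRA(ierr)
  call PetscSectionCreate(PETSC_COMM_WORLD, section, ierr);CHKERRA(ierr)
  call PetscSectionSetChart(section, pStart, pEnd, ierr);CHKERRA(ierr)
  do p = vStart, vEnd - 1
    ! One scalar dof on each vertex, so the dof index tracks the vertex
    call PetscSectionSetDof(section, p, 1, ierr);CHKERRA(ierr)
  end do
  call PetscSectionSetUp(section, ierr);CHKERRA(ierr)
  call DMSetLocalSection(dm, section, ierr);CHKERRA(ierr)
  call PetscSectionDestroy(section, ierr);CHKERRA(ierr)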


> 2) DMPlexNaturalToGlobalBegin() and End() to pass the index from step 1)
> to global Vec
>

I don't think you need this.


> 3) DMGlobalToLocalBegin() and End() to pass the index to local Vec,
> finally.
>

I don't think you need this.

If you look at the SF, it tells you the mapping. However, you could get it
in a roundabout way from a Vec. You could put the index number at each
position (e.g. put 5 at index 5) and then call GlobalToNatural(). This
would tell you which global index corresponds to each natural index.
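
A rough sketch of that trick in Fortran (untested, with illustrative names;
it assumes a DM with DMSetUseNatural() called before distribution and a
section already set):

  Vec :: gvec, nvec
  PetscInt :: rstart, rend, i
  PetscScalar :: val

  call DMCreateGlobalVector(dm, gvec, ierr);CHKERRA(ierr)
  call VecDuplicate(gvec, nvec, ierr);CHKERRA(ierr)
  call VecGetOwnershipRange(gvec, rstart, rend, ierr);CHKERRA(ierr)
  do i = rstart, rend - 1
    ! Put the global index at each position (e.g. put 5 at index 5)
    val = i
    call VecSetValue(gvec, i, val, INSERT_VALUES, ierr);CHKERRA(ierr)
  end do
  call VecAssemblyBegin(gvec, ierr);CHKERRA(ierr)
  call VecAssemblyEnd(gvec, ierr);CHKERRA(ierr)
  call DMPlexGlobalToNaturalBegin(dm, gvec, nvec, ierr);CHKERRA(ierr)
  call DMPlexGlobalToNaturalEnd(dm, gvec, nvec, ierr);CHKERRA(ierr)
  ! Entry j of nvec, in natural order, now holds the global index that
  ! corresponds to natural index j.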

  Thanks,

     Matt


> Does this make sense? My questions so far are actually about step 1). As
> you can see from the example code, I already defined the Natural SF, which
> gives roots and leaves. Then how can I convert those graphs into the actual
> indices written in the mesh file?
>
> Best
>
> On Tue, Jul 19, 2022 at 6:11 PM Alexis Marboeuf <marboeua at mcmaster.ca>
> wrote:
>
>> Hi Bora,
>>
>> Thanks for your file.
>>
>> I am not seeing DMPlexDistributeSetDefault(dm,PETSC_FALSE) after you
>> create your DM from the GMSH file. Your DM is then automatically
>> distributed and the natural ordering won't be a priori the one from your
>> file. I don't know how Petsc distributes a DM during the load step from a
>> file.
>> To address point after point your concerns:
>>
>>    - a) You have here two different orderings and distributions: the one
>>    given by Petsc after creating your DM from your file (which is
>>    distributed), i.e. before DMPlexDistribute and which is considered natural,
>>    and the one after DMPlexDistribute. Your natural SF maps between
>>    these two orderings. It means that you have 10 dofs (resp. 15 dofs) on proc
>>    0 (resp. on proc 1) for both orderings. Nothing looks wrong to me here.
>>    If you call "natural" the ordering from your file, then you should call
>>    DMPlexDistributeSetDefault(dm,PETSC_FALSE) to disable the automatic
>>    distribution by Petsc when you load from the file. It will give you a
>>    "sequential" ordering.
>>    - b) I am not familiar with the overlap. From my understanding, even
>>    when you pass overlap = 0 to DMPlexDistribute, there is still an "overlap",
>>    i.e. points shared between procs. For your run on 2 procs, some dofs (5
>>    dofs in your case) are on both procs when it comes to the local Vec, but
>>    only one proc owns these dofs in the global Vec. That's why your natural SF
>>    shows 5 dofs less on proc 0.
>>    - c) When the storage is contiguous, PetscSFGetGraph gives NULL for
>>    ilocal.
>>
>> I have an additional question about constraint dofs and the natural SF.
>> Are you on the 'release' branch? On 'main', the creation of the natural SF
>> raises an error (duplicate leaf entries) when the section includes
>> constraint dofs. I will submit an MR about that very soon, and it will be
>> merged into main.
>>
>> Does this help?
>>
>> Best
>> -------------------------------------------------------------------
>> Alexis Marboeuf
>> Postdoctoral fellow, Department of Mathematics & Statistics
>> Hamilton Hall room 409B, McMaster University
>> 1280 Main Street West, Hamilton, Ontario L8S 4K1, Canada
>> EMail: marboeua at mcmaster.ca
>> Tel. +1 (905) 525 9140 ext. 27031
>> -------------------------------------------------------------------
>> ------------------------------
>> *From:* Bora Jeong <boraj1021 at gmail.com>
>> *Sent:* July 19, 2022 1:05 PM
>> *To:* Alexis Marboeuf <marboeua at mcmaster.ca>
>> *Cc:* Blaise Bourdin <bourdin at mcmaster.ca>; Matthew Knepley <
>> knepley at gmail.com>; PETSc <petsc-users at mcs.anl.gov>
>> *Subject:* Re: [petsc-users] DMPlex: a Mapping Array between Natural and
>> Distributed Cell Index
>>
>> Alexis,
>> Thanks for the reply with notes. The sample code with a 2d square mesh is
>> attached here. I am not sure whether there is something wrong with the
>> Natural SF or a limitation in my understanding. Hopefully, it is the
>> latter.
>>
>> The attached sample code follows the procedure that you & Matt have
>> suggested (a minimal sketch of this sequence follows the list);
>> - turn on DMSetUseNatural()
>> - define a section with 1 dof on vertices before distribution of the dm
>> - manually distribute the dm using DMPlexDistribute(), instead of relying
>> on automatic distribution (e.g., DMSetFromOptions())
>> - access the graph of the Natural SF via PetscSFGetGraph()
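>>
>> A minimal, untested sketch of that sequence (variable names are mine,
>> the section setup is elided, and declarations follow the SF tutorial
>> ex1f.F90 style):
>>
>>   call DMSetUseNatural(dm, PETSC_TRUE, ierr);CHKERRA(ierr)
>>   ! ... define the section with 1 dof on vertices here ...
>>   call DMPlexDistribute(dm, 0, PETSC_NULL_SF, dmDist, ierr);CHKERRA(ierr)
>>   if (dmDist .ne. PETSC_NULL_DM) then
>>     call DMDestroy(dm, ierr);CHKERRA(ierr)
>>     dm = dmDist
>>   end if
>>   call DMGetNaturalSF(dm, sfNat, ierr);CHKERRA(ierr)
>>   ! ilocal/iremote come back as nleaves-sized F90 pointers owned by the SF
>>   call PetscSFGetGraph(sfNat, nroots, nleaves, ilocal, iremote, ierr);CHKERRA(ierr)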
>>
>> What I want to achieve:
>> 1) define my own local array "A" that maps the local vertex index to the
>> natural order, for example,
>> A(local vertex index) = natural order of that local vertex.
>> Here the "local vertex index" is obtained from DMPlexGetDepthStratum()
>> with zero depth stratum, and mapped onto 1 to the number of vertices that
>> each processor owns, for simplicity.
>> Also, the "natural order" means the same indexing written in the mesh
>> file.
>> *2) Indeed, my inquiry is about converting the vertex stratum from
>> DMPlexGetDepthStratum() to the natural vertex index written in the mesh
>> file. *
>>
>> The problems I have are:
>> a) I am a bit lost interpreting the meaning of the values owned by the
>> Natural SF. In the attached sample square mesh file, there are 25 vertex
>> points. In the attached monitor file, the Natural SF owns 10 roots & leaves
>> for proc=0, and 15 roots & leaves for proc=1. In this case, how can I map
>> these "local" and "remote" arrays to the natural vertex index?
>> b) related to item a), DMPlexGetDepthStratum() says each processor owns
>> 15 vertex points if np=2 without overlap. But the Natural SF gives only 10
>> roots & leaves for proc=0. Is there anything wrong? Or did I misunderstand
>> something?
>> c) the iroots pointer from PetscSFGetGraph() seems to have garbage data.
>> By any chance, is this related to Fortran?
>>
>> Best
>>
>> On Tue, Jul 19, 2022 at 11:01 AM Alexis Marboeuf <marboeua at mcmaster.ca>
>> wrote:
>>
>> I am unfortunately unable to access the attached code.
>>
>> Bora: have you defined constraint dofs in your section? In that case the
>> natural SF is wrong, and you can notice duplicate entries for the roots, if
>> I remember correctly. I'll do an MR about that soon.
>> The natural ordering and distribution is the one implemented by the DM
>> before you call DMPlexDistribute. If your DM is automatically distributed
>> when you load it from your GMSH file, thanks to DMPlexCreateFromFile for
>> example, you're not going to find your file ordering in the natural SF. You
>> may call DMPlexDistributeSetDefault(dm,PETSC_FALSE) to disable the automatic
>> distribution.
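>>
>> E.g. (a sketch; the call goes on the DM before the options/load step that
>> would otherwise distribute it):
>>
>>   call DMPlexDistributeSetDefault(dm, PETSC_FALSE, ierr);CHKERRA(ierr)
>>   ! DMSetFromOptions() will now load the mesh without auto-distributing
>>   call DMSetFromOptions(dm, ierr);CHKERRA(ierr)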
>>
>> Best
>> -------------------------------------------------------------------
>> Alexis Marboeuf
>> Postdoctoral fellow, Department of Mathematics & Statistics
>> Hamilton Hall room 409B, McMaster University
>> 1280 Main Street West, Hamilton, Ontario L8S 4K1, Canada
>> EMail: marboeua at mcmaster.ca
>> Tel. +1 (905) 525 9140 ext. 27031
>> -------------------------------------------------------------------
>> ------------------------------
>> *From:* Blaise Bourdin <bourdin at mcmaster.ca>
>> *Sent:* July 18, 2022 8:52 PM
>> *To:* Bora Jeong <boraj1021 at gmail.com>
>> *Cc:* Matthew Knepley <knepley at gmail.com>; PETSc <petsc-users at mcs.anl.gov>;
>> Alexis Marboeuf <marboeua at mcmaster.ca>
>> *Subject:* Re: [petsc-users] DMPlex: a Mapping Array between Natural and
>> Distributed Cell Index
>>
>> Alexis noticed a problem with the natural SF when constraints are
>> defined. He has an MR coming.
>> @Alexis: Could it be that what Bora sees is exactly what you fixed?
>>
>> Blaise
>>
>> On Jul 18, 2022, at 8:35 PM, Bora Jeong <boraj1021 at gmail.com> wrote:
>>
>> Thank you for the reply.
>>
>> I am actually putting 1 dof on the vertex points.
>> In the attached sample code, from line 63 to line 99, 1 dof is defined on
>> all vertex points for the section. But my problem is, if I print out the
>> Natural SF using the viewer, I only have 10 vertex points, instead of 15.
>> Here 15 is the size of the global vector defined by the aforementioned
>> section. The working example code was attached to the previous email with a
>> small grid.
>>
>> Best
>>
>> On Mon, Jul 18, 2022 at 6:50 PM Matthew Knepley <knepley at gmail.com>
>> wrote:
>>
>> On Mon, Jul 18, 2022 at 4:30 PM Bora Jeong <boraj1021 at gmail.com> wrote:
>>
>> Actually, it seems that my problem is that I do not properly understand
>> what the Natural SF means.
>>
>> What I want to achieve is to extract the natural vertex index from the
>> Natural SF, and make sure this natural vertex index is the same as what is
>> written in a mesh file from Gmsh. The natural vertex index defined by Gmsh
>> is attached as a snapshot with sample code and mesh. I want to reproduce
>> this indexing based on the Natural SF that Petsc defines.
>>
>> I am struggling with that because the number of entries defined in the
>> Natural SF that I can see through PetscSFView() is different from the
>> number of vertex points that each processor actually owns. For example,
>> with two processors and with the attached "square.msh", each processor
>> might have 15 vertex points if I configure Petsc with ParMetis. However,
>> the defined Natural SF for proc=0 only has 10 roots. I expected that, for
>> each processor, the Natural SF would show me 15 entries through
>> PetscSFView(), if I define 1 degree of freedom on each vertex when the
>> Petsc Section is defined.
>>
>> When Petsc defines the Natural SF, are any components omitted because
>> they are trivial to store?
>>
>>
>> The naturalSF maps the field defined by the Section, not the mesh.
>> You can get the effect you are asking for by putting one
>> degree of freedom (dof) on every vertex for the Section. How are you
>> defining your Section now?
>>
>>    Thanks,
>>
>>       Matt
>>
>>
>> Best
>>
>> On Mon, Jul 18, 2022 at 3:50 PM Matthew Knepley <knepley at gmail.com>
>> wrote:
>>
>> On Mon, Jul 18, 2022 at 12:14 PM Bora Jeong <boraj1021 at gmail.com> wrote:
>>
>> Thank you for the corrections. It works.
>>
>> However, I still have a problem.
>> I tested the zero stratum for DMGetStratumIS() with the default depth
>> label, since the graph of the vertices is needed. Then PetscSFGetGraph()
>> from the Natural SF crashes. I checked that pBcCompIS contains a proper set
>> of integers via ISView().
>>
>> On the other hand, it works fine if DMGetStratumIS() is called with the
>> dim stratum, which is the cell stratum.
>> The problem is that if the dim stratum is used, the number of entries in
>> the ileaves(i)% pointer does not match the actual number of vertices that
>> each processor has. That is why I tried the zero stratum in
>> DMGetStratumIS(), instead of the stratum for cells (i.e., "dim"), to see
>> whether the graph changes. Can I get comments?
>>
>>
>> I cannot understand your question. I am not sure what you are using the
>> set of vertices or cells for.
>>
>>   Thanks,
>>
>>      Matt
>>
>>
>> Best
>>
>> On Sun, Jul 17, 2022 at 9:59 AM Matthew Knepley <knepley at gmail.com>
>> wrote:
>>
>> On Fri, Jul 15, 2022 at 7:05 PM Bora Jeong <boraj1021 at gmail.com> wrote:
>>
>> I found that iroots() and ileaves() have size nleaves, and I tested this
>> through the attached sample code with the grid previously shared. Then I
>> get the results written in the attached monitor file.
>>
>> It seems ileaves(i)%rank and ileaves(i)%index at line 127 of the attached
>> code have garbage values, different from the values displayed by
>> PetscSFView() at line 113.
>>
>> It is hard to understand why garbage values are returned from
>> PetscSFGetGraph(). Any comments would be very appreciated.
>>
>>
>> Unfortunately, Fortran is not very good at reporting declaration errors.
>> The problem is that you did not include or use the Vec module. I have done
>> this and your example runs for me. I have included the modified code.
>>
>>   Thanks,
>>
>>      Matt
>>
>>
>> Best
>>
>> On Fri, Jul 15, 2022 at 3:53 PM Matthew Knepley <knepley at gmail.com>
>> wrote:
>>
>> On Fri, Jul 15, 2022 at 2:25 PM Bora Jeong <boraj1021 at gmail.com> wrote:
>>
>> Thank you for the comments. The updated sample code is attached here;
>> DMPlexDistribute() is explicitly called, and by defining a section before
>> mesh distribution, a Natural SF could be defined, as seen from
>> PetscSFView().
>>
>> However, I am still struggling to call PetscSFGetGraph()
>> https://petsc.org/main/docs/manualpages/PetscSF/PetscSFGetGraph/
>> due to the data type definitions of ilocal and iremote. What is the proper
>> size allocation for those two variables? Is this size the number of
>> processors, or the number of vertices?
>>
>>
>> Since we need to pass back arrays, you need to pass in F90 pointers.
>> Here is an example of doing such a thing:
>>
>>
>> https://gitlab.com/petsc/petsc/-/blob/main/src/vec/is/sf/tutorials/ex1f.F90#L94
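>>
>> For reference, the declarations look roughly like this (a sketch following
>> that example; names are illustrative):
>>
>>   PetscInt :: nroots, nleaves
>>   PetscInt, pointer :: ilocal(:)
>>   type(PetscSFNode), pointer :: iremote(:)
>>
>>   ! The arrays are owned by the SF, so nothing is allocated on our side;
>>   ! both pointers have length nleaves, and ilocal may be unset when the
>>   ! leaf storage is contiguous.
>>   call PetscSFGetGraph(sf, nroots, nleaves, ilocal, iremote, ierr);CHKERRA(ierr)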
>>
>>   Thanks,
>>
>>       Matt
>>
>>
>> Best
>>
>> On Fri, Jul 15, 2022 at 9:09 AM Matthew Knepley <knepley at gmail.com>
>> wrote:
>>
>> On Fri, Jul 15, 2022 at 8:46 AM Bora Jeong <boraj1021 at gmail.com> wrote:
>>
>> Okay, I get what's happening. First, this very useful functionality of
>> petsc (the natural-global conversion) needs to be documented in a better
>> way for sure. Currently, it is very difficult to use/follow this feature
>> as an external user. Hope this will move forward in a better way.
>>
>> Then the next question is: if I load/distribute the mesh just like I am
>> doing right now in the sample code, can I assume that my code does not
>> (ever) create a natural SF during the distribution? In other words, can I
>> always get the natural order of each node by simply stacking the preceding
>> processors' numbers of nodes? For example, for proc=0, the natural node ID
>> might be just 1 to nnode_proc_0;
>> for proc=1, it might be {nnode_proc_0 + 1 to nnode_proc_0 + nnode_proc_1},
>> and so on.
>>
>> Does that always make sense in this case?
>>
>>
>> No, but if you call DMPlexDistribute() yourself, rather than having it
>> called automatically by DMSetFromOptions(), you can
>> preserve the mapping:
>>
>>   https://petsc.org/main/docs/manualpages/DMPLEX/DMPlexDistribute/
>>
>> The SF returned maps the original point distribution, which is in the
>> same order as the file, to the redistributed points, which are determined
>> by the mesh partitioner.
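>>
>> In Fortran that looks roughly like this (a sketch with my variable names):
>>
>>   PetscSF :: sfMigration
>>
>>   ! Passing an SF instead of PETSC_NULL_SF returns the migration SF:
>>   ! its roots are points in the original (file) order, and its leaves
>>   ! are the redistributed points.
>>   call DMPlexDistribute(dm, 0, sfMigration, dmDist, ierr);CHKERRA(ierr)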
>>
>>   Thanks,
>>
>>       Matt
>>
>>
>> Best
>>
>>
>> On Fri, Jul 15, 2022 at 8:07 AM Matthew Knepley <knepley at gmail.com>
>> wrote:
>>
>> On Fri, Jul 15, 2022 at 7:17 AM Bora Jeong <boraj1021 at gmail.com> wrote:
>>
>> A sample code for loading a dm and declaring the Natural SF from a sphere
>> mesh is attached here. PetscSFView() returns NULL for sf_nat.
>>
>>
>> The Global-to-Natural mapping relates global dofs to natural dofs. Thus,
>> in order to compute this mapping, the DM has to know
>> about the dof layout before distribution. This means you need to set up a
>> local section before distributing, as we do for example in
>> Plex test ex15. This makes things more complicated since everything
>> cannot be packaged up into DMSetFromOptions(), but we need
>> user input, so I do not see a simpler way to do things.
>>
>>   Thanks,
>>
>>     Matt
>>
>>
>> Best
>>
>> On Fri, Jul 15, 2022 at 6:39 AM Matthew Knepley <knepley at gmail.com>
>> wrote:
>>
>> On Thu, Jul 14, 2022 at 8:25 PM Bora Jeong <boraj1021 at gmail.com> wrote:
>>
>> Okay, I checked it and you are correct. In my case, the natural node
>> index can simply be identified by stacking all the preceding processors'
>> numbers of nodes for a particular processor, which is nice and simple.
>> However, one serious question is: why is this happening in my code? In
>> other words, why is the natural SF not created during the mesh
>> distribution? My code needs to deal with this natural indexing
>> consistently for several different kinds of mesh files, so this behavior
>> needs to be guaranteed consistent.
>>
>>
>> I can't tell what is going on in your code unless I can run it. Do you
>> have a simple example?
>>
>>   Thanks,
>>
>>      Matt
>>
>>
>> Best,
>>
>> On Thu, Jul 14, 2022 at 6:43 PM Matthew Knepley <knepley at gmail.com>
>> wrote:
>>
>> On Thu, Jul 14, 2022 at 5:47 PM Bora Jeong <boraj1021 at gmail.com> wrote:
>>
>> Thank you for the comments. I have these errors when I call PetscSFView()
>> after DMGetNaturalSF()
>>
>> [2]PETSC ERROR: --------------------- Error Message
>> --------------------------------------------------------------
>> [2]PETSC ERROR: Null argument, when expecting valid pointer
>> [2]PETSC ERROR: Null Pointer: Parameter # 1
>> [2]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
>> [2]PETSC ERROR: Petsc Release Version 3.17.0, unknown
>> [2]PETSC ERROR: #1 PetscSFView() at
>> [2]PETSC ERROR: #2 User provided function() at User file:0
>> Abort(85) on node 2 (rank 0 in comm 16): application called
>> MPI_Abort(MPI_COMM_SELF, 85) - process 0
>>
>>
>> Clearly NULL was returned, which means no natural SF was created.
>>
>>
>> Below is the structure to load a mesh file from gmsh:
>>
>>   call DMCreate(PETSC_COMM_WORLD, dm, ierr);CHKERRA(ierr)
>>   call DMSetType(dm, plex, ierr);CHKERRA(ierr)
>>   ! Must be set before the DM is distributed
>>   call DMSetUseNatural(dm, PETSC_TRUE, ierr);CHKERRA(ierr)
>>   ! Reads the mesh from the options (e.g. -dm_plex_filename) and may
>>   ! distribute it automatically
>>   call DMSetFromOptions(dm, ierr);CHKERRA(ierr)
>>   call DMGetNaturalSF(dm, sf_nat, ierr);CHKERRA(ierr)
>>   call PetscSFView(sf_nat, PETSC_VIEWER_STDOUT_WORLD, ierr);CHKERRA(ierr)
>>
>>
>> The natural SF is created during mesh distribution. That has not happened
>> here. This means that
>> the order of cells is identical to the file it was read from.
>>
>>   Thanks,
>>
>>       Matt
>>
>>
>> Best
>>
>> On Thu, Jul 14, 2022 at 10:49 AM Matthew Knepley <knepley at gmail.com>
>> wrote:
>>
>> On Wed, Jul 13, 2022 at 10:17 PM Bora Jeong <boraj1021 at gmail.com> wrote:
>>
>> Dear petsc team,
>>
>> I am a user of DMPlex for a finite volume code, and there is a necessity
>> to know the global index of each cell. Here the global index means the
>> indexing that can be found in the mesh file itself, without distribution
>> over processors. It seems the petsc community denotes this indexing as
>> "natural".
>>
>> What I want to do is to create a local array (not a petsc vector but just
>> an array variable in the program) to map the distributed cell ID to the
>> natural cell ID, for example, an array "A";
>> A(distributed_node_ID) = natural_node_ID
>>
>> There are some petsc functions to support mapping between global and
>> natural vectors. However, I just need to define the array "A" as in the
>> above example. To achieve this, what is a proper/smart way? In other words,
>> how can I extract the natural_cell_ID from a distributed local_cell_ID?
>>
>> I turned on DMSetUseNatural(DM, PETSC_TRUE) before distribution, but
>> after this, defining all the required section and star forest objects to
>> get natural and global vectors seems not the most direct way to achieve my
>> purpose, which is just to extract the above mapping array "A". Can I get
>> any comments about it?
>>
>>
>> There is only one thing created, the sfNatural PetscSF object, which you
>> can get with DMGetNaturalSF(). The roots of this SF are
>> the global numbers of dofs stored in PETSc vectors, and the leaves are the
>> natural numbers for these dofs. Thus, when we map global
>> vectors to natural vectors in DMPlexGlobalToNaturalBegin/End(), we
>> call PetscSFBcastBegin/End(). Mapping natural to global, we call
>> PetscSFReduceBegin/End(). You could pull the information out of the SF
>> using PetscSFGetGraph() if you want.
>>
>>   Thanks,
>>
>>     Matt
>>
>>
>> Regards
>> Mo
>>
>>
>>
>>
>> <test.F90>
>>
>>
>> --
>> Tier 1 Canada Research Chair in Mathematical and Computational Aspects of
>> Solid Mechanics
>> Professor, Department of Mathematics & Statistics
>> Hamilton Hall room 409A, McMaster University
>> 1280 Main Street West, Hamilton, Ontario L8S 4K1, Canada
>> https://www.math.mcmaster.ca/bourdin | +1 (905) 525 9140 ext. 27243
>>
>>

-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/

