[petsc-users] Preallocation (Unstructured FE)

Matthew Knepley knepley at gmail.com
Tue Aug 2 15:40:47 CDT 2011


On Tue, May 17, 2011 at 4:53 PM, Tabrez Ali <stali at geology.wisc.edu> wrote:

>
> Matt
>
> Any linear 2D/3D element would do. A linear tetrahedron would be good to
> start with.
>

I have finally gotten to this; sorry about the wait. You will need
petsc-dev. I have added

  src/dm/impls/mesh/examples/tutorials/ex4f90.F

I coded it using the Fortran datatypes option. If you do not use this, just
change

  type(Mat) --> Mat

and so on, and it should work fine. The example reads in an ExodusII mesh (the
one I used is attached) and then defines a linear element. It prints out the
properly allocated matrix. Let me know if this helps.
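
For reference, the change really is just in the declarations (this is only a
sketch, not copied from ex4f90.F, and the include lines it needs are omitted):

  ! With the Fortran datatypes option, PETSc objects are derived types:
  type(Mat) :: A
  ! Without the option, the same object is declared as an opaque handle:
  Mat          A

Everything else in the example stays the same.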

  Thanks,

     Matt


> Thanks
> Tabrez
>
>
> On 05/17/2011 11:21 AM, Matthew Knepley wrote:
>
> On Tue, May 17, 2011 at 9:36 AM, Tabrez Ali <stali at geology.wisc.edu> wrote:
>
>>  Matt
>>
>> An example which reads an unstructured mesh (ASCII or Exodus), partitions
>> it, and sets up the Mat (with the right preallocation) would be great. If
>> it is in Fortran, that would be perfect. By the way, my problem also has
>> Lagrange multiplier constraints.
>>
>
>  Yes, I should have been more specific. What element?
>
>    Thanks,
>
>       Matt
>
>
>>  Thanks
>>  Tabrez
>>
>>
>> On 05/17/2011 08:25 AM, Matthew Knepley wrote:
>>
>> On Mon, May 2, 2011 at 8:16 AM, Tabrez Ali <stali at geology.wisc.edu> wrote:
>>
>>>  Is there a way I can use this and other mesh routines from Fortran? The
>>> manual doesn't say much on this.
>>>
>>
>>  Yes, but you are right that nothing is in the manual. DMMESH (in
>> petsc-dev) now obeys the full DM interface,
>> so that DMGetMatrix() will return you a properly allocated Mat. So what is
>> the problem? Of course, it is that
>> PETSc has no good way to specify what finite element you are dealing with.
>>
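
As a minimal sketch of that from Fortran (assuming a DM called dm has already
been created from the mesh, using the plain non-datatypes declarations, and
following the usual PETSc Fortran convention of an appended error code):

  Mat            A
  PetscErrorCode ierr
  ! DMGetMatrix hands back a Mat whose nonzero pattern is already
  ! preallocated for the mesh in dm, so MatSetValues() during assembly
  ! should trigger no mallocs
  call DMGetMatrix(dm, MATAIJ, A, ierr)
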
>>  The way I was doing this was to encode it using some C++ classes, which
>> turned out to be a bad way to do things.
>>  I am currently reworking it so that this information is stored in a
>> simple C struct that you can produce. Should
>> have this done soon.
>>
>>  Can you mail me a description of an example you would like to run?
>>
>>    Thanks,
>>
>>       Matt
>>
>>
>>>
>>> Tabrez
>>>
>>>
>>> On 05/01/2011 09:53 AM, Matthew Knepley wrote:
>>>
>>> On Sat, Apr 30, 2011 at 12:58 PM, Tabrez Ali <stali at geology.wisc.edu> wrote:
>>>
>>>> Petsc Developers/Users
>>>>
>>>> I am having some performance issues with preallocation in a fully
>>>> unstructured FE code. It would be very helpful if those using FE codes
>>>> could comment.
>>>>
>>>> For a problem with 100K nodes and 600K tet elements (on 1 CPU):
>>>>
>>>> 1. If I calculate the _exact_ number of non-zeros per row (using a
>>>> running list in Fortran) by looping over nodes & elements, the code takes
>>>> 17 mins (to calculate the nnz's per row, assemble, and solve).
>>>> 2. If I don't use a running list and simply get the average of the max
>>>> number of nodes a node might be connected to (again by looping over nodes
>>>> & elements, but without a running list), then it takes 8 mins.
>>>> 3. If I just magically guess the right value calculated in 2 and use
>>>> that as the average nnz per row, then it takes only 25 secs.
>>>>
>>>> Basically, in all cases the assembly and solve are very fast (a few
>>>> seconds), but the nnz calculation itself (in 1 and 2) takes a long time.
>>>> How can this be cut down? Is there a heuristic way to estimate the number
>>>> (as done in 3), even if it slightly overestimates the nnz's per row, or
>>>> are there efficient ways to do steps 1 or 2? Right now I have
>>>> do i=1,num_nodes; do j=1,num_elements ... which is obviously slow for a
>>>> large number of nodes/elements.
>>>>
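
For what it is worth, the usual way to avoid the num_nodes x num_elements
double loop is to build a node-to-element map once, and then visit only the
elements that actually touch each node. A plain-Fortran sketch (the array
names here are made up, so adapt them to your own mesh data; for a vector
problem multiply the counts by the number of dofs per node):

  program nnz_sketch
    implicit none
    integer, parameter :: nen = 4            ! nodes per linear tet
    integer :: num_nodes, num_elements
    integer, allocatable :: conn(:,:)        ! conn(1:nen, 1:num_elements)
    integer, allocatable :: cnt(:), ptr(:), n2e(:), nnz(:), mark(:)
    integer :: e, i, j, n, nb, ee

    ! a single hard-coded tet so the sketch runs; read your mesh here instead
    num_nodes = 4;  num_elements = 1
    allocate(conn(nen, num_elements));  conn(:,1) = (/ 1, 2, 3, 4 /)

    ! pass 1: how many elements touch each node
    allocate(cnt(num_nodes));  cnt = 0
    do e = 1, num_elements
      do i = 1, nen
        cnt(conn(i,e)) = cnt(conn(i,e)) + 1
      end do
    end do

    ! prefix sum -> CSR-style pointers into the node-to-element list
    allocate(ptr(num_nodes+1));  ptr(1) = 1
    do n = 1, num_nodes
      ptr(n+1) = ptr(n) + cnt(n)
    end do

    ! pass 2: fill the node-to-element list
    allocate(n2e(ptr(num_nodes+1)-1));  cnt = 0
    do e = 1, num_elements
      do i = 1, nen
        n = conn(i,e)
        n2e(ptr(n) + cnt(n)) = e
        cnt(n) = cnt(n) + 1
      end do
    end do

    ! pass 3: count the distinct neighbours of each node (itself included);
    ! the 'mark' trick avoids re-zeroing a scratch array for every row
    allocate(nnz(num_nodes), mark(num_nodes));  mark = 0;  nnz = 0
    do n = 1, num_nodes
      do j = ptr(n), ptr(n+1)-1
        ee = n2e(j)
        do i = 1, nen
          nb = conn(i,ee)
          if (mark(nb) /= n) then
            mark(nb) = n
            nnz(n) = nnz(n) + 1
          end if
        end do
      end do
    end do

    print *, 'exact nnz per row:', nnz

  end program nnz_sketch

This is the same idea as step 1, but the cost is proportional to the number
of (node, element) incidences rather than num_nodes * num_elements.
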
>>>
>>>  If you want to see my code doing this, look at
>>>
>>>    include/petscdmmesh.hh:preallocateOperatorNew()
>>>
>>>  which handles the determination of nonzero structure for a FEM
>>> operator. It should look mostly
>>> like your own code.
>>>
>>>      Matt
>>>
>>>
>>>> Thanks in advance
>>>>  Tabrez
>>>>
>>>
>>>
>>
>
>


-- 
What most experimenters take for granted before they begin their experiments
is infinitely more interesting than any results to which their experiments
lead.
-- Norbert Wiener
-------------- next part --------------
A non-text attachment was scrubbed...
Name: Square.gen
Type: application/octet-stream
Size: 2852 bytes
Desc: not available
URL: <http://lists.mcs.anl.gov/pipermail/petsc-users/attachments/20110802/631a839e/attachment-0001.obj>

