[petsc-users] local row calculation in 3D

ilyas ilyas ilyascfd at gmail.com
Sun Apr 24 07:35:25 CDT 2011


Thank you Randall,
I guess I will follow Jed's and Matt's suggestions.

Ilyas.

2011/4/19 Randall Mackie <rlmackie862 at gmail.com>

> You are right! I just didn't read all the way to the end of your email.
> Sorry about that.
> So here is a little more code that does it correctly:
>
>       PetscInt, pointer :: ltog(:)
>
>       call DAGetGlobalIndicesF90(da,nloc,ltog,ierr); CHKERRQ(ierr)
>
>
>       do kk=zs,zs+zm-1
>         do jj=ys,ys+ym-1
>           do ii=xs,xs+xm-1
>
>             row=ii-gxs + (jj-gys)*gxm + (kk-gzs)*gxm*gym
>             grow=ltog(3*row + 1)
>
> [all your code here]
>
>               call MatSetValues(A,i1,grow,ic,col,v,INSERT_VALUES,
>      .             ierr); CHKERRQ(ierr)
>
> [more code here]
>
>       call MatAssemblyBegin(A,MAT_FINAL_ASSEMBLY,ierr); CHKERRQ(ierr)
>       call MatAssemblyEnd(A,MAT_FINAL_ASSEMBLY,ierr); CHKERRQ(ierr)
>
>
>
> Hope this is a little more helpful. As Jed points out, there are other ways
> to do the same
> thing (and probably more efficiently than what I've outlined here).
>
> Randy M.
>
>
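(For reference, a minimal sketch of where the corner variables used in the
loop above come from; it assumes a 3D DA handle named da, declarations via
the usual PETSc Fortran includes, and the DA routines of that PETSc
generation.)

      PetscInt xs,ys,zs,xm,ym,zm
      PetscInt gxs,gys,gzs,gxm,gym,gzm
      PetscErrorCode ierr

!     Owned (non-ghosted) corner and widths of this process's subdomain
      call DAGetCorners(da,xs,ys,zs,xm,ym,zm,ierr)
      CHKERRQ(ierr)

!     Ghosted corner and widths; row = ii-gxs + (jj-gys)*gxm
!     + (kk-gzs)*gxm*gym is a zero-based index into this ghosted
!     local ordering, which ltog(...) then maps to a global row
      call DAGetGhostCorners(da,gxs,gys,gzs,gxm,gym,gzm,ierr)
      CHKERRQ(ierr)
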
>
> On Tue, Apr 19, 2011 at 12:00 AM, ilyas ilyas <ilyascfd at gmail.com> wrote:
>
>> Hi Randy,
>>
>> Thank you for your answer.
>>
>> I have already done it. You can see it in my first e-mail.
>>
>> It does not work properly for every number of processors.
>> For certain processor counts it works correctly,
>> but not for all of them.
>> For example, it is fine for 1, 2, or 3 processors,
>> but for 4 processors it puts values in the wrong locations, and so on.
>> The "problem" occurs in the 3rd dimension ( (kk-gzs)*gxm*gym ).
>>
>> Here is another suggestion (which I have not tried yet):
>>
>>        do kk=zs,zs+zm-1
>>         do jj=ys,ys+ym-1
>>           do ii=xs,xs+xm-1
>>
>>             row=ii-gxs + (jj-gys)*MX + (kk-gzs)*MX*MY
>>
>> MX, MY, MZ are the global dimensions. This is also what I do in the serial code.
>>
>> Do you think this is correct, or do you have any other suggestions?
>>
>> Regards,
>> Ilyas.
>>
>> 2011/4/18 Randall Mackie <rlmackie862 at gmail.com>
>>
>>> Here's how I do it:
>>>
>>>        do kk=zs,zs+zm-1
>>>         do jj=ys,ys+ym-1
>>>           do ii=xs,xs+xm-1
>>>
>>>              row=ii-gxs + (jj-gys)*gxm + (kk-gzs)*gxm*gym
>>>
>>>
>>> Good luck,
>>>
>>> Randy M.
>>>
>>>
>>>
>>> On Mon, Apr 18, 2011 at 6:54 AM, ilyas ilyas <ilyascfd at gmail.com> wrote:
>>>
>>>> Hi,
>>>> Thank you for your suggestion. I will take it into account.
>>>> Since changing this structure in my "massive" code may take too much
>>>> time, I would like to know how "row" is calculated in 3D, independently
>>>> of the number of processors.
>>>>
>>>> Regards,
>>>> Ilyas
>>>>
>>>> 2011/4/18 Matthew Knepley <knepley at gmail.com>
>>>>
>>>>> On Mon, Apr 18, 2011 at 8:34 AM, ilyas ilyas <ilyascfd at gmail.com> wrote:
>>>>>
>>>>>> Hi,
>>>>>>
>>>>>> In ex14f.F in KSP, "row" variable is calculated either
>>>>>>
>>>>>
>>>>> These are very old. I suggest you use the FormFunctionLocal() approach
>>>>> in ex5f.F, which does not calculate global row numbers when using a DA.
>>>>>
>>>>>    Matt
>>>>>
>>>>>
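(A rough sketch of the local-indexing idea behind that approach, with a
7-point Laplacian as a stand-in residual; the subroutine name, argument
list, and one-unknown-per-node assumption are illustrative only, not the
actual ex5f.F interface, which is 2D. The point is that only (i,j,k)
indices appear, so nothing depends on how the grid is partitioned.)

      subroutine FormFunctionLocal3D(x,f,mx,my,mz,xs,xm,ys,ym,zs,zm,
     .                               gxs,gxm,gys,gym,gzs,gzm)
      implicit none
!     Assumes the usual PETSc Fortran includes for PetscInt/PetscScalar
!     and a DA created with stencil width >= 1
      PetscInt mx,my,mz
      PetscInt xs,xm,ys,ym,zs,zm
      PetscInt gxs,gxm,gys,gym,gzs,gzm
!     x: ghosted local array, f: owned part of the residual
      PetscScalar x(gxs:gxs+gxm-1,gys:gys+gym-1,gzs:gzs+gzm-1)
      PetscScalar f(xs:xs+xm-1,ys:ys+ym-1,zs:zs+zm-1)
      PetscInt i,j,k

      do k=zs,zs+zm-1
        do j=ys,ys+ym-1
          do i=xs,xs+xm-1
            if (i.eq.0 .or. i.eq.mx-1 .or. j.eq.0 .or.
     .          j.eq.my-1 .or. k.eq.0 .or. k.eq.mz-1) then
!             Dirichlet boundary node: residual just pins the value
              f(i,j,k) = x(i,j,k)
            else
!             Interior node: 7-point stencil with purely local indexing
              f(i,j,k) = 6.0*x(i,j,k)
     .                 - x(i-1,j,k) - x(i+1,j,k)
     .                 - x(i,j-1,k) - x(i,j+1,k)
     .                 - x(i,j,k-1) - x(i,j,k+1)
            endif
          enddo
        enddo
      enddo
      return
      end
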
>>>>>> do 30 j=ys,ys+ym-1
>>>>>> ...
>>>>>> do 40 i=xs,xs+xm-1
>>>>>>          row = i - gxs + (j - gys)*gxm + 1
>>>>>>
>>>>>> or
>>>>>>
>>>>>> do 50 j=ys,ys+ym-1
>>>>>> ...
>>>>>> row = (j - gys)*gxm + xs - gxs
>>>>>> do 60 i=xs,xs+xm-1
>>>>>>          row = row + 1
>>>>>>
>>>>>> How can I calculate "row" in 3D ?
>>>>>>
>>>>>> I tried this;
>>>>>>
>>>>>> do k=zs,zs+zm-1
>>>>>>    do j=ys,ys+ym-1
>>>>>>       do i=xs,xs+xm-1
>>>>>>
>>>>>>            row = i - gxs + (j - gys)*gxm + (k - gzs)*gxm*gym + 1
>>>>>>
>>>>>> It does not work for certain numbers of processors.
>>>>>>
>>>>>>
>>>>>> Thanks,
>>>>>>
>>>>>> Ilyas
>>>>>>
>>>>>
>>>>>
>>>>>
>>>>> --
>>>>> What most experimenters take for granted before they begin their
>>>>> experiments is infinitely more interesting than any results to which their
>>>>> experiments lead.
>>>>> -- Norbert Wiener
>>>>>
>>>>
>>>>
>>>
>>
>