Petsc questions
Matthew Knepley
knepley at gmail.com
Thu Aug 14 13:33:35 CDT 2008
On Thu, Aug 14, 2008 at 1:22 PM, Nguyen, Hung V ERDC-ITL-MS
<Hung.V.Nguyen at usace.army.mil> wrote:
>
> Barry,
>
> Thanks for the info. I will use what we already have. How about
> estimating the matrix condition number via PETSc?
You can use -ksp_monitor_singular_value to print out the extreme singular
values of the Hessenberg matrix generated by the Krylov method. I don't
think we have anything else. Condition number estimation is a tough problem.
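
If you want the numbers programmatically, a minimal sketch in Fortran
(to match your fragment below; ksp, b, and x are assumed to be set up
already, and smax/smin are PetscReal) would be something like:

      ! Ask the KSP to keep the data needed for singular value estimates
      ! (only meaningful for methods like GMRES or CG that build the
      ! projected problem).
      call KSPSetComputeSingularValues (ksp, PETSC_TRUE, ierr)
      call KSPSolve (ksp, b, x, ierr)
      ! Extreme singular values of the preconditioned operator, estimated
      ! from the projected system; their ratio is a rough condition number.
      call KSPComputeExtremeSingularValues (ksp, smax, smin, ierr)
      print*, 'estimated condition number = ', smax / smin

(With -ksp_monitor_singular_value alone, the estimates are printed at
every iteration instead.)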
Matt
> -Hung
>
> -----Original Message-----
> From: owner-petsc-users at mcs.anl.gov [mailto:owner-petsc-users at mcs.anl.gov] On
> Behalf Of Barry Smith
> Sent: Thursday, August 14, 2008 11:08 AM
> To: petsc-users at mcs.anl.gov
> Subject: Re: Petsc questions
>
>
> Check the AO functions. They renumber between two different orderings.
>
> But, if your code already puts everything in the new numbering, you can just
> use PETSc with that new numbering and leave your ordering code as it is.
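>
> For instance, a minimal sketch with the AO routines (hedged: it reuses
> the names from your fragment, assumes the first nown entries of nglobal
> and npetsc correspond to the owned nodes, and PETSc indices are 0-based,
> so your 1-based Fortran numbering needs a shift of -1 first):
>
>       ! Build an AO mapping the original (application) global numbers
>       ! to the new petsc numbers.
>       call AOCreateBasic (PETSC_COMM_WORLD, nown, nglobal, npetsc,
>      &     ao, ierr)
>       ! Translate any integer index list in place, application -> petsc.
>       call AOApplicationToPetsc (ao, nloc, indices, ierr)
>       call AODestroy (ao, ierr)
>
> Here indices is whatever array of application numbers you want
> translated; AOPetscToApplication goes the other way.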
>
> Barry
>
> On Aug 14, 2008, at 9:37 AM, Nguyen, Hung V ERDC-ITL-MS wrote:
>
>>
>> Hello All,
>>
>> I am new to PETSc. I have the following questions about PETSc:
>>
>> 1. Do PETSc functions help to estimate or calculate a matrix condition
>> number? If yes, can I get info on how to do it?
>>
>> 2. A question about renumbering nodes: our CFD code uses ParMetis to
>> compute the original partitioning of the mesh. The global nodes are
>> then renumbered consecutively within each ParMetis partition, and
>> npetsc is a mapping vector from the original global node numbering to
>> the new numbering (see the test code below). My question is whether a
>> PETSc function helps to renumber from the ParMetis partition to the
>> PETSc partition.
>>
>> Thanks for your help.
>>
>> Regards,
>>
>> -Hung
>> -- code:
>> ! Read the data.
>>
>>       fname = 'petsc.dat'
>>       call parnam (fname)
>>       open (2, file = fname, status = 'old')
>>
>> ! No. global nodes, local nodes, owned nodes,
>> ! compressed columns, PEs.
>>
>>       read (2, '(5i10)') ng, nloc, nown, ncol, npes
>>
>>       if (noproc .ne. npes) then
>>          if (myid .eq. 0) then
>>             print*, 'Number of PEs from the data file does not match',
>>      &              ' the number from the run command.'
>>          end if
>>          call PetscFinalize (ierr)
>>          stop
>>       end if
>>
>> ! Local node array containing global node numbers.
>>
>>       allocate (nglobal(nloc))
>>
>>       read (2, '(8i10)') nglobal
>>
>> ! Find petsc numbering scheme.
>>
>>       allocate (nown_all(noproc))
>>       allocate (idisp(noproc))
>>       allocate (npetsc(nloc))
>>       allocate (nodes1(ng))
>>       allocate (nodes2(ng))
>>
>>       call MPI_ALLGATHER (nown, 1, MPI_INTEGER, nown_all, 1,
>>      &     MPI_INTEGER, PETSC_COMM_WORLD, ierr)
>>
>>       idisp(1) = 0
>>       do i = 2, noproc
>>          idisp(i) = idisp(i - 1) + nown_all(i - 1)
>>       end do
>>
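>> ! Gather the owned global node numbers from every PE in rank order;
>> ! a node's position in nodes1 is its new (petsc) number.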
>>       call MPI_ALLGATHERV (nglobal, nown, MPI_INTEGER, nodes1,
>>      &     nown_all, idisp, MPI_INTEGER, PETSC_COMM_WORLD, ierr)
>>
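>> ! Invert the map: nodes2(original global number) = new petsc number.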
>>       do i = 1, ng
>>          ii = nodes1(i)
>>          nodes2(ii) = i
>>       end do
>>
>> ! Process the local nodes for their petsc numbers.
>>
>>       do i = 1, nloc
>>          ii = nglobal(i)
>>          npetsc(i) = nodes2(ii)
>>       end do
>>
>>
>
>
--
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which
their experiments lead.
-- Norbert Wiener