multiple rhs
Barry Smith
bsmith at mcs.anl.gov
Fri Mar 13 20:00:43 CDT 2009
On Mar 12, 2009, at 8:11 PM, David Fuentes wrote:
>
> I'm getting "external library" errors from plapack in
>
> MatMatMult_MPIDense_MPIDense
>
> How is memory handled for a matrix of type MATMPIDENSE? Are all
> NxN entries allocated and ready for use at creation time?
Yes, it has all zeros in it.
> Or do I have to call MatSetValues() and then assemble before the
> matrix is ready to use?
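You do still need the assembly calls after setting values. A minimal
sketch (the size and fill loop are illustrative, assuming the 3.0-era
API; error checking omitted):

  Mat         A;
  PetscInt    i, j, Istart, Iend, N = 128;  /* N is an illustrative global size */
  PetscScalar v = 1.0;

  /* all N x N entries are allocated (as zeros) at creation time */
  MatCreateMPIDense(PETSC_COMM_WORLD, PETSC_DECIDE, PETSC_DECIDE, N, N, PETSC_NULL, &A);
  MatGetOwnershipRange(A, &Istart, &Iend);
  for (i = Istart; i < Iend; i++) {
    for (j = 0; j < N; j++) {
      MatSetValues(A, 1, &i, 1, &j, &v, INSERT_VALUES);
    }
  }
  /* required before the matrix can be used in MatMatMult() etc. */
  MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);
  MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);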
>
>
>
>
>
> [0]PETSC ERROR: --------------------- Error Message ------------------------------------
> [0]PETSC ERROR: Error in external library!
> [1]PETSC ERROR: --------------------- Error Message ------------------------------------
> [1]PETSC ERROR: Error in external library!
> Due to aparent bugs in PLAPACK,this is not currently supported!
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
I could not get plapack to handle nonsquare matrices properly for
these operations. I tried hard to debug it, but plapack is a mess of
complexity and I got very frustrated. If you take out the generation
of this error message so that the code runs, then you can try to
debug plapack yourself (I am pretty sure the problem is in plapack,
not in the PETSc interface).
Barry
>
> [1]PETSC ERROR: Due to aparent bugs in PLAPACK,this is not currently supported!
> [1]PETSC ERROR: ------------------------------------------------------------------------
> [1]PETSC ERROR: Petsc Release Version 3.0.0, Patch 4, Fri Mar 6 14:46:08 CST 2009
> [1]PETSC ERROR: See docs/changes/index.html for recent updates.
> [1]PETSC ERROR: See docs/faq.html for hints about trouble shooting.
>
>
>
>
>
> On Thu, 12 Mar 2009, Hong Zhang wrote:
>
>>
>>>> Is MatCreateMPIDense the recommended matrix type to interface
>>>> with MUMPS? Does it use sparse direct storage or allocate the
>>>> full n x n matrix?
>>> No, MUMPS is "sparse direct" so it uses MPIAIJ.
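A sketch of creating the sparse MPIAIJ matrix that MUMPS expects
(assuming the 3.0-era API; the size and preallocation counts are
illustrative):

  Mat      A;
  PetscInt N = 128;  /* illustrative global size */

  MatCreateMPIAIJ(PETSC_COMM_WORLD, PETSC_DECIDE, PETSC_DECIDE, N, N,
                  5, PETSC_NULL,   /* nonzeros per row, diagonal block */
                  5, PETSC_NULL,   /* nonzeros per row, off-diagonal block */
                  &A);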
>>
>> For an MPI dense matrix, you can use plapack.
>>
>> Hong
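With a build configured for plapack (e.g. --download-plapack), the
selection is just the package argument to MatGetFactor(); a sketch,
assuming MAT_SOLVER_PLAPACK is available in your 3.0-era install:

  Mat Afact;
  MatGetFactor(A, MAT_SOLVER_PLAPACK, MAT_FACTOR_LU, &Afact);
  /* then MatLUFactorSymbolic()/MatLUFactorNumeric()/MatMatSolve() as usual */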
>>>> df
>>>> On Thu, 12 Mar 2009, Matthew Knepley wrote:
>>>>
>>>>> You can try using a sparse direct solver like MUMPS instead of
>>>>> PETSc LU.
>>>>>
>>>>> Matt
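Selecting MUMPS in place of the built-in LU is again just the package
argument; a sketch, assuming PETSc 3.0 was configured with
--download-mumps and A is an MPIAIJ matrix:

  Mat           Afact;
  IS            isrow, iscol;
  MatFactorInfo info;

  MatFactorInfoInitialize(&info);
  MatGetFactor(A, MAT_SOLVER_MUMPS, MAT_FACTOR_LU, &Afact);
  MatGetOrdering(A, MATORDERING_ND, &isrow, &iscol);
  MatLUFactorSymbolic(Afact, A, isrow, iscol, &info);
  MatLUFactorNumeric(Afact, A, &info);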
>>>>> On Thu, Mar 12, 2009 at 9:17 AM, David Fuentes <fuentesdt at gmail.com> wrote:
>>>>>
>>>>>> Thanks Hong,
>>>>>> The complete error message is attached. I think I just had too
>>>>>> big of a matrix. The matrix I'm trying to factor is 327680 x 327680.
>>>>>> [0]PETSC ERROR: --------------------- Error Message ------------------------------------
>>>>>> [0]PETSC ERROR: Out of memory. This could be due to allocating
>>>>>> [0]PETSC ERROR: too large an object or bleeding by not properly
>>>>>> [0]PETSC ERROR: destroying unneeded objects.
>>>>>> [0]PETSC ERROR: Memory allocated 2047323584 Memory used by process 2074058752
>>>>>> [0]PETSC ERROR: Try running with -malloc_dump or -malloc_log for info.
>>>>>> [0]PETSC ERROR: Memory requested 1258466480!
>>>>>> [0]PETSC ERROR: ------------------------------------------------------------------------
>>>>>> [0]PETSC ERROR: Petsc Release Version 3.0.0, Patch 2, Wed Jan 14 22:57:05 CST 2009
>>>>>> [0]PETSC ERROR: See docs/changes/index.html for recent updates.
>>>>>> [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting.
>>>>>> [0]PETSC ERROR: See docs/index.html for manual pages.
>>>>>> [0]PETSC ERROR: ------------------------------------------------------------------------
>>>>>> [0]PETSC ERROR: ./RealTimeImaging on a gcc-4.1.2 named DIPWS019 by dfuentes Wed Mar 11 20:30:37 2009
>>>>>> [0]PETSC ERROR: Libraries linked from /usr/local/petsc/petsc-3.0.0-p2/gcc-4.1.2-mpich2-1.0.7-dbg/lib
>>>>>> [0]PETSC ERROR: Configure run at Sat Jan 31 06:53:09 2009
>>>>>> [0]PETSC ERROR: Configure options --download-f-blas-lapack=ifneeded --with-mpi-dir=/usr/local --with-matlab=1 --with-matlab-engine=1 --with-matlab-dir=/usr/local/matlab2007a --CFLAGS=-fPIC --with-shared=0
>>>>>> [0]PETSC ERROR: ------------------------------------------------------------------------
>>>>>> [0]PETSC ERROR: PetscMallocAlign() line 61 in src/sys/memory/mal.c
>>>>>> [0]PETSC ERROR: PetscTrMallocDefault() line 194 in src/sys/memory/mtr.c
>>>>>> [0]PETSC ERROR: PetscFreeSpaceGet() line 14 in src/mat/utils/freespace.c
>>>>>> [0]PETSC ERROR: MatLUFactorSymbolic_SeqAIJ() line 381 in src/mat/impls/aij/seq/aijfact.c
>>>>>> [0]PETSC ERROR: MatLUFactorSymbolic() line 2289 in src/mat/interface/matrix.c
>>>>>> [0]PETSC ERROR: KalmanFilter::DirectStateUpdate() line 456 in unknowndirectory/src/KalmanFilter.cxx
>>>>>> [0]PETSC ERROR: GeneratePRFTmap() line 182 in unknowndirectory/src/MainDriver.cxx
>>>>>> [0]PETSC ERROR: main() line 90 in unknowndirectory/src/MainDriver.cxx
>>>>>> application called MPI_Abort(MPI_COMM_WORLD, 55) - process 0
>>>>>> [unset]: aborting job:
>>>>>> application called MPI_Abort(MPI_COMM_WORLD, 55) - process 0
>>>>>> On Thu, 12 Mar 2009, Hong Zhang wrote:
>>>>>>
>>>>>>> David,
>>>>>>> I do not see any problem with the calling sequence.
>>>>>>> The memory requirement is determined in MatLUFactorSymbolic().
>>>>>>> Does your code crash within MatLUFactorSymbolic()?
>>>>>>> Please send us the complete error message.
>>>>>>> Hong
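(For reference, the "Memory requested" figure is in bytes. One thing
worth double-checking in the sequence below is that the MatFactorInfo
struct is initialized before the symbolic factorization; a sketch,
assuming the 3.0-era API, with an illustrative fill estimate:

  MatFactorInfo info;
  MatFactorInfoInitialize(&info);  /* sets safe defaults for all fields */
  info.fill = 5.0;                 /* predicted fill ratio; illustrative guess */
  MatLUFactorSymbolic(Afact, A, isrow, iscol, &info);

An uninitialized info.fill can lead to a huge allocation request in
the symbolic phase.)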
>>>>>>> On Wed, 11 Mar 2009, David Fuentes wrote:
>>>>>>>
>>>>>>>> Hello,
>>>>>>>> I have a sparse matrix, A, with which I want to solve multiple
>>>>>>>> right-hand sides with a direct solver. Is this the correct call
>>>>>>>> sequence?
>>>>>>>>
>>>>>>>> IS isrow, iscol;
>>>>>>>> MatFactorInfo info;
>>>>>>>> MatGetFactor(A, MAT_SOLVER_PETSC, MAT_FACTOR_LU, &Afact);
>>>>>>>> MatGetOrdering(A, MATORDERING_ND, &isrow, &iscol);
>>>>>>>> MatLUFactorSymbolic(Afact, A, isrow, iscol, &info);
>>>>>>>> MatLUFactorNumeric(Afact, A, &info);
>>>>>>>> MatMatSolve(Afact, B, X);
>>>>>>>> My solve keeps running out of memory with
>>>>>>>> "[0]PETSC ERROR: Memory requested xxx!"
>>>>>>>> Is this in bytes? I can't tell if the problem I'm trying to solve
>>>>>>>> is too large for my machine or if I just have a bug in the call
>>>>>>>> sequence.
>>>>>>>> thank you,
>>>>>>>> David Fuentes
>>>>> --
>>>>> What most experimenters take for granted before they begin their
>>>>> experiments is infinitely more interesting than any results to
>>>>> which their experiments lead.
>>>>> -- Norbert Wiener
>>