[petsc-users] Matrix stored on many machines, filled by one process

Bartłomiej W bartlomiej.wach at yahoo.pl
Thu May 19 02:54:22 CDT 2011


Thank you for your answers.

I have another question.

I wanted to try out a 64-bit system, so I set up Ubuntu with 2 GB of RAM and an additional 11 GB of swap, but I still get a memory allocation error like this:

[0] Maximum memory PetscMalloc()ed 2,514,564,304 OS cannot compute size of entire process

Am I missing something when I compile the binaries?

Regards

--- On Wed, 18.5.11, Barry Smith <bsmith at mcs.anl.gov> wrote:

From: Barry Smith <bsmith at mcs.anl.gov>
Subject: Re: [petsc-users] Matrix stored on many machines, filled by one process
To: "PETSc users list" <petsc-users at mcs.anl.gov>
Date: Wednesday, 18 May 2011, 23:32


On May 18, 2011, at 4:05 PM, Matthew Knepley wrote:

> On Wed, May 18, 2011 at 3:54 PM, Bartłomiej W <bartlomiej.wach at yahoo.pl> wrote:
> Hello again,
> 
> I am wondering if it is possible to have a matrix stored across several machines but have only one process set its values? Can I just create the matrix and then simply set the ownership range for process 0 to the full size of the matrix and zero for the others?
> 
> You can create a parallel matrix and then only set values from one process. It's not efficient, but it will work.

   For big problems it will be terribly slow, so slow as to be idiotic.

    Note that in our users manual introduction we specifically say:

PETSc should not be used to attempt to provide a “parallel linear solver” in an otherwise sequential
code. Certainly all parts of a previously sequential code need not be parallelized but the matrix
generation portion must be to expect any kind of reasonable performance. Do not expect to generate
your matrix sequentially and then “use PETSc” to solve the linear system in parallel.
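[Editor's note: not part of the original exchange. Below is a minimal sketch of the approach Matt describes above, i.e. a matrix with the default parallel row distribution whose entries are generated only on rank 0. The global size N and the diagonal fill loop are hypothetical placeholders; error checking is omitted for brevity. Off-owner entries set on rank 0 are stashed locally by MatSetValues() and communicated to their owning processes during assembly, which is the traffic that makes this approach slow for large problems.]

#include <petscmat.h>

int main(int argc, char **argv)
{
  Mat         A;
  PetscMPIInt rank;
  PetscInt    N = 100;                 /* hypothetical global matrix size */

  PetscInitialize(&argc, &argv, NULL, NULL);
  MPI_Comm_rank(PETSC_COMM_WORLD, &rank);

  MatCreate(PETSC_COMM_WORLD, &A);
  MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, N, N); /* rows distributed over all ranks */
  MatSetFromOptions(A);
  MatSetUp(A);

  if (rank == 0) {                     /* entries are generated by rank 0 only */
    for (PetscInt i = 0; i < N; i++) {
      PetscScalar v = 2.0;
      MatSetValues(A, 1, &i, 1, &i, &v, INSERT_VALUES);
    }
  }
  /* assembly ships the stashed off-process entries to their owning ranks */
  MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);
  MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);

  MatDestroy(&A);
  PetscFinalize();
  return 0;
}

[The variant asked about in the original question, giving process 0 the entire ownership range, would instead pass rank == 0 ? N : 0 as the local sizes in MatSetSizes(); the matrix would then be stored entirely on one machine rather than distributed.]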


> 
>    Matt
>  
> Thank you for your answer.
> 
> 
> 
> -- 
> What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
> -- Norbert Wiener


