[petsc-users] 2 levels of parallelism for ASM

Raeth, Peter PRaeth at hpti.com
Mon Jan 24 08:13:51 CST 2011


* blush *    Thought this was an internal list.    Looking forward to the PETSC user's group discussion.

Peter G. Raeth, Ph.D.
Senior Staff Scientist
Signal and Image Processing
High Performance Technologies, Inc
937-904-5147
praeth at hpti.com

________________________________________
From: petsc-users-bounces at mcs.anl.gov [petsc-users-bounces at mcs.anl.gov] on behalf of Raeth, Peter [PRaeth at hpti.com]
Sent: Monday, January 24, 2011 9:10 AM
To: PETSc users list
Subject: Re: [petsc-users] 2 levels of parallelism for ASM

Hello Thomas.

Am only just now learning how to use PETSc. My particular concern is calculating the Kronecker Tensor Product. Running out of memory is the biggest roadblock, even though I think I have four times the required memory available and am destroying matrices as they are no longer needed.

Would like to lurk on your thread as a learning exercise. Do you want to start a thread on PETSc's users' group?


Best,

Peter.

Peter G. Raeth, Ph.D.
Senior Staff Scientist
Signal and Image Processing
High Performance Technologies, Inc
937-904-5147
praeth at hpti.com

________________________________________
From: petsc-users-bounces at mcs.anl.gov [petsc-users-bounces at mcs.anl.gov] on behalf of DUFAUD THOMAS [thomas.dufaud at univ-lyon1.fr]
Sent: Monday, January 24, 2011 9:09 AM
To: petsc-users at mcs.anl.gov
Subject: [petsc-users] 2 levels of parallelism for ASM

Hi,
I noticed that the local solution of an ASM preconditioner is performed on a single processor per domain, usually setting a KSP PREONLY to perform an ILU factorization.
I would like to perform those local solution with a krylov method (GMRES) among a set of processors.

Is it possible, for an ASM preconditioner, to set a subgroup of processors per domain and then define parallel sub-solver over a sub-communicator?

If it is the case how can I manage operation such as MatIncreaseOverlap?
If it is not the case, does it exist a way to do that in PETSc?

Thanks,

Thomas


More information about the petsc-users mailing list