What is the best solver for a Poisson-type eqn.

Christian Klettner christian.klettner at ucl.ac.uk
Tue Jun 16 07:40:21 CDT 2009


Hi Matt,
I have tried to use MUMPS and came across the following problems. My
application solves Ax=b about 10000-100000 times, with A remaining
constant. When I tried to use MUMPS through KSP I was not seeing good
performance. Could this be because it was refactoring the matrix at each
time step?
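For reference, this is roughly what I understand the KSP route should look
like (a sketch against PETSc 3.0, untested; my understanding is that with
SAME_PRECONDITIONER the factorization is built once, in the first
KSPSolve(), and reused for every later solve):

	KSP ksp;
	PC  pc;
	ierr = KSPCreate(PETSC_COMM_WORLD,&ksp);CHKERRQ(ierr);
	ierr = KSPSetOperators(ksp,A,A,SAME_PRECONDITIONER);CHKERRQ(ierr);
	ierr = KSPSetType(ksp,KSPPREONLY);CHKERRQ(ierr);	/* no Krylov iterations */
	ierr = KSPGetPC(ksp,&pc);CHKERRQ(ierr);
	ierr = PCSetType(pc,PCLU);CHKERRQ(ierr);	/* direct solve */
	ierr = PCFactorSetMatSolverPackage(pc,MAT_SOLVER_MUMPS);CHKERRQ(ierr);
	ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr);

	for(){	/* TEMPORAL LOOP */
		ierr = KSPSolve(ksp,vecr,vecu);CHKERRQ(ierr);	/* should factor only on the first call */
	}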
With this in mind I have tried to implement the following:

A is a parallel matrix created with MatCreateMPIAIJ(). rows is a list of
the global row numbers on the process, and cols is a list of the global
columns that are on the process.
I run the program with ./mpiexec -n 4 ./ex4 -pc_factor_mat_solver_package mumps
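
For context, A is created along these lines (a sketch, not my actual code;
the local size mlocal and the preallocation counts are placeholders):

	Mat A;
	ierr = MatCreateMPIAIJ(PETSC_COMM_WORLD,mlocal,mlocal,
	                       PETSC_DETERMINE,PETSC_DETERMINE,
	                       30,PETSC_NULL,	/* diagonal-block preallocation (placeholder) */
	                       10,PETSC_NULL,	/* off-diagonal preallocation (placeholder) */
	                       &A);CHKERRQ(ierr);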

(1)	MatFactorInfo *info;	/* note: declared as a pointer, never allocated */
(2)	ierr = MatFactorInfoInitialize(info);CHKERRQ(ierr);

(3)	ierr = MatGetFactor(A,MAT_SOLVER_MUMPS,MAT_FACTOR_LU,&F);CHKERRQ(ierr);
(4)	ierr = MatLUFactorSymbolic(F,A,rows,cols,info);CHKERRQ(ierr);
(5)	ierr = MatLUFactorNumeric(F,A,info);CHKERRQ(ierr);


	for(){	/* TEMPORAL LOOP */

		/* CODING */

		ierr = MatSolve(A,vecr,vecu);CHKERRQ(ierr);
	}

Line (2) above gives the following error message:

[0]PETSC ERROR: --------------------- Error Message ------------------------------------
[0]PETSC ERROR: Null argument, when expecting valid pointer!
[0]PETSC ERROR: Trying to zero at a null pointer!
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: Petsc Release Version 3.0.0, Patch 4, Fri Mar  6 14:46:08 CST 2009
[0]PETSC ERROR: See docs/changes/index.html for recent updates.
[0]PETSC ERROR: See docs/faq.html for hints about trouble shooting.
[0]PETSC ERROR: See docs/index.html for manual pages.
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: ./ex4 on a linux-gnu named christian-desktop by christian Tue Jun 16 13:33:33 2009
[0]PETSC ERROR: Libraries linked from /home/christian/Desktop/petsc-3.0.0-p4/linux-gnu-c-debug/lib
[0]PETSC ERROR: Configure run at Mon Jun 15 17:05:31 2009
[0]PETSC ERROR: Configure options --with-cc="gcc -fPIC" --download-mpich=1 --download-f-blas-lapack --download-scalapack --download-blacs --download-mumps --download-parmetis --download-hypre --download-triangle --with-shared=0
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: PetscMemzero() line 189 in src/sys/utils/memc.c
[0]PETSC ERROR: MatFactorInfoInitialize() line 7123 in src/mat/interface/matrix.c
[0]PETSC ERROR: main() line 1484 in src/dm/ao/examples/tutorials/ex4.c
application called MPI_Abort(MPI_COMM_WORLD, 85) - process 0
(rank 2 printed the same "Trying to zero at a null pointer" trace, interleaved with the above)
[0]0:Return code = 85
[0]1:Return code = 0, signaled with Interrupt
[0]2:Return code = 0, signaled with Interrupt
[0]3:Return code = 0, signaled with Interrupt
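
Reading the trace, I suspect the null pointer comes from line (1): info is
declared as a pointer but never allocated, so MatFactorInfoInitialize()
tries to zero a null address. I believe it should be declared by value
(untested):

	MatFactorInfo info;	/* a struct, not an unallocated pointer */
	ierr = MatFactorInfoInitialize(&info);CHKERRQ(ierr);

with &info then passed to the factorization routines.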

/////////////////////////////////////////////////////////////////

Line (4) above (the MatLUFactorSymbolic() call, which the traceback below points to) gives the following error messages:

[0]PETSC ERROR: --------------------- Error Message ------------------------------------
[0]PETSC ERROR: Invalid argument!
[0]PETSC ERROR: Wrong type of object: Parameter # 2!
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: Petsc Release Version 3.0.0, Patch 4, Fri Mar  6 14:46:08 CST 2009
[0]PETSC ERROR: See docs/changes/index.html for recent updates.
[0]PETSC ERROR: See docs/faq.html for hints about trouble shooting.
[0]PETSC ERROR: See docs/index.html for manual pages.
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: ./ex4 on a linux-gnu named christian-desktop by christian Tue Jun 16 13:37:36 2009
[0]PETSC ERROR: Libraries linked from /home/christian/Desktop/petsc-3.0.0-p4/linux-gnu-c-debug/lib
[0]PETSC ERROR: Configure run at Mon Jun 15 17:05:31 2009
[0]PETSC ERROR: Configure options --with-cc="gcc -fPIC" --download-mpich=1 --download-f-blas-lapack --download-scalapack --download-blacs --download-mumps --download-parmetis --download-hypre --download-triangle --with-shared=0
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: MatLUFactorSymbolic() line 2311 in src/mat/interface/matrix.c
[0]PETSC ERROR: main() line 1488 in src/dm/ao/examples/tutorials/ex4.c
application called MPI_Abort(MPI_COMM_WORLD, 62) - process 0
(rank 2 printed the same trace and also aborted with code 62)
[0]0:Return code = 62
[0]1:Return code = 0, signaled with Interrupt
[0]2:Return code = 62
[0]3:Return code = 0, signaled with Interrupt

Is the matrix (i.e. parameter 2) in the wrong state because it is not a
MUMPS matrix?
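
For what it is worth, this is the sequence I would now expect to be correct
(a sketch, untested; ISCreateGeneral() and the length nlocal are my
assumptions about wrapping the plain integer lists rows/cols into the IS
objects that MatLUFactorSymbolic() expects):

	MatFactorInfo info;
	IS            isrow,iscol;
	ierr = MatFactorInfoInitialize(&info);CHKERRQ(ierr);
	ierr = MatGetFactor(A,MAT_SOLVER_MUMPS,MAT_FACTOR_LU,&F);CHKERRQ(ierr);
	ierr = ISCreateGeneral(PETSC_COMM_WORLD,nlocal,rows,&isrow);CHKERRQ(ierr);
	ierr = ISCreateGeneral(PETSC_COMM_WORLD,nlocal,cols,&iscol);CHKERRQ(ierr);
	ierr = MatLUFactorSymbolic(F,A,isrow,iscol,&info);CHKERRQ(ierr);
	ierr = MatLUFactorNumeric(F,A,&info);CHKERRQ(ierr);

	for(){	/* TEMPORAL LOOP */
		ierr = MatSolve(F,vecr,vecu);CHKERRQ(ierr);	/* solve with the factor F, not A */
	}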

Any help would be greatly appreciated,
Best regards,
Christian Klettner



> The problem is small enough that you might be able to use MUMPS.
>
>   Matt
>
> On Fri, Jun 12, 2009 at 9:31 AM, Lisandro Dalcin <dalcinl at gmail.com>
> wrote:
>
>> On Fri, Jun 12, 2009 at 11:13 AM, Christian
>> Klettner<christian.klettner at ucl.ac.uk> wrote:
>> > Sorry that I sent this twice. No subject in the first one.
>> >
>> > Dear PETSc Team,
>> > I am writing a CFD finite element code in C. From the discretization
>> > of the governing equations I have to solve a Poisson type equation
>> > which is really killing my performance. Which solver/preconditioner
>> > from PETSc or any external packages would you recommend? The size of
>> > my problem is from ~30000-100000 DOF per core. What kind of
>> > performance would I be able to expect with this solver/preconditioner?
>>
>> I would suggest KSPCG. As preconditioner I would use ML or
>> HYPRE/BoomerAMG (both are external packages).
>>
>> > I am using a 2*quad core 2.3 GHz Opteron. I have decomposed the domain
>> > with Parmetis. The mesh is unstructured.
>> > Also, I am writing a code which studies free surface phenomena, so the
>> > mesh is continually changing. Does this matter when choosing a
>> > solver/preconditioner? My left hand side matrix (A in Ax=b) does not
>> > change in time.
>>
>> ML has a faster setup than BoomerAMG, but the convergence is a bit
>> slower. If your A matrix does not change, then likely BoomerAMG will be
>> better for you. In any case, you can try both: just build PETSc with
>> both packages, then you can change the preconditioner by just passing
>> a command line option.
>>
>> >
>> > Best regards and thank you in advance,
>> > Christian Klettner
>> >
>>
>> Disclaimer: the convergence of multigrid preconditioners depends a lot
>> on your actual problem. What I've suggested is based only on my limited
>> experience with a few problems I've run solving electric potentials.
>>
>>
>> --
>> Lisandro Dalcín
>> ---------------
>> Centro Internacional de Métodos Computacionales en Ingeniería (CIMEC)
>> Instituto de Desarrollo Tecnológico para la Industria Química (INTEC)
>> Consejo Nacional de Investigaciones Científicas y Técnicas (CONICET)
>> PTLC - Güemes 3450, (3000) Santa Fe, Argentina
>> Tel/Fax: +54-(0)342-451.1594
>>
>
>
>
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which
> their experiments lead.
> -- Norbert Wiener
>
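
For reference, the command-line switching Lisandro describes would look
something like this with both packages built in (option names as given in
the PETSc 3.0 manual pages):

./mpiexec -n 4 ./ex4 -ksp_type cg -pc_type ml
./mpiexec -n 4 ./ex4 -ksp_type cg -pc_type hypre -pc_hypre_type boomeramg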



