[petsc-users] Different solution while running in parallel

Karthikeyan Chockalingam - STFC UKRI karthikeyan.chockalingam at stfc.ac.uk
Mon Nov 28 07:26:03 CST 2022


Thank you Mark and Matt for your response.

@Mark Adams <mfadams at lbl.gov> that sounds right, because when I used CG preconditioned with Jacobi it worked in parallel.
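
For reference, that CG/Jacobi combination corresponds to the runtime options below; this is only a sketch, assuming KSPSetFromOptions() is called so the solver can be chosen at run time, rather than the exact command used:

    -ksp_type cg -pc_type jacobi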

Best,
Karthik.


From: Mark Adams <mfadams at lbl.gov>
Date: Monday, 28 November 2022 at 13:18
To: Chockalingam, Karthikeyan (STFC,DL,HC) <karthikeyan.chockalingam at stfc.ac.uk>
Cc: Matthew Knepley <knepley at gmail.com>, Zhang, Hong <hzhang at mcs.anl.gov>, petsc-users at mcs.anl.gov <petsc-users at mcs.anl.gov>
Subject: Re: [petsc-users] Different solution while running in parallel
Maybe I am missing something, but I don't see MUMPS in the ksp_view, just LU.
The built-in LU is not parallel, so with > 1 MPI process you get one iteration of block Jacobi; these results look fine to me, at least up to this point.
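
A minimal sketch of how a parallel MUMPS factorization is usually requested and verified at run time, assuming PETSc was configured with MUMPS (e.g. --download-mumps) and the code calls KSPSetFromOptions():

    -ksp_type preonly -pc_type lu -pc_factor_mat_solver_type mumps -ksp_view

The -ksp_view output should then name mumps as the solver used for the factorization.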

Mark

On Thu, Nov 17, 2022 at 9:37 AM Karthikeyan Chockalingam - STFC UKRI via petsc-users <petsc-users at mcs.anl.gov> wrote:
Hi Matt and Hong,

Thank you for your response.
I made the following changes to get the desired output:


    PetscReal norm; /* norm of solution error */
    PetscInt  its;
    KSPConvergedReason reason;
    PetscViewerAndFormat *vf;

    PetscViewerAndFormatCreate(PETSC_VIEWER_STDOUT_WORLD, PETSC_VIEWER_DEFAULT, &vf);
    ierr = KSPView(ksp, PETSC_VIEWER_STDOUT_SELF);CHKERRQ(ierr);

    KSPSolve(ksp, b, x);

    ierr = KSPMonitorTrueResidual(ksp, its, norm, vf);CHKERRQ(ierr);
    ierr = KSPMonitorSingularValue(ksp, its, norm, vf);CHKERRQ(ierr);
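
As an aside, a simpler way to get the same diagnostics is to let PETSc install the viewer and monitors itself; this is only a sketch, assuming KSPSetFromOptions() is called before the solve:

    ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr);
    ierr = KSPSolve(ksp, b, x);CHKERRQ(ierr);

and then run with

    -ksp_view -ksp_monitor_true_residual -ksp_converged_reason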

I have attached the outputs from both runs. As before, I am also printing A, b, and x.

I wonder if it is a memory issue related to the MPI library employed. I am currently using OpenMPI – should I instead use MPICH?

Kind regards,
Karthik.

From: Matthew Knepley <knepley at gmail.com>
Date: Thursday, 17 November 2022 at 12:19
To: Zhang, Hong <hzhang at mcs.anl.gov>
Cc: petsc-users at mcs.anl.gov <petsc-users at mcs.anl.gov>, Chockalingam, Karthikeyan (STFC,DL,HC) <karthikeyan.chockalingam at stfc.ac.uk>
Subject: Re: [petsc-users] Different solution while running in parallel
On Wed, Nov 16, 2022 at 9:07 PM Zhang, Hong via petsc-users <petsc-users at mcs.anl.gov> wrote:
Karthik,
Can you find out the condition number of your matrix?
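
For reference, a minimal sketch of one way to get a rough condition number estimate from PETSc itself, assuming an iterative KSP such as GMRES or CG is used rather than preonly:

    PetscReal emax, emin;
    ierr = KSPSetComputeSingularValues(ksp, PETSC_TRUE);CHKERRQ(ierr);
    ierr = KSPSolve(ksp, b, x);CHKERRQ(ierr);
    ierr = KSPComputeExtremeSingularValues(ksp, &emax, &emin);CHKERRQ(ierr);
    /* emax/emin gives a rough estimate of the 2-norm condition number */

or, equivalently, run with -ksp_monitor_singular_value.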

Also, run using

  -ksp_view -ksp_monitor_true_residual -ksp_converged_reason

and send the two outputs.

  Thanks,

      Matt

Hong

________________________________
From: petsc-users <petsc-users-bounces at mcs.anl.gov> on behalf of Karthikeyan Chockalingam - STFC UKRI via petsc-users <petsc-users at mcs.anl.gov>
Sent: Wednesday, November 16, 2022 6:04 PM
To: petsc-users at mcs.anl.gov <petsc-users at mcs.anl.gov>
Subject: [petsc-users] Different solution while running in parallel


Hello,

I tried to solve a (FE-discretized) Poisson equation using PCLU. For some reason I am getting different solutions when running the problem on one core and on two cores. I have attached the output file (out.txt) from both runs. I am also printing A, b, and x from both runs – while A and b are the same, the solutions are different.

I am not sure what I am doing wrong.

Below is my matrix, vector, and solve setup.

    Mat A;
    Vec b, x;

    ierr = MatCreate(PETSC_COMM_WORLD, &A); CHKERRQ(ierr);
    ierr = MatSetType(A, MATMPIAIJ); CHKERRQ(ierr);
    ierr = MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, N, N); CHKERRQ(ierr);
    ierr = MatMPIAIJSetPreallocation(A, d_nz, NULL, o_nz, NULL); CHKERRQ(ierr);
    ierr = MatSetOption(A, MAT_SYMMETRIC, PETSC_TRUE); CHKERRQ(ierr);
    ierr = MatCreateVecs(A, &b, &x); CHKERRQ(ierr);

    KSP ksp;
    PC  pc;

    KSPCreate(PETSC_COMM_WORLD, &ksp);
    KSPSetOperators(ksp, A, A);
    ierr = KSPSetType(ksp, KSPPREONLY);CHKERRQ(ierr);
    ierr = KSPGetPC(ksp, &pc);CHKERRQ(ierr);
    ierr = PCSetType(pc, PCLU);CHKERRQ(ierr);
    ierr = PCFactorSetMatSolverType(pc, MATSOLVERMUMPS);CHKERRQ(ierr);
    KSPSolve(ksp, b, x);
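
Note that the assembly of A and b is not shown above; presumably the code does something along the lines of the sketch below before the solve. This is only an assumption about the omitted part (nloc, rows, ke, and fe are hypothetical element-level names):

    /* for each element: add its local stiffness matrix and load vector (hypothetical names) */
    ierr = MatSetValues(A, nloc, rows, nloc, rows, ke, ADD_VALUES); CHKERRQ(ierr);
    ierr = VecSetValues(b, nloc, rows, fe, ADD_VALUES); CHKERRQ(ierr);

    ierr = MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY); CHKERRQ(ierr);
    ierr = MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY); CHKERRQ(ierr);
    ierr = VecAssemblyBegin(b); CHKERRQ(ierr);
    ierr = VecAssemblyEnd(b); CHKERRQ(ierr);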



Thank you for your help.



Karthik.





--
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/