[petsc-users] DMDA Error
Zhang, Junchao
jczhang at mcs.anl.gov
Tue Jan 21 10:20:55 CST 2020
I submitted a job and I am waiting for the result.
--Junchao Zhang
On Tue, Jan 21, 2020 at 3:03 AM Dave May <dave.mayhem23 at gmail.com> wrote:
Hi Anthony,
On Tue, 21 Jan 2020 at 08:25, Anthony Jourdon <jourdon_anthony at hotmail.fr> wrote:
Hello,
I made a test to try to reproduce the error.
To do so I modified the file $PETSC_DIR/src/dm/examples/tests/ex35.c
I attach the file in case of need.
The same error is reproduced with 1024 MPI ranks. I tested two problem sizes (2*512+1 x 2*64+1 x 2*256+1 and 2*1024+1 x 2*128+1 x 2*512+1) and the error occurred in both cases; the first case is also the one I used to run before the OS and MPI updates.
I also ran the code with -malloc_debug and nothing more appeared.
I attached the configure command I used to build a debug version of PETSc.
The error indicates the problem occurs within the MPI_Isend() call in the loop below:
  /* Post the Isends with the message length-info */
  for (i=0,j=0; i<size; ++i) {
    if (ilengths[i]) {
      ierr = MPI_Isend((void*)(ilengths+i),1,MPI_INT,i,tag,comm,s_waits+j);CHKERRQ(ierr);
      j++;
    }
  }
The type of ilengths[i] is PetscMPIInt, which is always typedef'd to int inside PETSc.
I don't see how any integer mismatch (int vs long int) could be occurring, so I'm puzzled as to what the problem is.
Weird...
Thanks
Dave
Thank you for your time,
Sincerely,
Anthony Jourdon
________________________________
From: Zhang, Junchao <jczhang at mcs.anl.gov>
Sent: Thursday, January 16, 2020 16:49
To: Anthony Jourdon <jourdon_anthony at hotmail.fr>
Cc: petsc-users at mcs.anl.gov
Objet : Re: [petsc-users] DMDA Error
It seems the problem is triggered by DMSetUp. You can write a small test creating the DMDA with the same size as your code, to see if you can reproduce the problem. If yes, it would be much easier for us to debug it.
--Junchao Zhang
On Thu, Jan 16, 2020 at 7:38 AM Anthony Jourdon <jourdon_anthony at hotmail.fr> wrote:
Dear PETSc developers,
I need assistance with an error.
I run a code that uses the DMDA-related functions, built against petsc-3.8.4.
This code used to run very well on a supercomputer running SLES11.
PETSc was built using an Intel MPI 5.1.3.223 module and Intel MKL version 2016.0.2.181.
The code was running with no problem on 1024 and more MPI ranks.
Recently, the OS of the computer was updated to RHEL7.
I rebuilt PETSc using the newly available versions of Intel MPI (2019U5) and MKL (2019.0.5.281), keeping the compiler and MKL versions consistent.
Since then I have run the exact same code on 8, 16, 24, 48, 512 and 1024 MPI ranks.
Up to 512 MPI ranks there was no problem, but at 1024 ranks an error related to DMDA appeared. I snip the first lines of the error stack here; the full error stack is attached.
[534]PETSC ERROR: #1 PetscGatherMessageLengths() line 120 in /scratch2/dlp/appli_local/SCR/OROGEN/petsc3.8.4_MPI/petsc-3.8.4/src/sys/utils/mpimesg.c
[534]PETSC ERROR: #2 VecScatterCreate_PtoS() line 2288 in /scratch2/dlp/appli_local/SCR/OROGEN/petsc3.8.4_MPI/petsc-3.8.4/src/vec/vec/utils/vpscat.c
[534]PETSC ERROR: #3 VecScatterCreate() line 1462 in /scratch2/dlp/appli_local/SCR/OROGEN/petsc3.8.4_MPI/petsc-3.8.4/src/vec/vec/utils/vscat.c
[534]PETSC ERROR: #4 DMSetUp_DA_3D() line 1042 in /scratch2/dlp/appli_local/SCR/OROGEN/petsc3.8.4_MPI/petsc-3.8.4/src/dm/impls/da/da3.c
[534]PETSC ERROR: #5 DMSetUp_DA() line 25 in /scratch2/dlp/appli_local/SCR/OROGEN/petsc3.8.4_MPI/petsc-3.8.4/src/dm/impls/da/dareg.c
[534]PETSC ERROR: #6 DMSetUp() line 720 in /scratch2/dlp/appli_local/SCR/OROGEN/petsc3.8.4_MPI/petsc-3.8.4/src/dm/interface/dm.c
Thank you for your time,
Sincerely,
Anthony Jourdon