From knepley at gmail.com Sat Oct 1 18:46:47 2022 From: knepley at gmail.com (Matthew Knepley) Date: Sat, 1 Oct 2022 19:46:47 -0400 Subject: [petsc-users] DMLocalToLocal with DMPlex in Fortran In-Reply-To: References: Message-ID: On Fri, Sep 30, 2022 at 4:14 PM Mike Michell wrote: > Hi, > > As a follow-up to this email thread, > https://www.mail-archive.com/petsc-users at mcs.anl.gov/msg44070.html > > Are DMLocalToLocalBegin() and DMLocalToLocalEnd() really available for > DMPlex with Fortran on the latest version of PETSc (3.17.99 from GitLab)? > Matt commented that the Fortran bindings were updated so that those > functions must be available in the latest version of PETSc, however, it > seems still they are not working from my test with DMPlex in Fortran. Can > anyone provide some comments? Probably I am missing some mandatory header > file? Currently, I have headers; > > #include "petsc/finclude/petscvec.h" > #include "petsc/finclude/petscdmplex.h" > #include "petsc/finclude/petscdmlabel.h" > #include "petsc/finclude/petscdm.h" > The source for these functions is in src/dm/ftn-auto/dmf.c Is it there for you? If not, you can run make allfortranstubs Fortran functions are not declared, so the header should not matter for compilation, just the libraries for linking. Thanks, Matt > Thanks, > Mike > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From mi.mike1021 at gmail.com Sat Oct 1 19:51:31 2022 From: mi.mike1021 at gmail.com (Mike Michell) Date: Sat, 1 Oct 2022 19:51:31 -0500 Subject: [petsc-users] DMLocalToLocal with DMPlex in Fortran In-Reply-To: References: Message-ID: Thank you for the reply. There is that file in src/dm/interface/ftn-auto/ for me, instead of the path you mentioned. After "make allfortranstubs" was done and, PETSc reconfigured and reinstalled. However, I still have the same problem at the line in which DMLocalToLocalBegin() is used. What I am doing to setup halo exchange is as follows; - declare DMPlex - PetscSectionCreate() - PetscSectionSetNumFields() - PetscSectionSetFieldComponents() - PetscSectionSetChart() - do loop over dofs: PetscSectionSetDof() and PetscSectionSetFieldDof() - PetscSectionSetUp() - DMSetLocalSection() - PetscSectionDestroy() - DMGetSectionSF() - PetscSFSetUp() Then, the halo exchange is performed as follows; - DMGetLocalVector() - Fill the local vector - DMLocalToLocalBegin() --(*) - DMLocalToLocalEnd() - DMRestoreLocalVector() Then, the code crashes at (*). Previously(at the time PETSc did not support LocalToLocal for DMPlex in Fortran), the part above, "DMLocalToLocalBegin() and DMLocalToLocalEnd()", consisted of; - DMLocalToGlobalBegin() - DMLocalToGlobalEnd() - DMGlobalToLocalBegin() - DMGlobalToLocalEnd() and it worked okay. I am unclear which part is causing the problem. Shall I define the PetscSection and PetscSF in different ways in case of Local to Local Halo exchange? Any comment will be appreciated. Thanks, Mike > On Fri, Sep 30, 2022 at 4:14 PM Mike Michell > wrote: > >> Hi, >> >> As a follow-up to this email thread, >> https://www.mail-archive.com/petsc-users at mcs.anl.gov/msg44070.html >> >> Are DMLocalToLocalBegin() and DMLocalToLocalEnd() really available for >> DMPlex with Fortran on the latest version of PETSc (3.17.99 from GitLab)? 
>> Matt commented that the Fortran bindings were updated so that those >> functions must be available in the latest version of PETSc, however, it >> seems still they are not working from my test with DMPlex in Fortran. Can >> anyone provide some comments? Probably I am missing some mandatory header >> file? Currently, I have headers; >> >> #include "petsc/finclude/petscvec.h" >> #include "petsc/finclude/petscdmplex.h" >> #include "petsc/finclude/petscdmlabel.h" >> #include "petsc/finclude/petscdm.h" >> > > The source for these functions is in > > src/dm/ftn-auto/dmf.c > > Is it there for you? If not, you can run > > make allfortranstubs > > Fortran functions are not declared, so the header should not matter for > compilation, just the libraries for linking. > > Thanks, > > Matt > > >> Thanks, >> Mike >> > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > > https://www.cse.buffalo.edu/~knepley/ > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Sun Oct 2 06:25:55 2022 From: knepley at gmail.com (Matthew Knepley) Date: Sun, 2 Oct 2022 07:25:55 -0400 Subject: [petsc-users] DMLocalToLocal with DMPlex in Fortran In-Reply-To: References: Message-ID: On Sat, Oct 1, 2022 at 8:51 PM Mike Michell wrote: > Thank you for the reply. There is that file in src/dm/interface/ftn-auto/ > for me, instead of the path you mentioned. > > After "make allfortranstubs" was done and, PETSc reconfigured and > reinstalled. > > However, I still have the same problem at the line in which > DMLocalToLocalBegin() is used. What I am doing to setup halo exchange is as > follows; > - declare DMPlex > - PetscSectionCreate() > - PetscSectionSetNumFields() > - PetscSectionSetFieldComponents() > - PetscSectionSetChart() > - do loop over dofs: PetscSectionSetDof() and PetscSectionSetFieldDof() > - PetscSectionSetUp() > - DMSetLocalSection() > - PetscSectionDestroy() > - DMGetSectionSF() > - PetscSFSetUp() > > Then, the halo exchange is performed as follows; > - DMGetLocalVector() > - Fill the local vector > - DMLocalToLocalBegin() --(*) > - DMLocalToLocalEnd() > - DMRestoreLocalVector() > > Then, the code crashes at (*). > Can you send something I can run? Then I will find the problem and fix it. Thanks, Matt > Previously(at the time PETSc did not support LocalToLocal for DMPlex in > Fortran), the part above, "DMLocalToLocalBegin() and DMLocalToLocalEnd()", > consisted of; > - DMLocalToGlobalBegin() > - DMLocalToGlobalEnd() > - DMGlobalToLocalBegin() > - DMGlobalToLocalEnd() > and it worked okay. > > I am unclear which part is causing the problem. Shall I define the > PetscSection and PetscSF in different ways in case of Local to Local Halo > exchange? > Any comment will be appreciated. > > Thanks, > Mike > > > >> On Fri, Sep 30, 2022 at 4:14 PM Mike Michell >> wrote: >> >>> Hi, >>> >>> As a follow-up to this email thread, >>> https://www.mail-archive.com/petsc-users at mcs.anl.gov/msg44070.html >>> >>> Are DMLocalToLocalBegin() and DMLocalToLocalEnd() really available for >>> DMPlex with Fortran on the latest version of PETSc (3.17.99 from GitLab)? >>> Matt commented that the Fortran bindings were updated so that those >>> functions must be available in the latest version of PETSc, however, it >>> seems still they are not working from my test with DMPlex in Fortran. Can >>> anyone provide some comments? 
Probably I am missing some mandatory header >>> file? Currently, I have headers; >>> >>> #include "petsc/finclude/petscvec.h" >>> #include "petsc/finclude/petscdmplex.h" >>> #include "petsc/finclude/petscdmlabel.h" >>> #include "petsc/finclude/petscdm.h" >>> >> >> The source for these functions is in >> >> src/dm/ftn-auto/dmf.c >> >> Is it there for you? If not, you can run >> >> make allfortranstubs >> >> Fortran functions are not declared, so the header should not matter for >> compilation, just the libraries for linking. >> >> Thanks, >> >> Matt >> >> >>> Thanks, >>> Mike >>> >> >> >> -- >> What most experimenters take for granted before they begin their >> experiments is infinitely more interesting than any results to which their >> experiments lead. >> -- Norbert Wiener >> >> https://www.cse.buffalo.edu/~knepley/ >> >> > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From mi.mike1021 at gmail.com Sun Oct 2 10:56:13 2022 From: mi.mike1021 at gmail.com (Mike Michell) Date: Sun, 2 Oct 2022 10:56:13 -0500 Subject: [petsc-users] DMLocalToLocal with DMPlex in Fortran In-Reply-To: References: Message-ID: Thank you for the reply. Sure, a short example code is attached here with a square box mesh and a run script. Inside the source, you may find two versions of halo exchange; one is for local to global (Version-1) and another one is for local to local (Version-2), which is not working in my case. In the output.vtu, you will see the halo exchanged vector resolved to each vertex with (myrank + 1), so if the code is running with 2procs, at the parallel boundary, you will see 3. In this example, there is no ghost layer. Thanks, Mike > On Sat, Oct 1, 2022 at 8:51 PM Mike Michell wrote: > >> Thank you for the reply. There is that file in src/dm/interface/ftn-auto/ >> for me, instead of the path you mentioned. >> >> After "make allfortranstubs" was done and, PETSc reconfigured and >> reinstalled. >> >> However, I still have the same problem at the line in which >> DMLocalToLocalBegin() is used. What I am doing to setup halo exchange is as >> follows; >> - declare DMPlex >> - PetscSectionCreate() >> - PetscSectionSetNumFields() >> - PetscSectionSetFieldComponents() >> - PetscSectionSetChart() >> - do loop over dofs: PetscSectionSetDof() and PetscSectionSetFieldDof() >> - PetscSectionSetUp() >> - DMSetLocalSection() >> - PetscSectionDestroy() >> - DMGetSectionSF() >> - PetscSFSetUp() >> >> Then, the halo exchange is performed as follows; >> - DMGetLocalVector() >> - Fill the local vector >> - DMLocalToLocalBegin() --(*) >> - DMLocalToLocalEnd() >> - DMRestoreLocalVector() >> >> Then, the code crashes at (*). >> > > Can you send something I can run? Then I will find the problem and fix it. > > Thanks, > > Matt > > >> Previously(at the time PETSc did not support LocalToLocal for DMPlex in >> Fortran), the part above, "DMLocalToLocalBegin() and DMLocalToLocalEnd()", >> consisted of; >> - DMLocalToGlobalBegin() >> - DMLocalToGlobalEnd() >> - DMGlobalToLocalBegin() >> - DMGlobalToLocalEnd() >> and it worked okay. >> >> I am unclear which part is causing the problem. Shall I define the >> PetscSection and PetscSF in different ways in case of Local to Local Halo >> exchange? >> Any comment will be appreciated. 
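A minimal Fortran sketch of the local-to-local exchange sequence described above (this assumes a distributed DMPlex with its local PetscSection already attached, as in the listed steps; the subroutine and variable names are illustrative, not taken from the attached example):

#include "petsc/finclude/petscvec.h"
#include "petsc/finclude/petscdm.h"
      subroutine halo_exchange(dm, ierr)
      use petscvec
      use petscdm
      implicit none
      DM             :: dm      ! assumed: DMPlex with DMSetLocalSection() already done
      PetscErrorCode :: ierr
      Vec            :: lvec

      PetscCall(DMGetLocalVector(dm, lvec, ierr))
      ! ... fill the owned entries of lvec here ...
      PetscCall(DMLocalToLocalBegin(dm, lvec, INSERT_VALUES, lvec, ierr))
      PetscCall(DMLocalToLocalEnd(dm, lvec, INSERT_VALUES, lvec, ierr))
      PetscCall(DMRestoreLocalVector(dm, lvec, ierr))
      end subroutine halo_exchange

If DMLocalToLocalBegin()/DMLocalToLocalEnd() turn out not to be usable for DMPlex in a given build, the DMLocalToGlobal()/DMGlobalToLocal() pair mentioned above remains the equivalent fallback.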
>> >> Thanks, >> Mike >> >> >> >>> On Fri, Sep 30, 2022 at 4:14 PM Mike Michell >>> wrote: >>> >>>> Hi, >>>> >>>> As a follow-up to this email thread, >>>> https://www.mail-archive.com/petsc-users at mcs.anl.gov/msg44070.html >>>> >>>> Are DMLocalToLocalBegin() and DMLocalToLocalEnd() really available for >>>> DMPlex with Fortran on the latest version of PETSc (3.17.99 from GitLab)? >>>> Matt commented that the Fortran bindings were updated so that those >>>> functions must be available in the latest version of PETSc, however, it >>>> seems still they are not working from my test with DMPlex in Fortran. Can >>>> anyone provide some comments? Probably I am missing some mandatory header >>>> file? Currently, I have headers; >>>> >>>> #include "petsc/finclude/petscvec.h" >>>> #include "petsc/finclude/petscdmplex.h" >>>> #include "petsc/finclude/petscdmlabel.h" >>>> #include "petsc/finclude/petscdm.h" >>>> >>> >>> The source for these functions is in >>> >>> src/dm/ftn-auto/dmf.c >>> >>> Is it there for you? If not, you can run >>> >>> make allfortranstubs >>> >>> Fortran functions are not declared, so the header should not matter for >>> compilation, just the libraries for linking. >>> >>> Thanks, >>> >>> Matt >>> >>> >>>> Thanks, >>>> Mike >>>> >>> >>> >>> -- >>> What most experimenters take for granted before they begin their >>> experiments is infinitely more interesting than any results to which their >>> experiments lead. >>> -- Norbert Wiener >>> >>> https://www.cse.buffalo.edu/~knepley/ >>> >>> >> > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > > https://www.cse.buffalo.edu/~knepley/ > > -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: Test_Halo.tar Type: application/x-tar Size: 81920 bytes Desc: not available URL: From fujisan43 at gmail.com Mon Oct 3 04:08:48 2022 From: fujisan43 at gmail.com (fujisan) Date: Mon, 3 Oct 2022 11:08:48 +0200 Subject: [petsc-users] Differences between main and release branches? Message-ID: Hi everyone, What are the differences between the 'main' and 'release' branches? Where I git cloned version 3.17.4, I was by default in the 'main' branch. Where I git cloned version 3.18.0 (I haven't git pulled from 3.17.4 yet), I was by default in the 'release' branch. Fuji -------------- next part -------------- An HTML attachment was scrubbed... URL: From paololampitella at hotmail.com Mon Oct 3 04:43:19 2022 From: paololampitella at hotmail.com (Paolo Lampitella) Date: Mon, 3 Oct 2022 09:43:19 +0000 Subject: [petsc-users] How to use Intel OneApi mpi wrappers on Linux Message-ID: Dear PETSc users and developers, as per the title, I recently installed the base and HPC Intel OneApi toolkits on a machine running Ubuntu 20.04. As you probably know, OneApi comes with the classical compilers (icc, icpc, ifort) and relative mpi wrappers (mpiicc, mpiicpc, mpiifort) as well as with the new LLVM based compilers (icx, icpx, ifx). My experience so far with PETSc on Linux has been without troubles using both gcc compilers and either Mpich or OpenMPI and Intel classical compilers and MPI. However, I have now troubles using the MPI wrappers of the new LLVM compilers as, in fact, there aren?t dedicated mpi wrappers for them. 
Instead, they can be used with certain flags for the classical wrappers: mpiicc -cc=icx mpiicpc -cxx=icpx mpiifort -fc=ifx The problem I have is that I have no idea how to pass them correctly to the configure and whatever comes after that. Admittedly, I am just starting to use the new compilers, so I have no clue how I would use them in other projects as well. I started with an alias in my .bash_aliases (which works for simple compilation tests from command line) but doesn?t with configure. I also tried adding the flags to the COPTFLAGS, CXXOPTFLAGS and FOPTFLAGS but didn?t work as well. Do you have any experience with the new Intel compilers and, in case, could you share hot to properly use them with MPI? Thanks Paolo -------------- next part -------------- An HTML attachment was scrubbed... URL: From jroman at dsic.upv.es Mon Oct 3 04:48:43 2022 From: jroman at dsic.upv.es (Jose E. Roman) Date: Mon, 3 Oct 2022 11:48:43 +0200 Subject: [petsc-users] Differences between main and release branches? In-Reply-To: References: Message-ID: 'main' is the development version, 'release' is the latest release version. You can select the branch when cloning or later with git checkout. See https://petsc.org/release/install/download/#recommended-download Jose > El 3 oct 2022, a las 11:08, fujisan escribi?: > > Hi everyone, > What are the differences between the 'main' and 'release' branches? > > Where I git cloned version 3.17.4, I was by default in the 'main' branch. > Where I git cloned version 3.18.0 (I haven't git pulled from 3.17.4 yet), I was by default in the 'release' branch. > > Fuji From mfadams at lbl.gov Mon Oct 3 06:19:54 2022 From: mfadams at lbl.gov (Mark Adams) Date: Mon, 3 Oct 2022 07:19:54 -0400 Subject: [petsc-users] How to use Intel OneApi mpi wrappers on Linux In-Reply-To: References: Message-ID: Hi Paolo, You can use things like this in your configure file to set compilers and options. And you want to send us your configure.log file if it fails. Mark '--with-cc=gcc-11', '--with-cxx=g++-11', '--with-fc=gfortran-11', 'CFLAGS=-g', 'CXXFLAGS=-g', 'COPTFLAGS=-O0', 'CXXOPTFLAGS=-O0', On Mon, Oct 3, 2022 at 5:43 AM Paolo Lampitella wrote: > Dear PETSc users and developers, > > > > as per the title, I recently installed the base and HPC Intel OneApi > toolkits on a machine running Ubuntu 20.04. > > > > As you probably know, OneApi comes with the classical compilers (icc, > icpc, ifort) and relative mpi wrappers (mpiicc, mpiicpc, mpiifort) as well > as with the new LLVM based compilers (icx, icpx, ifx). > > > > My experience so far with PETSc on Linux has been without troubles using > both gcc compilers and either Mpich or OpenMPI and Intel classical > compilers and MPI. > > > > However, I have now troubles using the MPI wrappers of the new LLVM > compilers as, in fact, there aren?t dedicated mpi wrappers for them. > Instead, they can be used with certain flags for the classical wrappers: > > > > mpiicc -cc=icx > > mpiicpc -cxx=icpx > > mpiifort -fc=ifx > > > > The problem I have is that I have no idea how to pass them correctly to > the configure and whatever comes after that. > > > > Admittedly, I am just starting to use the new compilers, so I have no clue > how I would use them in other projects as well. > > > > I started with an alias in my .bash_aliases (which works for simple > compilation tests from command line) but doesn?t with configure. > > > > I also tried adding the flags to the COPTFLAGS, CXXOPTFLAGS and FOPTFLAGS > but didn?t work as well. 
> > > > Do you have any experience with the new Intel compilers and, in case, > could you share hot to properly use them with MPI? > > > > Thanks > > > > Paolo > -------------- next part -------------- An HTML attachment was scrubbed... URL: From bsmith at petsc.dev Mon Oct 3 08:18:56 2022 From: bsmith at petsc.dev (Barry Smith) Date: Mon, 3 Oct 2022 09:18:56 -0400 Subject: [petsc-users] How to use Intel OneApi mpi wrappers on Linux In-Reply-To: References: Message-ID: bsmith at petsc-01:~$ mpicc This script invokes an appropriate specialized C MPI compiler driver. The following ways (priority order) can be used for changing default compiler name (gcc): 1. Command line option: -cc= 2. Environment variable: I_MPI_CC (current value '') 3. Environment variable: MPICH_CC (current value '') So export I_MPI_CC=icx export I_MPI_CXX=icpx export I_MPI_FC=ifx should do the trick. > On Oct 3, 2022, at 5:43 AM, Paolo Lampitella wrote: > > Dear PETSc users and developers, > > as per the title, I recently installed the base and HPC Intel OneApi toolkits on a machine running Ubuntu 20.04. > > As you probably know, OneApi comes with the classical compilers (icc, icpc, ifort) and relative mpi wrappers (mpiicc, mpiicpc, mpiifort) as well as with the new LLVM based compilers (icx, icpx, ifx). > > My experience so far with PETSc on Linux has been without troubles using both gcc compilers and either Mpich or OpenMPI and Intel classical compilers and MPI. > > However, I have now troubles using the MPI wrappers of the new LLVM compilers as, in fact, there aren?t dedicated mpi wrappers for them. Instead, they can be used with certain flags for the classical wrappers: > > mpiicc -cc=icx > mpiicpc -cxx=icpx > mpiifort -fc=ifx > > The problem I have is that I have no idea how to pass them correctly to the configure and whatever comes after that. > > Admittedly, I am just starting to use the new compilers, so I have no clue how I would use them in other projects as well. > > I started with an alias in my .bash_aliases (which works for simple compilation tests from command line) but doesn?t with configure. > > I also tried adding the flags to the COPTFLAGS, CXXOPTFLAGS and FOPTFLAGS but didn?t work as well. > > Do you have any experience with the new Intel compilers and, in case, could you share hot to properly use them with MPI? > > Thanks > > Paolo -------------- next part -------------- An HTML attachment was scrubbed... URL: From paololampitella at hotmail.com Mon Oct 3 08:20:50 2022 From: paololampitella at hotmail.com (Paolo Lampitella) Date: Mon, 3 Oct 2022 13:20:50 +0000 Subject: [petsc-users] R: How to use Intel OneApi mpi wrappers on Linux In-Reply-To: References: Message-ID: Hi Mark, thank you very much, problem solved! I was indeed making confusion between OPTFLAGS and FLAGS. Now, I know that this is probably not the place for this but, as I still owe you a configure.log, what happened next is that I added hypre to the previous configuration (now working) and I had problems again in configure (log file attached). If I remove ?--download-hypre? from the configure command, as I said, everything works as expected. This also worked with the intel classical compilers (that is, if I remove again the CFLAGS, CXXFLAGS and FFLAGS options that fixed my configure without hypre). My catch here is that HYPRE seems to interpret the C/CXX compilers as GNU (instead of intel), and later fails in linking C with Fortran. 
I don?t actually need Hypre for now, but if you have any clue on where to look next, that would be helpful Thanks again Paolo Da: Mark Adams Inviato: luned? 3 ottobre 2022 13:20 A: Paolo Lampitella Cc: petsc-users at mcs.anl.gov Oggetto: Re: [petsc-users] How to use Intel OneApi mpi wrappers on Linux Hi Paolo, You can use things like this in your configure file to set compilers and options. And you want to send us your configure.log file if it fails. Mark '--with-cc=gcc-11', '--with-cxx=g++-11', '--with-fc=gfortran-11', 'CFLAGS=-g', 'CXXFLAGS=-g', 'COPTFLAGS=-O0', 'CXXOPTFLAGS=-O0', On Mon, Oct 3, 2022 at 5:43 AM Paolo Lampitella > wrote: Dear PETSc users and developers, as per the title, I recently installed the base and HPC Intel OneApi toolkits on a machine running Ubuntu 20.04. As you probably know, OneApi comes with the classical compilers (icc, icpc, ifort) and relative mpi wrappers (mpiicc, mpiicpc, mpiifort) as well as with the new LLVM based compilers (icx, icpx, ifx). My experience so far with PETSc on Linux has been without troubles using both gcc compilers and either Mpich or OpenMPI and Intel classical compilers and MPI. However, I have now troubles using the MPI wrappers of the new LLVM compilers as, in fact, there aren?t dedicated mpi wrappers for them. Instead, they can be used with certain flags for the classical wrappers: mpiicc -cc=icx mpiicpc -cxx=icpx mpiifort -fc=ifx The problem I have is that I have no idea how to pass them correctly to the configure and whatever comes after that. Admittedly, I am just starting to use the new compilers, so I have no clue how I would use them in other projects as well. I started with an alias in my .bash_aliases (which works for simple compilation tests from command line) but doesn?t with configure. I also tried adding the flags to the COPTFLAGS, CXXOPTFLAGS and FOPTFLAGS but didn?t work as well. Do you have any experience with the new Intel compilers and, in case, could you share hot to properly use them with MPI? Thanks Paolo -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: configure.log Type: application/octet-stream Size: 1104959 bytes Desc: configure.log URL: From paololampitella at hotmail.com Mon Oct 3 08:58:28 2022 From: paololampitella at hotmail.com (Paolo Lampitella) Date: Mon, 3 Oct 2022 13:58:28 +0000 Subject: [petsc-users] R: How to use Intel OneApi mpi wrappers on Linux In-Reply-To: References: Message-ID: Hi Barry, thanks for the suggestion. I tried this but doesn?t seem to work as expected. That is, configure actually works, but it is because it is not seeing the LLVM based compilers, only the intel classical ones. Yet the variables seem correctly exported. Paolo Da: Barry Smith Inviato: luned? 3 ottobre 2022 15:19 A: Paolo Lampitella Cc: petsc-users at mcs.anl.gov Oggetto: Re: [petsc-users] How to use Intel OneApi mpi wrappers on Linux bsmith at petsc-01:~$ mpicc This script invokes an appropriate specialized C MPI compiler driver. The following ways (priority order) can be used for changing default compiler name (gcc): 1. Command line option: -cc= 2. Environment variable: I_MPI_CC (current value '') 3. Environment variable: MPICH_CC (current value '') So export I_MPI_CC=icx export I_MPI_CXX=icpx export I_MPI_FC=ifx should do the trick. 
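A sketch of how that environment-variable route is typically combined with a PETSc build (an assumed workflow; the option values below are illustrative, not the exact commands used in this thread):

# point the Intel MPI wrappers at the LLVM-based compilers
export I_MPI_CC=icx
export I_MPI_CXX=icpx
export I_MPI_FC=ifx

# check which underlying compiler each wrapper will now invoke
mpiicc -show
mpiifort -show

# then configure PETSc against the wrappers as usual
./configure --with-cc=mpiicc --with-cxx=mpiicpc --with-fc=mpiifort \
            --with-debugging=0 COPTFLAGS="-O2" CXXOPTFLAGS="-O2" FOPTFLAGS="-O2"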
On Oct 3, 2022, at 5:43 AM, Paolo Lampitella > wrote: Dear PETSc users and developers, as per the title, I recently installed the base and HPC Intel OneApi toolkits on a machine running Ubuntu 20.04. As you probably know, OneApi comes with the classical compilers (icc, icpc, ifort) and relative mpi wrappers (mpiicc, mpiicpc, mpiifort) as well as with the new LLVM based compilers (icx, icpx, ifx). My experience so far with PETSc on Linux has been without troubles using both gcc compilers and either Mpich or OpenMPI and Intel classical compilers and MPI. However, I have now troubles using the MPI wrappers of the new LLVM compilers as, in fact, there aren?t dedicated mpi wrappers for them. Instead, they can be used with certain flags for the classical wrappers: mpiicc -cc=icx mpiicpc -cxx=icpx mpiifort -fc=ifx The problem I have is that I have no idea how to pass them correctly to the configure and whatever comes after that. Admittedly, I am just starting to use the new compilers, so I have no clue how I would use them in other projects as well. I started with an alias in my .bash_aliases (which works for simple compilation tests from command line) but doesn?t with configure. I also tried adding the flags to the COPTFLAGS, CXXOPTFLAGS and FOPTFLAGS but didn?t work as well. Do you have any experience with the new Intel compilers and, in case, could you share hot to properly use them with MPI? Thanks Paolo -------------- next part -------------- An HTML attachment was scrubbed... URL: From bsmith at petsc.dev Mon Oct 3 10:01:23 2022 From: bsmith at petsc.dev (Barry Smith) Date: Mon, 3 Oct 2022 11:01:23 -0400 Subject: [petsc-users] How to use Intel OneApi mpi wrappers on Linux In-Reply-To: References: Message-ID: <2EF375A7-D5DC-49CD-B8C8-2A16AA2B4EB6@petsc.dev> That is indeed disappointing. mpicc and mpiicc are simple scripts that select the compiler based on multiple criteria include the environmental variables so it is curious that this functionality does not work. Barry > On Oct 3, 2022, at 9:58 AM, Paolo Lampitella wrote: > > Hi Barry, > > thanks for the suggestion. I tried this but doesn?t seem to work as expected. That is, configure actually works, but it is because it is not seeing the LLVM based compilers, only the intel classical ones. Yet the variables seem correctly exported. > > Paolo > > > Da: Barry Smith > Inviato: luned? 3 ottobre 2022 15:19 > A: Paolo Lampitella > Cc: petsc-users at mcs.anl.gov > Oggetto: Re: [petsc-users] How to use Intel OneApi mpi wrappers on Linux > > > bsmith at petsc-01:~$ mpicc > This script invokes an appropriate specialized C MPI compiler driver. > The following ways (priority order) can be used for changing default > compiler name (gcc): > 1. Command line option: -cc= > 2. Environment variable: I_MPI_CC (current value '') > 3. Environment variable: MPICH_CC (current value '') > > > So > export I_MPI_CC=icx > export I_MPI_CXX=icpx > export I_MPI_FC=ifx > > should do the trick. > > > > On Oct 3, 2022, at 5:43 AM, Paolo Lampitella > wrote: > > Dear PETSc users and developers, > > as per the title, I recently installed the base and HPC Intel OneApi toolkits on a machine running Ubuntu 20.04. > > As you probably know, OneApi comes with the classical compilers (icc, icpc, ifort) and relative mpi wrappers (mpiicc, mpiicpc, mpiifort) as well as with the new LLVM based compilers (icx, icpx, ifx). > > My experience so far with PETSc on Linux has been without troubles using both gcc compilers and either Mpich or OpenMPI and Intel classical compilers and MPI. 
> > However, I have now troubles using the MPI wrappers of the new LLVM compilers as, in fact, there aren?t dedicated mpi wrappers for them. Instead, they can be used with certain flags for the classical wrappers: > > mpiicc -cc=icx > mpiicpc -cxx=icpx > mpiifort -fc=ifx > > The problem I have is that I have no idea how to pass them correctly to the configure and whatever comes after that. > > Admittedly, I am just starting to use the new compilers, so I have no clue how I would use them in other projects as well. > > I started with an alias in my .bash_aliases (which works for simple compilation tests from command line) but doesn?t with configure. > > I also tried adding the flags to the COPTFLAGS, CXXOPTFLAGS and FOPTFLAGS but didn?t work as well. > > Do you have any experience with the new Intel compilers and, in case, could you share hot to properly use them with MPI? > > Thanks > > Paolo -------------- next part -------------- An HTML attachment was scrubbed... URL: From balay at mcs.anl.gov Mon Oct 3 10:25:56 2022 From: balay at mcs.anl.gov (Satish Balay) Date: Mon, 3 Oct 2022 10:25:56 -0500 (CDT) Subject: [petsc-users] How to use Intel OneApi mpi wrappers on Linux In-Reply-To: <2EF375A7-D5DC-49CD-B8C8-2A16AA2B4EB6@petsc.dev> References: <2EF375A7-D5DC-49CD-B8C8-2A16AA2B4EB6@petsc.dev> Message-ID: This is strange. It works for me [with is simple test] [balay at pj01 ~]$ mpiicc -show icc -I"/opt/intel/oneapi/mpi/2021.7.0/include" -L"/opt/intel/oneapi/mpi/2021.7.0/lib/release" -L"/opt/intel/oneapi/mpi/2021.7.0/lib" -Xlinker --enable-new-dtags -Xlinker -rpath -Xlinker "/opt/intel/oneapi/mpi/2021.7.0/lib/release" -Xlinker -rpath -Xlinker "/opt/intel/oneapi/mpi/2021.7.0/lib" -lmpifort -lmpi -ldl -lrt -lpthread [balay at pj01 ~]$ export I_MPI_CC=icx [balay at pj01 ~]$ mpiicc -show icx -I"/opt/intel/oneapi/mpi/2021.7.0/include" -L"/opt/intel/oneapi/mpi/2021.7.0/lib/release" -L"/opt/intel/oneapi/mpi/2021.7.0/lib" -Xlinker --enable-new-dtags -Xlinker -rpath -Xlinker "/opt/intel/oneapi/mpi/2021.7.0/lib/release" -Xlinker -rpath -Xlinker "/opt/intel/oneapi/mpi/2021.7.0/lib" -lmpifort -lmpi -ldl -lrt -lpthread Satish On Mon, 3 Oct 2022, Barry Smith wrote: > > That is indeed disappointing. mpicc and mpiicc are simple scripts that select the compiler based on multiple criteria include the environmental variables so it is curious that this functionality does not work. > > Barry > > > > On Oct 3, 2022, at 9:58 AM, Paolo Lampitella wrote: > > > > Hi Barry, > > > > thanks for the suggestion. I tried this but doesn?t seem to work as expected. That is, configure actually works, but it is because it is not seeing the LLVM based compilers, only the intel classical ones. Yet the variables seem correctly exported. > > > > Paolo > > > > > > Da: Barry Smith > > Inviato: luned? 3 ottobre 2022 15:19 > > A: Paolo Lampitella > > Cc: petsc-users at mcs.anl.gov > > Oggetto: Re: [petsc-users] How to use Intel OneApi mpi wrappers on Linux > > > > > > bsmith at petsc-01:~$ mpicc > > This script invokes an appropriate specialized C MPI compiler driver. > > The following ways (priority order) can be used for changing default > > compiler name (gcc): > > 1. Command line option: -cc= > > 2. Environment variable: I_MPI_CC (current value '') > > 3. Environment variable: MPICH_CC (current value '') > > > > > > So > > export I_MPI_CC=icx > > export I_MPI_CXX=icpx > > export I_MPI_FC=ifx > > > > should do the trick. 
> > > > > > > > On Oct 3, 2022, at 5:43 AM, Paolo Lampitella > wrote: > > > > Dear PETSc users and developers, > > > > as per the title, I recently installed the base and HPC Intel OneApi toolkits on a machine running Ubuntu 20.04. > > > > As you probably know, OneApi comes with the classical compilers (icc, icpc, ifort) and relative mpi wrappers (mpiicc, mpiicpc, mpiifort) as well as with the new LLVM based compilers (icx, icpx, ifx). > > > > My experience so far with PETSc on Linux has been without troubles using both gcc compilers and either Mpich or OpenMPI and Intel classical compilers and MPI. > > > > However, I have now troubles using the MPI wrappers of the new LLVM compilers as, in fact, there aren?t dedicated mpi wrappers for them. Instead, they can be used with certain flags for the classical wrappers: > > > > mpiicc -cc=icx > > mpiicpc -cxx=icpx > > mpiifort -fc=ifx > > > > The problem I have is that I have no idea how to pass them correctly to the configure and whatever comes after that. > > > > Admittedly, I am just starting to use the new compilers, so I have no clue how I would use them in other projects as well. > > > > I started with an alias in my .bash_aliases (which works for simple compilation tests from command line) but doesn?t with configure. > > > > I also tried adding the flags to the COPTFLAGS, CXXOPTFLAGS and FOPTFLAGS but didn?t work as well. > > > > Do you have any experience with the new Intel compilers and, in case, could you share hot to properly use them with MPI? > > > > Thanks > > > > Paolo > > From balay at mcs.anl.gov Mon Oct 3 10:29:31 2022 From: balay at mcs.anl.gov (Satish Balay) Date: Mon, 3 Oct 2022 10:29:31 -0500 (CDT) Subject: [petsc-users] Differences between main and release branches? In-Reply-To: References: Message-ID: <5e15d5c2-ab9a-b58c-9f89-852ab0c4371e@mcs.anl.gov> What were the git commands used here [for each of these cases]? Normally you would checkout a branch - and pull on it. So "cloning 3.17.4" doesn't really make sense - as there is no "3.17.4" branch. you ether checkout a tag - v3.17.4 - then you don't get a branch to pull on. [sure you can do "git clone -b release" - but that's a branch]. So the statement "3.17.4 gave main branch, 3.18.0 gave release branch" doesn't really make sense to me [from the way git work] Satish On Mon, 3 Oct 2022, Jose E. Roman wrote: > 'main' is the development version, 'release' is the latest release version. > You can select the branch when cloning or later with git checkout. > See https://petsc.org/release/install/download/#recommended-download > > Jose > > > > El 3 oct 2022, a las 11:08, fujisan escribi?: > > > > Hi everyone, > > What are the differences between the 'main' and 'release' branches? > > > > Where I git cloned version 3.17.4, I was by default in the 'main' branch. > > Where I git cloned version 3.18.0 (I haven't git pulled from 3.17.4 yet), I was by default in the 'release' branch. > > > > Fuji > From mfadams at lbl.gov Mon Oct 3 12:09:55 2022 From: mfadams at lbl.gov (Mark Adams) Date: Mon, 3 Oct 2022 13:09:55 -0400 Subject: [petsc-users] How to use Intel OneApi mpi wrappers on Linux In-Reply-To: References: Message-ID: You are getting a "-loopopt=0" added to your link line. No idea what that is or where it comes from. I don't see it in our repo. Does this come from your environment somehow? https://dl.acm.org/doi/abs/10.1145/3493229.3493301 On Mon, Oct 3, 2022 at 9:20 AM Paolo Lampitella wrote: > Hi Mark, > > > > thank you very much, problem solved! 
> > > > I was indeed making confusion between OPTFLAGS and FLAGS. > > > > Now, I know that this is probably not the place for this but, as I still > owe you a configure.log, what happened next is that I added hypre to the > previous configuration (now working) and I had problems again in configure > (log file attached). If I remove ?--download-hypre? from the configure > command, as I said, everything works as expected. This also worked with the > intel classical compilers (that is, if I remove again the CFLAGS, CXXFLAGS > and FFLAGS options that fixed my configure without hypre). > > > > My catch here is that HYPRE seems to interpret the C/CXX compilers as GNU > (instead of intel), and later fails in linking C with Fortran. > > > > I don?t actually need Hypre for now, but if you have any clue on where to > look next, that would be helpful > > > > Thanks again > > > > Paolo > > > > *Da: *Mark Adams > *Inviato: *luned? 3 ottobre 2022 13:20 > *A: *Paolo Lampitella > *Cc: *petsc-users at mcs.anl.gov > *Oggetto: *Re: [petsc-users] How to use Intel OneApi mpi wrappers on Linux > > > > Hi Paolo, > > > > You can use things like this in your configure file to set compilers and > options. > > > > And you want to send us your configure.log file if it fails. > > > > Mark > > > > '--with-cc=gcc-11', > '--with-cxx=g++-11', > '--with-fc=gfortran-11', > 'CFLAGS=-g', > 'CXXFLAGS=-g', > 'COPTFLAGS=-O0', > 'CXXOPTFLAGS=-O0', > > > > > > On Mon, Oct 3, 2022 at 5:43 AM Paolo Lampitella < > paololampitella at hotmail.com> wrote: > > Dear PETSc users and developers, > > > > as per the title, I recently installed the base and HPC Intel OneApi > toolkits on a machine running Ubuntu 20.04. > > > > As you probably know, OneApi comes with the classical compilers (icc, > icpc, ifort) and relative mpi wrappers (mpiicc, mpiicpc, mpiifort) as well > as with the new LLVM based compilers (icx, icpx, ifx). > > > > My experience so far with PETSc on Linux has been without troubles using > both gcc compilers and either Mpich or OpenMPI and Intel classical > compilers and MPI. > > > > However, I have now troubles using the MPI wrappers of the new LLVM > compilers as, in fact, there aren?t dedicated mpi wrappers for them. > Instead, they can be used with certain flags for the classical wrappers: > > > > mpiicc -cc=icx > > mpiicpc -cxx=icpx > > mpiifort -fc=ifx > > > > The problem I have is that I have no idea how to pass them correctly to > the configure and whatever comes after that. > > > > Admittedly, I am just starting to use the new compilers, so I have no clue > how I would use them in other projects as well. > > > > I started with an alias in my .bash_aliases (which works for simple > compilation tests from command line) but doesn?t with configure. > > > > I also tried adding the flags to the COPTFLAGS, CXXOPTFLAGS and FOPTFLAGS > but didn?t work as well. > > > > Do you have any experience with the new Intel compilers and, in case, > could you share hot to properly use them with MPI? > > > > Thanks > > > > Paolo > > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From paololampitella at hotmail.com Mon Oct 3 12:29:26 2022 From: paololampitella at hotmail.com (Paolo Lampitella) Date: Mon, 3 Oct 2022 17:29:26 +0000 Subject: [petsc-users] R: How to use Intel OneApi mpi wrappers on Linux In-Reply-To: References: Message-ID: Not that I know of, today is the first time I?ve read of it. 
It actually happened few hours ago while googling for this issue, and the results with most things in common with my case were 3 now closed Issues on the spack repository (never heard of it). Seems something related to Autoconf up to 2.69 (2.7 has a patch). I actually verified that I have the last offending Autoconf version (2.69), but I didn?t really understand anything else of what I read, so I couldn?t make any further progress I guess that this kind of confirms that this is my current problem with the new OneApi compilers and hypre on my ubuntu 20.04 machine Thanks Paolo Da: Mark Adams Inviato: luned? 3 ottobre 2022 19:10 A: Paolo Lampitella Cc: petsc-users at mcs.anl.gov Oggetto: Re: [petsc-users] How to use Intel OneApi mpi wrappers on Linux You are getting a "-loopopt=0" added to your link line. No idea what that is or where it comes from. I don't see it in our repo. Does this come from your environment somehow? https://dl.acm.org/doi/abs/10.1145/3493229.3493301 On Mon, Oct 3, 2022 at 9:20 AM Paolo Lampitella > wrote: Hi Mark, thank you very much, problem solved! I was indeed making confusion between OPTFLAGS and FLAGS. Now, I know that this is probably not the place for this but, as I still owe you a configure.log, what happened next is that I added hypre to the previous configuration (now working) and I had problems again in configure (log file attached). If I remove ?--download-hypre? from the configure command, as I said, everything works as expected. This also worked with the intel classical compilers (that is, if I remove again the CFLAGS, CXXFLAGS and FFLAGS options that fixed my configure without hypre). My catch here is that HYPRE seems to interpret the C/CXX compilers as GNU (instead of intel), and later fails in linking C with Fortran. I don?t actually need Hypre for now, but if you have any clue on where to look next, that would be helpful Thanks again Paolo Da: Mark Adams Inviato: luned? 3 ottobre 2022 13:20 A: Paolo Lampitella Cc: petsc-users at mcs.anl.gov Oggetto: Re: [petsc-users] How to use Intel OneApi mpi wrappers on Linux Hi Paolo, You can use things like this in your configure file to set compilers and options. And you want to send us your configure.log file if it fails. Mark '--with-cc=gcc-11', '--with-cxx=g++-11', '--with-fc=gfortran-11', 'CFLAGS=-g', 'CXXFLAGS=-g', 'COPTFLAGS=-O0', 'CXXOPTFLAGS=-O0', On Mon, Oct 3, 2022 at 5:43 AM Paolo Lampitella > wrote: Dear PETSc users and developers, as per the title, I recently installed the base and HPC Intel OneApi toolkits on a machine running Ubuntu 20.04. As you probably know, OneApi comes with the classical compilers (icc, icpc, ifort) and relative mpi wrappers (mpiicc, mpiicpc, mpiifort) as well as with the new LLVM based compilers (icx, icpx, ifx). My experience so far with PETSc on Linux has been without troubles using both gcc compilers and either Mpich or OpenMPI and Intel classical compilers and MPI. However, I have now troubles using the MPI wrappers of the new LLVM compilers as, in fact, there aren?t dedicated mpi wrappers for them. Instead, they can be used with certain flags for the classical wrappers: mpiicc -cc=icx mpiicpc -cxx=icpx mpiifort -fc=ifx The problem I have is that I have no idea how to pass them correctly to the configure and whatever comes after that. Admittedly, I am just starting to use the new compilers, so I have no clue how I would use them in other projects as well. 
I started with an alias in my .bash_aliases (which works for simple compilation tests from command line) but doesn?t with configure. I also tried adding the flags to the COPTFLAGS, CXXOPTFLAGS and FOPTFLAGS but didn?t work as well. Do you have any experience with the new Intel compilers and, in case, could you share hot to properly use them with MPI? Thanks Paolo -------------- next part -------------- An HTML attachment was scrubbed... URL: From fujisan43 at gmail.com Mon Oct 3 12:34:00 2022 From: fujisan43 at gmail.com (fujisan) Date: Mon, 3 Oct 2022 19:34:00 +0200 Subject: [petsc-users] Differences between main and release branches? In-Reply-To: <5e15d5c2-ab9a-b58c-9f89-852ab0c4371e@mcs.anl.gov> References: <5e15d5c2-ab9a-b58c-9f89-852ab0c4371e@mcs.anl.gov> Message-ID: I probably did: git clone -b release https://gitlab.com/petsc/petsc.git petsc like the documentation says. But I found out that I was in branch main. Cloning 3.17.4 is an abuse of language. I ment cloning petsc when that release was 3.17.4. Anyway I git pulled this morning and checked out branch release. Fuji On Mon, Oct 3, 2022 at 5:29 PM Satish Balay wrote: > What were the git commands used here [for each of these cases]? > > Normally you would checkout a branch - and pull on it. So "cloning 3.17.4" > doesn't really make sense - as there is no "3.17.4" branch. > > you ether checkout a tag - v3.17.4 - then you don't get a branch to pull > on. [sure you can do "git clone -b release" - but that's a branch]. > > So the statement "3.17.4 gave main branch, 3.18.0 gave release branch" > doesn't really make sense to me [from the way git work] > > Satish > > On Mon, 3 Oct 2022, Jose E. Roman wrote: > > > 'main' is the development version, 'release' is the latest release > version. > > You can select the branch when cloning or later with git checkout. > > See https://petsc.org/release/install/download/#recommended-download > > > > Jose > > > > > > > El 3 oct 2022, a las 11:08, fujisan escribi?: > > > > > > Hi everyone, > > > What are the differences between the 'main' and 'release' branches? > > > > > > Where I git cloned version 3.17.4, I was by default in the 'main' > branch. > > > Where I git cloned version 3.18.0 (I haven't git pulled from 3.17.4 > yet), I was by default in the 'release' branch. > > > > > > Fuji > > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From balay at mcs.anl.gov Mon Oct 3 12:46:35 2022 From: balay at mcs.anl.gov (Satish Balay) Date: Mon, 3 Oct 2022 12:46:35 -0500 (CDT) Subject: [petsc-users] Differences between main and release branches? In-Reply-To: References: <5e15d5c2-ab9a-b58c-9f89-852ab0c4371e@mcs.anl.gov> Message-ID: The only way you would have got 'main' branch is if '-b release' option was missing [or there was some error in specifying it. Either way - 'git checkout release' would set the repo to the desired 'release' branch. Satish On Mon, 3 Oct 2022, fujisan wrote: > I probably did: > > git clone -b release https://gitlab.com/petsc/petsc.git petsc > > like the documentation says. But I found out that I was in branch main. > Cloning 3.17.4 is an abuse of language. I ment cloning petsc when that > release was 3.17.4. > > Anyway I git pulled this morning and checked out branch release. > > Fuji > > > > On Mon, Oct 3, 2022 at 5:29 PM Satish Balay wrote: > > > What were the git commands used here [for each of these cases]? > > > > Normally you would checkout a branch - and pull on it. 
So "cloning 3.17.4" > > doesn't really make sense - as there is no "3.17.4" branch. > > > > you ether checkout a tag - v3.17.4 - then you don't get a branch to pull > > on. [sure you can do "git clone -b release" - but that's a branch]. > > > > So the statement "3.17.4 gave main branch, 3.18.0 gave release branch" > > doesn't really make sense to me [from the way git work] > > > > Satish > > > > On Mon, 3 Oct 2022, Jose E. Roman wrote: > > > > > 'main' is the development version, 'release' is the latest release > > version. > > > You can select the branch when cloning or later with git checkout. > > > See https://petsc.org/release/install/download/#recommended-download > > > > > > Jose > > > > > > > > > > El 3 oct 2022, a las 11:08, fujisan escribi?: > > > > > > > > Hi everyone, > > > > What are the differences between the 'main' and 'release' branches? > > > > > > > > Where I git cloned version 3.17.4, I was by default in the 'main' > > branch. > > > > Where I git cloned version 3.18.0 (I haven't git pulled from 3.17.4 > > yet), I was by default in the 'release' branch. > > > > > > > > Fuji > > > > > > From fujisan43 at gmail.com Tue Oct 4 02:19:25 2022 From: fujisan43 at gmail.com (fujisan) Date: Tue, 4 Oct 2022 09:19:25 +0200 Subject: [petsc-users] Problem reading and HDF5 binary file Message-ID: Hi everyone, I have written a matrix in an HDF5 binary file without problem using PetscViewerHDF5Open function like this: ! Write if (ishdf5) then PetscCall(PetscViewerHDF5Open(PETSC_COMM_WORLD,trim(filename ),FILE_MODE_WRITE,view,ierr)) else PetscCall(PetscViewerBinaryOpen(PETSC_COMM_WORLD,trim(filename ),FILE_MODE_WRITE,view,ierr)) endif PetscCall(MatView(A,view,ierr)) PetscCall(PetscViewerDestroy(view,ierr)) But when I want to read that HDF5 file like this: ! Read if (ishdf5) then PetscCall(PetscViewerHDF5Open(PETSC_COMM_WORLD,trim(filename ),FILE_MODE_READ,view,ierr)) else PetscCall(PetscViewerBinaryOpen(PETSC_COMM_WORLD,trim(filename ),FILE_MODE_READ,view,ierr)) endif PetscCall(MatCreate(PETSC_COMM_WORLD,A,ierr)) PetscCall(MatSetType(A,MATMPIAIJ,ierr)) PetscCall(MatLoad(A,view,ierr)) PetscCall(PetscViewerDestroy(view,ierr)) I get this kind of error message below. I don't have any problem writing / reading using PetscViewerBinaryOpen, and no problem writing / reading a vector using PetscViewerHDF5Open either. What am I missing ? Fuji [0]PETSC ERROR: --------------------- Error Message -------------------------------------------------------------- [0]PETSC ERROR: Unexpected data in file [0]PETSC ERROR: Attribute /Mat_0xc4000016_0/MATLAB_sparse does not exist and default value not provided [0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting. 
[0]PETSC ERROR: Petsc Release Version 3.18.0, unknown [0]PETSC ERROR: ./bin/solve on a x86_64 named master by beauduin Tue Oct 4 08:55:55 2022 [0]PETSC ERROR: Configure options --with-petsc-arch=x86_64 --COPTFLAGS="-g -O3" --FOPTFLAGS="-g -O3" --CXXOPTFLAGS="-g -O3" --with-debugging=0 --with-cc=mpiicc --with-cxx=mpiicpc --with-fc=mpiifort --with-single-library=1 --with-mpiexec=mpiexec --with-precision=double --with-fortran-interfaces=1 --with-make=1 --with-mpi=1 --with-mpi-compilers=1 --download-fblaslapack=0 --download-hypre=1 --download-cmake=0 --with-cmake=1 --download-metis=1 --download-parmetis=1 --download-ptscotch=0 --download-suitesparse=1 --download-triangle=1 --download-superlu=1 --download-superlu_dist=1 --download-scalapack=1 --download-mumps=1 --download-elemental=1 --download-spai=0 --download-parms=1 --download-moab=1 --download-chaco=0 --download-fftw=1 --with-petsc4py=1 --download-mpi4py=1 --download-saws --download-concurrencykit=1 --download-revolve=1 --download-cams=1 --download-p4est=0 --with-zlib=1 --with-hdf5=1 --download-hdf5=1 --download-mfem=1 --download-glvis=0 --with-opengl=0 --download-libpng=1 --download-libjpeg=1 --download-slepc=1 --download-hpddm=1 --download-bamg=1 --download-mmg=0 --download-parmmg=0 --download-htool=1 --download-egads=0 --download-opencascade=0 PETSC_ARCH=x86_64 [0]PETSC ERROR: #1 PetscViewerHDF5ReadAttribute() at /data/softs/petsc/src/sys/classes/viewer/impls/hdf5/hdf5v.c:1245 [0]PETSC ERROR: #2 MatLoad_AIJ_HDF5() at /data/softs/petsc/src/mat/impls/aij/seq/aijhdf5.c:63 [0]PETSC ERROR: #3 MatLoad_MPIAIJ() at /data/softs/petsc/src/mat/impls/aij/mpi/mpiaij.c:3034 [0]PETSC ERROR: #4 MatLoad() at /data/softs/petsc/src/mat/interface/matrix.c:1304 [0]PETSC ERROR: #5 bigmat.F90:62 [0]PETSC ERROR: #6 MatCreateVecs() at /data/softs/petsc/src/mat/interface/matrix.c:9336 [0]PETSC ERROR: #7 solve.F90:143 Abort(73) on node 0 (rank 0 in comm 16): application called MPI_Abort(MPI_COMM_SELF, 73) - process 0 forrtl: severe (174): SIGSEGV, segmentation fault occurred Image PC Routine Line Source solve 000000000043258A Unknown Unknown Unknown libpthread-2.28.s 00007FCAF0B78C20 Unknown Unknown Unknown libmpi.so.12.0.0 00007FCAF17A09D3 Unknown Unknown Unknown .... -------------- next part -------------- An HTML attachment was scrubbed... URL: From fujisan43 at gmail.com Tue Oct 4 02:42:03 2022 From: fujisan43 at gmail.com (fujisan) Date: Tue, 4 Oct 2022 09:42:03 +0200 Subject: [petsc-users] Problem reading and HDF5 binary file In-Reply-To: References: Message-ID: It turns out there is nothing in the hdf5 file: $ h5dump data/matrix3.mat.h5 HDF5 "data/matrix3.mat.h5" { GROUP "/" { } } On Tue, Oct 4, 2022 at 9:19 AM fujisan wrote: > Hi everyone, > > I have written a matrix in an HDF5 binary file without problem using > PetscViewerHDF5Open function like this: > > ! Write > if (ishdf5) then > PetscCall(PetscViewerHDF5Open(PETSC_COMM_WORLD,trim(filename > ),FILE_MODE_WRITE,view,ierr)) > else > PetscCall(PetscViewerBinaryOpen(PETSC_COMM_WORLD,trim(filename > ),FILE_MODE_WRITE,view,ierr)) > endif > PetscCall(MatView(A,view,ierr)) > PetscCall(PetscViewerDestroy(view,ierr)) > > But when I want to read that HDF5 file like this: > > ! 
Read > if (ishdf5) then > PetscCall(PetscViewerHDF5Open(PETSC_COMM_WORLD,trim(filename > ),FILE_MODE_READ,view,ierr)) > else > PetscCall(PetscViewerBinaryOpen(PETSC_COMM_WORLD,trim(filename > ),FILE_MODE_READ,view,ierr)) > endif > PetscCall(MatCreate(PETSC_COMM_WORLD,A,ierr)) > PetscCall(MatSetType(A,MATMPIAIJ,ierr)) > PetscCall(MatLoad(A,view,ierr)) > PetscCall(PetscViewerDestroy(view,ierr)) > > I get this kind of error message below. > I don't have any problem writing / reading using PetscViewerBinaryOpen, > and no problem writing / reading a vector using PetscViewerHDF5Open either. > > What am I missing ? > > Fuji > > [0]PETSC ERROR: --------------------- Error Message > -------------------------------------------------------------- > [0]PETSC ERROR: Unexpected data in file > [0]PETSC ERROR: Attribute /Mat_0xc4000016_0/MATLAB_sparse does not exist > and default value not provided > [0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting. > [0]PETSC ERROR: Petsc Release Version 3.18.0, unknown > [0]PETSC ERROR: ./bin/solve on a x86_64 named master by beauduin Tue Oct > 4 08:55:55 2022 > [0]PETSC ERROR: Configure options --with-petsc-arch=x86_64 --COPTFLAGS="-g > -O3" --FOPTFLAGS="-g -O3" --CXXOPTFLAGS="-g -O3" --with-debugging=0 > --with-cc=mpiicc --with-cxx=mpiicpc --with-fc=mpiifort > --with-single-library=1 --with-mpiexec=mpiexec --with-precision=double > --with-fortran-interfaces=1 --with-make=1 --with-mpi=1 > --with-mpi-compilers=1 --download-fblaslapack=0 --download-hypre=1 > --download-cmake=0 --with-cmake=1 --download-metis=1 --download-parmetis=1 > --download-ptscotch=0 --download-suitesparse=1 --download-triangle=1 > --download-superlu=1 --download-superlu_dist=1 --download-scalapack=1 > --download-mumps=1 --download-elemental=1 --download-spai=0 > --download-parms=1 --download-moab=1 --download-chaco=0 --download-fftw=1 > --with-petsc4py=1 --download-mpi4py=1 --download-saws > --download-concurrencykit=1 --download-revolve=1 --download-cams=1 > --download-p4est=0 --with-zlib=1 --with-hdf5=1 --download-hdf5=1 > --download-mfem=1 --download-glvis=0 --with-opengl=0 --download-libpng=1 > --download-libjpeg=1 --download-slepc=1 --download-hpddm=1 > --download-bamg=1 --download-mmg=0 --download-parmmg=0 --download-htool=1 > --download-egads=0 --download-opencascade=0 PETSC_ARCH=x86_64 > [0]PETSC ERROR: #1 PetscViewerHDF5ReadAttribute() at > /data/softs/petsc/src/sys/classes/viewer/impls/hdf5/hdf5v.c:1245 > [0]PETSC ERROR: #2 MatLoad_AIJ_HDF5() at > /data/softs/petsc/src/mat/impls/aij/seq/aijhdf5.c:63 > [0]PETSC ERROR: #3 MatLoad_MPIAIJ() at > /data/softs/petsc/src/mat/impls/aij/mpi/mpiaij.c:3034 > [0]PETSC ERROR: #4 MatLoad() at > /data/softs/petsc/src/mat/interface/matrix.c:1304 > [0]PETSC ERROR: #5 bigmat.F90:62 > [0]PETSC ERROR: #6 MatCreateVecs() at > /data/softs/petsc/src/mat/interface/matrix.c:9336 > [0]PETSC ERROR: #7 solve.F90:143 > Abort(73) on node 0 (rank 0 in comm 16): application called > MPI_Abort(MPI_COMM_SELF, 73) - process 0 > forrtl: severe (174): SIGSEGV, segmentation fault occurred > Image PC Routine Line > Source > solve 000000000043258A Unknown Unknown Unknown > libpthread-2.28.s 00007FCAF0B78C20 Unknown Unknown Unknown > libmpi.so.12.0.0 00007FCAF17A09D3 Unknown Unknown Unknown > .... > -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From fujisan43 at gmail.com Tue Oct 4 03:24:22 2022 From: fujisan43 at gmail.com (fujisan) Date: Tue, 4 Oct 2022 10:24:22 +0200 Subject: [petsc-users] Problem reading and HDF5 binary file In-Reply-To: References: Message-ID: I see from https://petsc.org/main/docs/manualpages/Mat/MatLoad/ that MatView for HDF5 binary format is not yet implemented. Any idea when this will be implemented? F. Current HDF5 (MAT-File) limitations This reader currently supports only real MATSEQAIJ, MATMPIAIJ, MATSEQDENSE and MATMPIDENSE matrices. Corresponding MatView() is not yet implemented. The loaded matrix is actually a transpose of the original one in MATLAB, unless you push PETSC_VIEWER_HDF5_MAT format (see examples above). With this format, matrix is automatically transposed by PETSc, unless the matrix is marked as SPD or symmetric (see MatSetOption(), MAT_SPD, MAT_SYMMETRIC). On Tue, Oct 4, 2022 at 9:42 AM fujisan wrote: > It turns out there is nothing in the hdf5 file: > > $ h5dump data/matrix3.mat.h5 > HDF5 "data/matrix3.mat.h5" { > GROUP "/" { > } > } > > > On Tue, Oct 4, 2022 at 9:19 AM fujisan wrote: > >> Hi everyone, >> >> I have written a matrix in an HDF5 binary file without problem using >> PetscViewerHDF5Open function like this: >> >> ! Write >> if (ishdf5) then >> PetscCall(PetscViewerHDF5Open(PETSC_COMM_WORLD,trim(filename >> ),FILE_MODE_WRITE,view,ierr)) >> else >> PetscCall(PetscViewerBinaryOpen(PETSC_COMM_WORLD,trim(filename >> ),FILE_MODE_WRITE,view,ierr)) >> endif >> PetscCall(MatView(A,view,ierr)) >> PetscCall(PetscViewerDestroy(view,ierr)) >> >> But when I want to read that HDF5 file like this: >> >> ! Read >> if (ishdf5) then >> PetscCall(PetscViewerHDF5Open(PETSC_COMM_WORLD,trim(filename >> ),FILE_MODE_READ,view,ierr)) >> else >> PetscCall(PetscViewerBinaryOpen(PETSC_COMM_WORLD,trim(filename >> ),FILE_MODE_READ,view,ierr)) >> endif >> PetscCall(MatCreate(PETSC_COMM_WORLD,A,ierr)) >> PetscCall(MatSetType(A,MATMPIAIJ,ierr)) >> PetscCall(MatLoad(A,view,ierr)) >> PetscCall(PetscViewerDestroy(view,ierr)) >> >> I get this kind of error message below. >> I don't have any problem writing / reading using PetscViewerBinaryOpen, >> and no problem writing / reading a vector using PetscViewerHDF5Open either. >> >> What am I missing ? >> >> Fuji >> >> [0]PETSC ERROR: --------------------- Error Message >> -------------------------------------------------------------- >> [0]PETSC ERROR: Unexpected data in file >> [0]PETSC ERROR: Attribute /Mat_0xc4000016_0/MATLAB_sparse does not exist >> and default value not provided >> [0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting. 
>> [0]PETSC ERROR: Petsc Release Version 3.18.0, unknown >> [0]PETSC ERROR: ./bin/solve on a x86_64 named master by beauduin Tue Oct >> 4 08:55:55 2022 >> [0]PETSC ERROR: Configure options --with-petsc-arch=x86_64 >> --COPTFLAGS="-g -O3" --FOPTFLAGS="-g -O3" --CXXOPTFLAGS="-g -O3" >> --with-debugging=0 --with-cc=mpiicc --with-cxx=mpiicpc --with-fc=mpiifort >> --with-single-library=1 --with-mpiexec=mpiexec --with-precision=double >> --with-fortran-interfaces=1 --with-make=1 --with-mpi=1 >> --with-mpi-compilers=1 --download-fblaslapack=0 --download-hypre=1 >> --download-cmake=0 --with-cmake=1 --download-metis=1 --download-parmetis=1 >> --download-ptscotch=0 --download-suitesparse=1 --download-triangle=1 >> --download-superlu=1 --download-superlu_dist=1 --download-scalapack=1 >> --download-mumps=1 --download-elemental=1 --download-spai=0 >> --download-parms=1 --download-moab=1 --download-chaco=0 --download-fftw=1 >> --with-petsc4py=1 --download-mpi4py=1 --download-saws >> --download-concurrencykit=1 --download-revolve=1 --download-cams=1 >> --download-p4est=0 --with-zlib=1 --with-hdf5=1 --download-hdf5=1 >> --download-mfem=1 --download-glvis=0 --with-opengl=0 --download-libpng=1 >> --download-libjpeg=1 --download-slepc=1 --download-hpddm=1 >> --download-bamg=1 --download-mmg=0 --download-parmmg=0 --download-htool=1 >> --download-egads=0 --download-opencascade=0 PETSC_ARCH=x86_64 >> [0]PETSC ERROR: #1 PetscViewerHDF5ReadAttribute() at >> /data/softs/petsc/src/sys/classes/viewer/impls/hdf5/hdf5v.c:1245 >> [0]PETSC ERROR: #2 MatLoad_AIJ_HDF5() at >> /data/softs/petsc/src/mat/impls/aij/seq/aijhdf5.c:63 >> [0]PETSC ERROR: #3 MatLoad_MPIAIJ() at >> /data/softs/petsc/src/mat/impls/aij/mpi/mpiaij.c:3034 >> [0]PETSC ERROR: #4 MatLoad() at >> /data/softs/petsc/src/mat/interface/matrix.c:1304 >> [0]PETSC ERROR: #5 bigmat.F90:62 >> [0]PETSC ERROR: #6 MatCreateVecs() at >> /data/softs/petsc/src/mat/interface/matrix.c:9336 >> [0]PETSC ERROR: #7 solve.F90:143 >> Abort(73) on node 0 (rank 0 in comm 16): application called >> MPI_Abort(MPI_COMM_SELF, 73) - process 0 >> forrtl: severe (174): SIGSEGV, segmentation fault occurred >> Image PC Routine Line >> Source >> solve 000000000043258A Unknown Unknown >> Unknown >> libpthread-2.28.s 00007FCAF0B78C20 Unknown Unknown >> Unknown >> libmpi.so.12.0.0 00007FCAF17A09D3 Unknown Unknown >> Unknown >> .... >> > -------------- next part -------------- An HTML attachment was scrubbed... URL: From yangzongze at gmail.com Tue Oct 4 09:18:52 2022 From: yangzongze at gmail.com (Zongze Yang) Date: Tue, 4 Oct 2022 22:18:52 +0800 Subject: [petsc-users] Is the results of `DMAdaptLabel` as expected in `src/dm/impls/plex/tests/ex20.c` Message-ID: Hi everyone, I am learning how to use the `DMAdaptLabel` for `DMPlex`, and found the example `src/dm/impls/plex/tests/ex20.c` which label one cell to refine. 1. This example is just a uniform refinement when using the following command. (see attached pdfs for the results). ``` [real-int32-gcc] z2yang at ws5:~/opt/firedrake/real-int32-gcc/petsc/src/dm/impls/plex/tests$ ./ex20 -dm_plex_box_faces 3,3 -dm_coord_space 0 -pre_adapt_dm_view ascii::ascii_info -post_adapt_dm_view draw:tikz:figure2.tex ``` Is this expected for this example? 2. I found there is a function named `DMAdaptLabel_Plex`, and `DMAdaptLabel` did not call that function when the type of the dm is `DMPlex`. Is the function `DMAdaptLabel_Plex` still in use? 3. `DMAdaptLabel` seems to lack some useful information when I use the wrong adaptor. 
For example, if I set `-dm_adaptor mmg`, then the process will give a segment fault because the `metric` passed to `DMAdaptMetric_Mmg_Plex` is NULL, see the output below: ``` [real-int32-gcc] z2yang at ws5:~/opt/firedrake/real-int32-gcc/petsc/src/dm/impls/plex/tests$ ./ex20 -dm_plex_box_faces 3,3 -dm_coord_space 0 -pre_adapt_dm_view draw:tikz:figure1.tex -post_adapt_dm_view draw:tikz:figure2.tex -dm_adaptor mmg [0]PETSC ERROR: ------------------------------------------------------------------------ [0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range [0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger [0]PETSC ERROR: or see https://petsc.org/release/faq/#valgrind and https://petsc.org/release/faq/ [0]PETSC ERROR: configure using --with-debugging=yes, recompile, link, and run [0]PETSC ERROR: to get more information on the crash. [0]PETSC ERROR: Run with -malloc_debug to check if memory corruption is causing the crash. Abort(59) on node 0 (rank 0 in comm 0): application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0 ``` Thanks, Zongze Yang -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: pre_adapt.pdf Type: application/pdf Size: 2974 bytes Desc: not available URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: post_adapt.pdf Type: application/pdf Size: 4023 bytes Desc: not available URL: From knepley at gmail.com Tue Oct 4 11:33:40 2022 From: knepley at gmail.com (Matthew Knepley) Date: Tue, 4 Oct 2022 17:33:40 +0100 Subject: [petsc-users] Is the results of `DMAdaptLabel` as expected in `src/dm/impls/plex/tests/ex20.c` In-Reply-To: References: Message-ID: On Tue, Oct 4, 2022 at 3:19 PM Zongze Yang wrote: > Hi everyone, > > I am learning how to use the `DMAdaptLabel` for `DMPlex`, and found the > example `src/dm/impls/plex/tests/ex20.c` which label one cell to refine. > > 1. This example is just a uniform refinement when using the following > command. (see attached pdfs for the results). > ``` > [real-int32-gcc] z2yang at ws5:~/opt/firedrake/real-int32-gcc/petsc/src/dm/impls/plex/tests$ > ./ex20 -dm_plex_box_faces 3,3 -dm_coord_space 0 -pre_adapt_dm_view > ascii::ascii_info -post_adapt_dm_view draw:tikz:figure2.tex > ``` > Is this expected for this example? > Hi Zongze, Yes, I agree this is not easy to see. If you give -dm_plex_transform_view, you can see the kind of transform being used knepley/pylith $:/PETSc3/petsc/petsc-pylith$ PETSC_ARCH=arch-pylith-opt make -j8 -f ./gmakefile test search="dm_impls_plex_tests-ex20_ 2d" TIMEOUT=5000 EXTRA_OPTIONS="-dm_plex_transform_view" Using MAKEFLAGS: --jobserver-fds=3,4 -j -- EXTRA_OPTIONS=-dm_plex_transform_view TIMEOUT=5000 search=dm_impls_plex_tests-ex20_2d TEST arch-pylith-opt/tests/counts/dm_impls_plex_tests-ex20_2d.counts ok dm_impls_plex_tests-ex20_2d not ok diff-dm_impls_plex_tests-ex20_2d # Error code: 1 # 11a12,14 # > DMPlexTransform Object: 1 MPI process # > type: refine_regular # > Regular refinement DMPlexTransform_0x84000000_1 You can see that it is regular refinement, so it ignores the input and refines everything. 
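The same thing can be driven from code; a minimal C sketch of the label-driven call (the PetscOptionsSetValue line and the marked cell are illustrative — ex20.c takes both from the command line):

```c
/* Assumes "dm" is an existing DMPlex; cell 0 and the transform type are illustrative choices. */
DMLabel adaptLabel;
DM      dmAdapt;

PetscCall(PetscOptionsSetValue(NULL, "-dm_plex_transform_type", "refine_sbr"));
PetscCall(DMLabelCreate(PETSC_COMM_SELF, "adapt", &adaptLabel));
PetscCall(DMLabelSetValue(adaptLabel, 0, DM_ADAPT_REFINE)); /* mark one cell for refinement */
PetscCall(DMAdaptLabel(dm, adaptLabel, &dmAdapt));          /* build the adapted mesh */
PetscCall(DMLabelDestroy(&adaptLabel));
```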
If you change it, you can get adaptive refinement, knepley/pylith $:/PETSc3/petsc/petsc-pylith$ PETSC_ARCH=arch-pylith-opt make -j8 -f ./gmakefile test search="dm_impls_plex_tests-ex20_ 2d" TIMEOUT=5000 EXTRA_OPTIONS="-pre_adapt_dm_view draw -post_adapt_dm_view draw -draw_pause -1 -dm_plex_transform_type refine_sbr" I attached the plot. > 2. I found there is a function named `DMAdaptLabel_Plex`, and > `DMAdaptLabel` did not call that function when the type of the dm is > `DMPlex`. Is the function `DMAdaptLabel_Plex` still in use? > No. I rewrote all the transformations last year. I think the new form is much smaller, cleaner, and more performant. I should delete this function, but I am finishing up the review of all adaptive refinement with Joe Wallwork at Imperial. > 3. `DMAdaptLabel` seems to lack some useful information when I use the > wrong adaptor. For example, if I set `-dm_adaptor mmg`, then the process > will give a segment fault because the `metric` passed to > `DMAdaptMetric_Mmg_Plex` is NULL, see the output below: > ``` > [real-int32-gcc] z2yang at ws5:~/opt/firedrake/real-int32-gcc/petsc/src/dm/impls/plex/tests$ > ./ex20 -dm_plex_box_faces 3,3 -dm_coord_space 0 -pre_adapt_dm_view > draw:tikz:figure1.tex -post_adapt_dm_view draw:tikz:figure2.tex -dm_adaptor > mmg > [0]PETSC ERROR: > ------------------------------------------------------------------------ > [0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, > probably memory access out of range > [0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger > [0]PETSC ERROR: or see https://petsc.org/release/faq/#valgrind and > https://petsc.org/release/faq/ > [0]PETSC ERROR: configure using --with-debugging=yes, recompile, link, and > run > [0]PETSC ERROR: to get more information on the crash. > [0]PETSC ERROR: Run with -malloc_debug to check if memory corruption is > causing the crash. > Abort(59) on node 0 (rank 0 in comm 0): application called > MPI_Abort(MPI_COMM_WORLD, 59) - process 0 > ``` > Hmm, I at least get an error message: # [0]PETSC ERROR: --------------------- Error Message -------------------------------------------------------------- # [0]PETSC ERROR: Null argument, when expecting valid pointer # [0]PETSC ERROR: Null Pointer: Parameter # 1 # [0]PETSC ERROR: WARNING! There are option(s) set that were not used! Could be the program crashed before they were used or a spelling mistake, etc! # [0]PETSC ERROR: Option left: name:-post_adapt_dm_view value: ascii::ascii_info # [0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting. 
# [0]PETSC ERROR: Petsc Development GIT revision: v3.17.4-1472-ga50e2f9f007 GIT Date: 2022-09-23 13:01:31 +0000 # [0]PETSC ERROR: ../ex20 on a arch-master-debug named MacBook-Pro.local by knepley Tue Oct 4 17:29:20 2022 # [0]PETSC ERROR: Configure options --PETSC_ARCH=arch-master-debug --download-bamg --download-bison --download-chaco --download-ctetgen --download-egads --download-eigen --download-exodusii --download-fftw --download-hpddm --download-ks --download-libceed --download-libpng --download-metis --download-ml --download-mmg --download-mumps --download-netcdf --download-opencascade --download-p4est --download-parmetis --download-parmmg --download-pnetcdf --download-pragmatic --download-ptscotch --download-scalapack --download-slepc --download-suitesparse --download-superlu_dist --download-tetgen --download-triangle --with-cmake-exec=/PETSc3/petsc/apple/bin/cmake --with-ctest-exec=/PETSc3/petsc/apple/bin/ctest --with-hdf5-dir=/PETSc3/petsc/apple --with-mpi-dir=/PETSc3/petsc/apple --with-petsc4py=1 --with-shared-libraries --with-slepc --with-zlib --download-muparser # [0]PETSC ERROR: #1 VecViewFromOptions() at /PETSc3/petsc/petsc-dev/src/vec/vec/interface/vector.c:627 # [0]PETSC ERROR: #2 DMAdaptMetric_Mmg_Plex() at /PETSc3/petsc/petsc-dev/src/dm/impls/plex/adaptors/mmg/mmgadapt.c:130 # [0]PETSC ERROR: #3 DMAdaptLabel() at /PETSc3/petsc/petsc-dev/src/dm/interface/dmgenerate.c:179 # [0]PETSC ERROR: #4 main() at /PETSc3/petsc/petsc-dev/src/dm/impls/plex/tests/ex20.c:24 # [0]PETSC ERROR: PETSc Option Table entries: # [0]PETSC ERROR: -dm_adaptor mmg # [0]PETSC ERROR: -dm_coord_space 0 # [0]PETSC ERROR: -dm_plex_box_faces 3,3 # [0]PETSC ERROR: -post_adapt_dm_view ascii::ascii_info # [0]PETSC ERROR: -pre_adapt_dm_view ascii::ascii_info # [0]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-maint at mcs.anl.gov---------- # application called MPI_Abort(MPI_COMM_SELF, 85) - process 0 ok dm_impls_plex_tests-ex20_2d # SKIP Command failed so no diff I agree that it would be nice to segregate the adaptors into those that work with labels and those that work with metrics, but I thought we could have an automated system to convert between metrics and labels. However, I have not implemented it yet, since I am still trying to figure out exactly how everything should work. Thanks, Matt > Thanks, > Zongze Yang > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: Screen Shot 2022-10-04 at 5.24.02 PM.png Type: image/png Size: 89863 bytes Desc: not available URL: From snailsoar at hotmail.com Tue Oct 4 12:03:32 2022 From: snailsoar at hotmail.com (feng wang) Date: Tue, 4 Oct 2022 17:03:32 +0000 Subject: [petsc-users] clarification on extreme eigenvalues from KSPComputeEigenvalues Message-ID: Dear All, I am using the KSPComputeEigenvalues to understand the performance of my preconditioner, and I am using the right-preconditioned GMRES with ASM. In the user guide, it says this routine computes the extreme eigenvalues of the preconditioned operator. If I understand it correctly, these eigenvalues are the ones furthest away from (1,0)? If the preconditioning is perfect, all the eigenvalues should be (1,0). 
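For context, the calling pattern is essentially the following; a minimal sketch, with the names and the array size chosen only for illustration:

```c
/* Assumes "ksp", "b", "x" already exist; the array length 100 is illustrative. */
PetscReal r[100], c[100]; /* real and imaginary parts of the computed eigenvalues */
PetscInt  neig;

PetscCall(KSPSetComputeEigenvalues(ksp, PETSC_TRUE)); /* must be set before KSPSolve */
PetscCall(KSPSolve(ksp, b, x));
PetscCall(KSPComputeEigenvalues(ksp, 100, r, c, &neig)); /* neig is at most the iteration count */
for (PetscInt i = 0; i < neig; i++) PetscCall(PetscPrintf(PETSC_COMM_WORLD, "%g %g\n", (double)r[i], (double)c[i]));
```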
Thanks, Feng -------------- next part -------------- An HTML attachment was scrubbed... URL: From mfadams at lbl.gov Tue Oct 4 12:18:23 2022 From: mfadams at lbl.gov (Mark Adams) Date: Tue, 4 Oct 2022 13:18:23 -0400 Subject: [petsc-users] clarification on extreme eigenvalues from KSPComputeEigenvalues In-Reply-To: References: Message-ID: The extreme eigenvalues are the lowest and highest. A perfect preconditioner would give all eigenvalues = 1.0 Mark On Tue, Oct 4, 2022 at 1:03 PM feng wang wrote: > Dear All, > > I am using the KSPComputeEigenvalues to understand the performance of my > preconditioner, and I am using the right-preconditioned GMRES with ASM. In > the user guide, it says this routine computes the extreme eigenvalues of > the preconditioned operator. If I understand it correctly, these > eigenvalues are the ones furthest away from (1,0)? If the preconditioning > is perfect, all the eigenvalues should be (1,0). > > Thanks, > Feng > -------------- next part -------------- An HTML attachment was scrubbed... URL: From snailsoar at hotmail.com Tue Oct 4 16:20:14 2022 From: snailsoar at hotmail.com (feng wang) Date: Tue, 4 Oct 2022 21:20:14 +0000 Subject: [petsc-users] clarification on extreme eigenvalues from KSPComputeEigenvalues In-Reply-To: References: Message-ID: Hi Mark, Thanks for your reply. Below is the output if I call KSPComputeEigenvalues 0.330475 -0.0485014 0.521211 0.417409 0.684726 -0.377126 0.885941 0.354342 0.957845 -0.0508471 0.964676 -0.241642 1.05921 0.0742963 1.82065 -0.0209096 I have the following questions: * These eigenvalues are sorted according to the magnitudes. so "lowest" means smallest magnitude and "highest" means largest magnitude in your previous email? * I understand that if the preconditioner is perfect, all the eigenvalues should be (1,0). Since my preconditioner is not perfect, to understand its performance, is it correct to say that, I need to keep an eye on the eigenvalues whose distance to (1,0) are the furthest? * How does petsc decides how many eigenvalues to output in KSPComputeEigenvalues. I am solving a set of linear systems, sometimes KSPComputeEigenvalues outputs 8 eigenvalues, sometimes it outputs just 2 eigenvalues. * In the output which I showed above, are these the ones with the smallest magnitude and also the ones with the largest magnitudes? and what's between are all ignored? If this is the case, which ones are the "lowest" and which ones are the "highest"? Thanks for your help and sorry for so many questions, Feng ________________________________ From: Mark Adams Sent: 04 October 2022 17:18 To: feng wang Cc: petsc-users at mcs.anl.gov Subject: Re: [petsc-users] clarification on extreme eigenvalues from KSPComputeEigenvalues The extreme eigenvalues are the lowest and highest. A perfect preconditioner would give all eigenvalues = 1.0 Mark On Tue, Oct 4, 2022 at 1:03 PM feng wang > wrote: Dear All, I am using the KSPComputeEigenvalues to understand the performance of my preconditioner, and I am using the right-preconditioned GMRES with ASM. In the user guide, it says this routine computes the extreme eigenvalues of the preconditioned operator. If I understand it correctly, these eigenvalues are the ones furthest away from (1,0)? If the preconditioning is perfect, all the eigenvalues should be (1,0). Thanks, Feng -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From yangzongze at gmail.com Wed Oct 5 01:39:57 2022 From: yangzongze at gmail.com (Zongze Yang) Date: Wed, 5 Oct 2022 14:39:57 +0800 Subject: [petsc-users] Is the results of `DMAdaptLabel` as expected in `src/dm/impls/plex/tests/ex20.c` In-Reply-To: References: Message-ID: Matthew Knepley ?2022?10?5??? 00:33??? > On Tue, Oct 4, 2022 at 3:19 PM Zongze Yang wrote: > >> Hi everyone, >> >> I am learning how to use the `DMAdaptLabel` for `DMPlex`, and found the >> example `src/dm/impls/plex/tests/ex20.c` which label one cell to refine. >> >> 1. This example is just a uniform refinement when using the following >> command. (see attached pdfs for the results). >> ``` >> [real-int32-gcc] z2yang at ws5:~/opt/firedrake/real-int32-gcc/petsc/src/dm/impls/plex/tests$ >> ./ex20 -dm_plex_box_faces 3,3 -dm_coord_space 0 -pre_adapt_dm_view >> ascii::ascii_info -post_adapt_dm_view draw:tikz:figure2.tex >> ``` >> Is this expected for this example? >> > > Hi Zongze, > > Yes, I agree this is not easy to see. If you give -dm_plex_transform_view, > you can see the kind of transform being used > > knepley/pylith $:/PETSc3/petsc/petsc-pylith$ PETSC_ARCH=arch-pylith-opt > make -j8 -f ./gmakefile test search="dm_impls_plex_tests-ex20_ > 2d" TIMEOUT=5000 EXTRA_OPTIONS="-dm_plex_transform_view" > Using MAKEFLAGS: --jobserver-fds=3,4 -j -- > EXTRA_OPTIONS=-dm_plex_transform_view TIMEOUT=5000 > search=dm_impls_plex_tests-ex20_2d > TEST > arch-pylith-opt/tests/counts/dm_impls_plex_tests-ex20_2d.counts > ok dm_impls_plex_tests-ex20_2d > not ok diff-dm_impls_plex_tests-ex20_2d # Error code: 1 > # 11a12,14 > # > DMPlexTransform Object: 1 MPI process > # > type: refine_regular > # > Regular refinement DMPlexTransform_0x84000000_1 > > You can see that it is regular refinement, so it ignores the input and > refines everything. If you change it, you can get adaptive refinement, > > knepley/pylith $:/PETSc3/petsc/petsc-pylith$ PETSC_ARCH=arch-pylith-opt > make -j8 -f ./gmakefile test search="dm_impls_plex_tests-ex20_ > 2d" TIMEOUT=5000 EXTRA_OPTIONS="-pre_adapt_dm_view draw > -post_adapt_dm_view draw -draw_pause -1 -dm_plex_transform_type refine_sbr" > > I attached the plot. > > Hi Matt, Thanks for your clear explanation. Now, I see that by setting different transform types I can refine the mesh by different algorithms. But why the refinement algorithms are classified as transform? 2. I found there is a function named `DMAdaptLabel_Plex`, and >> `DMAdaptLabel` did not call that function when the type of the dm is >> `DMPlex`. Is the function `DMAdaptLabel_Plex` still in use? >> > > No. I rewrote all the transformations last year. I think the new form is > much smaller, cleaner, and more performant. I should delete this function, > but I am finishing > up the review of all adaptive refinement with Joe Wallwork at Imperial. > > >> 3. `DMAdaptLabel` seems to lack some useful information when I use the >> wrong adaptor. 
For example, if I set `-dm_adaptor mmg`, then the process >> will give a segment fault because the `metric` passed to >> `DMAdaptMetric_Mmg_Plex` is NULL, see the output below: >> ``` >> [real-int32-gcc] z2yang at ws5:~/opt/firedrake/real-int32-gcc/petsc/src/dm/impls/plex/tests$ >> ./ex20 -dm_plex_box_faces 3,3 -dm_coord_space 0 -pre_adapt_dm_view >> draw:tikz:figure1.tex -post_adapt_dm_view draw:tikz:figure2.tex -dm_adaptor >> mmg >> [0]PETSC ERROR: >> ------------------------------------------------------------------------ >> [0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, >> probably memory access out of range >> [0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger >> [0]PETSC ERROR: or see https://petsc.org/release/faq/#valgrind and >> https://petsc.org/release/faq/ >> [0]PETSC ERROR: configure using --with-debugging=yes, recompile, link, >> and run >> [0]PETSC ERROR: to get more information on the crash. >> [0]PETSC ERROR: Run with -malloc_debug to check if memory corruption is >> causing the crash. >> Abort(59) on node 0 (rank 0 in comm 0): application called >> MPI_Abort(MPI_COMM_WORLD, 59) - process 0 >> ``` >> > > Hmm, I at least get an error message: > > # [0]PETSC ERROR: --------------------- Error Message > -------------------------------------------------------------- > # [0]PETSC ERROR: Null argument, when expecting valid pointer > # [0]PETSC ERROR: Null Pointer: Parameter # 1 > # [0]PETSC ERROR: WARNING! There are option(s) set that were not > used! Could be the program crashed before they were used or a spelling > mistake, etc! > # [0]PETSC ERROR: Option left: name:-post_adapt_dm_view value: > ascii::ascii_info > # [0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble > shooting. > # [0]PETSC ERROR: Petsc Development GIT revision: > v3.17.4-1472-ga50e2f9f007 GIT Date: 2022-09-23 13:01:31 +0000 > # [0]PETSC ERROR: ../ex20 on a arch-master-debug named > MacBook-Pro.local by knepley Tue Oct 4 17:29:20 2022 > # [0]PETSC ERROR: Configure options --PETSC_ARCH=arch-master-debug > --download-bamg --download-bison --download-chaco --download-ctetgen > --download-egads --download-eigen --download-exodusii --download-fftw > --download-hpddm --download-ks --download-libceed --download-libpng > --download-metis --download-ml --download-mmg --download-mumps > --download-netcdf --download-opencascade --download-p4est > --download-parmetis --download-parmmg --download-pnetcdf > --download-pragmatic --download-ptscotch --download-scalapack > --download-slepc --download-suitesparse --download-superlu_dist > --download-tetgen --download-triangle > --with-cmake-exec=/PETSc3/petsc/apple/bin/cmake > --with-ctest-exec=/PETSc3/petsc/apple/bin/ctest > --with-hdf5-dir=/PETSc3/petsc/apple --with-mpi-dir=/PETSc3/petsc/apple > --with-petsc4py=1 --with-shared-libraries --with-slepc --with-zlib > --download-muparser > # [0]PETSC ERROR: #1 VecViewFromOptions() at > /PETSc3/petsc/petsc-dev/src/vec/vec/interface/vector.c:627 > # [0]PETSC ERROR: #2 DMAdaptMetric_Mmg_Plex() at > /PETSc3/petsc/petsc-dev/src/dm/impls/plex/adaptors/mmg/mmgadapt.c:130 > # [0]PETSC ERROR: #3 DMAdaptLabel() at > /PETSc3/petsc/petsc-dev/src/dm/interface/dmgenerate.c:179 > # [0]PETSC ERROR: #4 main() at > /PETSc3/petsc/petsc-dev/src/dm/impls/plex/tests/ex20.c:24 > # [0]PETSC ERROR: PETSc Option Table entries: > # [0]PETSC ERROR: -dm_adaptor mmg > # [0]PETSC ERROR: -dm_coord_space 0 > # [0]PETSC ERROR: -dm_plex_box_faces 3,3 > # [0]PETSC ERROR: -post_adapt_dm_view ascii::ascii_info 
> # [0]PETSC ERROR: -pre_adapt_dm_view ascii::ascii_info > # [0]PETSC ERROR: ----------------End of Error Message -------send > entire error message to petsc-maint at mcs.anl.gov---------- > # application called MPI_Abort(MPI_COMM_SELF, 85) - process 0 > ok dm_impls_plex_tests-ex20_2d # SKIP Command failed so no diff > I should use `--with-debugging=yes` when configuring petsc for more information. > I agree that it would be nice to segregate the adaptors into those that > work with labels and those that work with metrics, but I thought we could > have an automated system to convert between metrics and labels. However, I > have not implemented it yet, since I am still trying to figure out exactly > how everything should work. > That would be really nice! Thanks, Zongze Thanks, > > Matt > > >> Thanks, >> Zongze Yang >> > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > > https://www.cse.buffalo.edu/~knepley/ > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Wed Oct 5 03:38:14 2022 From: knepley at gmail.com (Matthew Knepley) Date: Wed, 5 Oct 2022 09:38:14 +0100 Subject: [petsc-users] Is the results of `DMAdaptLabel` as expected in `src/dm/impls/plex/tests/ex20.c` In-Reply-To: References: Message-ID: On Wed, Oct 5, 2022 at 7:40 AM Zongze Yang wrote: > Matthew Knepley ?2022?10?5??? 00:33??? > >> On Tue, Oct 4, 2022 at 3:19 PM Zongze Yang wrote: >> >>> Hi everyone, >>> >>> I am learning how to use the `DMAdaptLabel` for `DMPlex`, and found the >>> example `src/dm/impls/plex/tests/ex20.c` which label one cell to refine. >>> >>> 1. This example is just a uniform refinement when using the following >>> command. (see attached pdfs for the results). >>> ``` >>> [real-int32-gcc] z2yang at ws5:~/opt/firedrake/real-int32-gcc/petsc/src/dm/impls/plex/tests$ >>> ./ex20 -dm_plex_box_faces 3,3 -dm_coord_space 0 -pre_adapt_dm_view >>> ascii::ascii_info -post_adapt_dm_view draw:tikz:figure2.tex >>> ``` >>> Is this expected for this example? >>> >> >> Hi Zongze, >> >> Yes, I agree this is not easy to see. If you give >> -dm_plex_transform_view, you can see the kind of transform being used >> >> knepley/pylith $:/PETSc3/petsc/petsc-pylith$ PETSC_ARCH=arch-pylith-opt >> make -j8 -f ./gmakefile test search="dm_impls_plex_tests-ex20_ >> 2d" TIMEOUT=5000 EXTRA_OPTIONS="-dm_plex_transform_view" >> Using MAKEFLAGS: --jobserver-fds=3,4 -j -- >> EXTRA_OPTIONS=-dm_plex_transform_view TIMEOUT=5000 >> search=dm_impls_plex_tests-ex20_2d >> TEST >> arch-pylith-opt/tests/counts/dm_impls_plex_tests-ex20_2d.counts >> ok dm_impls_plex_tests-ex20_2d >> not ok diff-dm_impls_plex_tests-ex20_2d # Error code: 1 >> # 11a12,14 >> # > DMPlexTransform Object: 1 MPI process >> # > type: refine_regular >> # > Regular refinement DMPlexTransform_0x84000000_1 >> >> You can see that it is regular refinement, so it ignores the input and >> refines everything. If you change it, you can get adaptive refinement, >> >> knepley/pylith $:/PETSc3/petsc/petsc-pylith$ PETSC_ARCH=arch-pylith-opt >> make -j8 -f ./gmakefile test search="dm_impls_plex_tests-ex20_ >> 2d" TIMEOUT=5000 EXTRA_OPTIONS="-pre_adapt_dm_view draw >> -post_adapt_dm_view draw -draw_pause -1 -dm_plex_transform_type refine_sbr" >> >> I attached the plot. >> >> > > Hi Matt, > > Thanks for your clear explanation. 
Now, I see that by setting different > transform types I can refine the mesh by different algorithms. But why the > refinement algorithms are classified as transform? > They used to be separately implemented as mesh refinements. However, last year, I figured out how to compute many kinds of regular refinement, some adaptive refinement, extrusion, filtering, and change of cell type using a common algorithm. This is the purpose of DMPlexTransform. I think I will be able to encompass even more soon. For example, I am almost finished with refined meshes that respond to all queries, but are never actually stored. Thanks, Matt > 2. I found there is a function named `DMAdaptLabel_Plex`, and >>> `DMAdaptLabel` did not call that function when the type of the dm is >>> `DMPlex`. Is the function `DMAdaptLabel_Plex` still in use? >>> >> >> No. I rewrote all the transformations last year. I think the new form is >> much smaller, cleaner, and more performant. I should delete this function, >> but I am finishing >> up the review of all adaptive refinement with Joe Wallwork at Imperial. >> >> >>> 3. `DMAdaptLabel` seems to lack some useful information when I use the >>> wrong adaptor. For example, if I set `-dm_adaptor mmg`, then the process >>> will give a segment fault because the `metric` passed to >>> `DMAdaptMetric_Mmg_Plex` is NULL, see the output below: >>> ``` >>> [real-int32-gcc] z2yang at ws5:~/opt/firedrake/real-int32-gcc/petsc/src/dm/impls/plex/tests$ >>> ./ex20 -dm_plex_box_faces 3,3 -dm_coord_space 0 -pre_adapt_dm_view >>> draw:tikz:figure1.tex -post_adapt_dm_view draw:tikz:figure2.tex -dm_adaptor >>> mmg >>> [0]PETSC ERROR: >>> ------------------------------------------------------------------------ >>> [0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, >>> probably memory access out of range >>> [0]PETSC ERROR: Try option -start_in_debugger or >>> -on_error_attach_debugger >>> [0]PETSC ERROR: or see https://petsc.org/release/faq/#valgrind and >>> https://petsc.org/release/faq/ >>> [0]PETSC ERROR: configure using --with-debugging=yes, recompile, link, >>> and run >>> [0]PETSC ERROR: to get more information on the crash. >>> [0]PETSC ERROR: Run with -malloc_debug to check if memory corruption is >>> causing the crash. >>> Abort(59) on node 0 (rank 0 in comm 0): application called >>> MPI_Abort(MPI_COMM_WORLD, 59) - process 0 >>> ``` >>> >> >> Hmm, I at least get an error message: >> >> # [0]PETSC ERROR: --------------------- Error Message >> -------------------------------------------------------------- >> # [0]PETSC ERROR: Null argument, when expecting valid pointer >> # [0]PETSC ERROR: Null Pointer: Parameter # 1 >> # [0]PETSC ERROR: WARNING! There are option(s) set that were not >> used! Could be the program crashed before they were used or a spelling >> mistake, etc! >> # [0]PETSC ERROR: Option left: name:-post_adapt_dm_view value: >> ascii::ascii_info >> # [0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble >> shooting. 
>> # [0]PETSC ERROR: Petsc Development GIT revision: >> v3.17.4-1472-ga50e2f9f007 GIT Date: 2022-09-23 13:01:31 +0000 >> # [0]PETSC ERROR: ../ex20 on a arch-master-debug named >> MacBook-Pro.local by knepley Tue Oct 4 17:29:20 2022 >> # [0]PETSC ERROR: Configure options --PETSC_ARCH=arch-master-debug >> --download-bamg --download-bison --download-chaco --download-ctetgen >> --download-egads --download-eigen --download-exodusii --download-fftw >> --download-hpddm --download-ks --download-libceed --download-libpng >> --download-metis --download-ml --download-mmg --download-mumps >> --download-netcdf --download-opencascade --download-p4est >> --download-parmetis --download-parmmg --download-pnetcdf >> --download-pragmatic --download-ptscotch --download-scalapack >> --download-slepc --download-suitesparse --download-superlu_dist >> --download-tetgen --download-triangle >> --with-cmake-exec=/PETSc3/petsc/apple/bin/cmake >> --with-ctest-exec=/PETSc3/petsc/apple/bin/ctest >> --with-hdf5-dir=/PETSc3/petsc/apple --with-mpi-dir=/PETSc3/petsc/apple >> --with-petsc4py=1 --with-shared-libraries --with-slepc --with-zlib >> --download-muparser >> # [0]PETSC ERROR: #1 VecViewFromOptions() at >> /PETSc3/petsc/petsc-dev/src/vec/vec/interface/vector.c:627 >> # [0]PETSC ERROR: #2 DMAdaptMetric_Mmg_Plex() at >> /PETSc3/petsc/petsc-dev/src/dm/impls/plex/adaptors/mmg/mmgadapt.c:130 >> # [0]PETSC ERROR: #3 DMAdaptLabel() at >> /PETSc3/petsc/petsc-dev/src/dm/interface/dmgenerate.c:179 >> # [0]PETSC ERROR: #4 main() at >> /PETSc3/petsc/petsc-dev/src/dm/impls/plex/tests/ex20.c:24 >> # [0]PETSC ERROR: PETSc Option Table entries: >> # [0]PETSC ERROR: -dm_adaptor mmg >> # [0]PETSC ERROR: -dm_coord_space 0 >> # [0]PETSC ERROR: -dm_plex_box_faces 3,3 >> # [0]PETSC ERROR: -post_adapt_dm_view ascii::ascii_info >> # [0]PETSC ERROR: -pre_adapt_dm_view ascii::ascii_info >> # [0]PETSC ERROR: ----------------End of Error Message -------send >> entire error message to petsc-maint at mcs.anl.gov---------- >> # application called MPI_Abort(MPI_COMM_SELF, 85) - process 0 >> ok dm_impls_plex_tests-ex20_2d # SKIP Command failed so no diff >> > > I should use `--with-debugging=yes` when configuring petsc for more > information. > > >> I agree that it would be nice to segregate the adaptors into those that >> work with labels and those that work with metrics, but I thought we could >> have an automated system to convert between metrics and labels. However, >> I have not implemented it yet, since I am still trying to figure out exactly >> how everything should work. >> > That would be really nice! > > Thanks, > Zongze > > Thanks, >> >> Matt >> >> >>> Thanks, >>> Zongze Yang >>> >> >> >> -- >> What most experimenters take for granted before they begin their >> experiments is infinitely more interesting than any results to which their >> experiments lead. >> -- Norbert Wiener >> >> https://www.cse.buffalo.edu/~knepley/ >> >> > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From ashishbhole07 at gmail.com Wed Oct 5 06:39:26 2022 From: ashishbhole07 at gmail.com (ashish bhole) Date: Wed, 5 Oct 2022 13:39:26 +0200 Subject: [petsc-users] code with TS throws error at the end Message-ID: Hi All, I am writing a code in Fortran to solve a linear advection equation using PETSc 3.18.0 (Vec and TS). 
It seems to work fine on my HP elitebook laptop with Fedora 30 OS and GCC 9.3.1. It gives acceptable numerical solutions, but throws the following error at the end, with as well as without parallel computing. The same error also appears with the slightly older version I tried: PETSc 3.14.0. The error message gives a hint for error locations, but I am unable to figure out what is wrong. I have attached a snippet for my TS usage lines. I spent some time searching for similar error reports but it was not so fruitful. So I am approaching the PETSc community for help understanding this error. Thank you. ------------------------------------ [0]PETSC ERROR: PetscTrFreeDefault() called from VecDestroy_Seq() at /home/abhole/lib/petsc-3.18.0/src/vec/vec/impls/seq/bvec2.c:753 [0]PETSC ERROR: Block [id=2154(800)] at address 0x2309ac0 is corrupted (probably write past end of array) [0]PETSC ERROR: Block allocated in VecCreate_Seq() at /home/abhole/lib/petsc-3.18.0/src/vec/vec/impls/seq/bvec3.c:34 [0]PETSC ERROR: --------------------- Error Message -------------------------------------------------------------- [0]PETSC ERROR: Memory corruption: https://petsc.org/release/faq/#valgrind [0]PETSC ERROR: Corrupted memory [0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting. [0]PETSC ERROR: Petsc Release Version 3.18.0, Sep 30, 2022 [0]PETSC ERROR: ./exe on a arch-linux-c-debug named ischia by abhole Wed Oct 5 13:06:27 2022 [0]PETSC ERROR: Configure options --with-cc=mpicc --with-cxx=mpicxx --with-fc=mpif90 --download-fblaslapack --download-scalapack --download-mumps --download-superlu --download-ptscotch --with-metis-include=/user/abhole/home/lib/metis-5.1.0/include --with-metis-lib=/user/abhole/home/lib/metis-5.1.0/lib/libmetis.a -lmetis --with-parmetis-include=/user/abhole/home/lib/parmetis-4.0.3/include --with-parmetis-lib=/user/abhole/home/lib/parmetis-4.0.3/lib/libparmetis.a -lparmetis -lmetis --with-hdf5-include=/user/abhole/home/lib/hdf5-1.8.18/include --with-hdf5-lib=/user/abhole/home/lib/hdf5-1.8.18/lib64/libhdf5.a --with-valgrind=1 --with-scalar-type=real --with-precision=double [0]PETSC ERROR: #1 PetscTrFreeDefault() at /home/abhole/lib/petsc-3.18.0/src/sys/memory/mtr.c:305 [0]PETSC ERROR: #2 VecDestroy_Seq() at /home/abhole/lib/petsc-3.18.0/src/vec/vec/impls/seq/bvec2.c:753 [0]PETSC ERROR: #3 VecDestroy() at /home/abhole/lib/petsc-3.18.0/src/vec/vec/interface/vector.c:521 [0]PETSC ERROR: #4 VecDestroyVecs_Default() at /home/abhole/lib/petsc-3.18.0/src/vec/vec/interface/vector.c:977 [0]PETSC ERROR: #5 VecDestroyVecs() at /home/abhole/lib/petsc-3.18.0/src/vec/vec/interface/vector.c:606 [0]PETSC ERROR: #6 TSRKTableauReset() at /home/abhole/lib/petsc-3.18.0/src/ts/impls/explicit/rk/rk.c:1102 [0]PETSC ERROR: #7 TSReset_RK() at /home/abhole/lib/petsc-3.18.0/src/ts/impls/explicit/rk/rk.c:1109 [0]PETSC ERROR: #8 TSReset() at /home/abhole/lib/petsc-3.18.0/src/ts/interface/ts.c:2644 [0]PETSC ERROR: #9 TSDestroy() at /home/abhole/lib/petsc-3.18.0/src/ts/interface/ts.c:2706 [0]PETSC ERROR: #10 main.F90:159 ------------------------------------------ -- With Regards Ashish Bhole -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... 
Name: petsc_ts.png Type: image/png Size: 141058 bytes Desc: not available URL: From bsmith at petsc.dev Wed Oct 5 08:29:30 2022 From: bsmith at petsc.dev (Barry Smith) Date: Wed, 5 Oct 2022 09:29:30 -0400 Subject: [petsc-users] code with TS throws error at the end In-Reply-To: References: Message-ID: <72121D47-735E-4B24-AC26-491C4365EE2B@petsc.dev> Can you try running with valgrind? https://petsc.org/release/faq/?highlight=valgrind#what-does-corrupt-argument-or-caught-signal-or-segv-or-segmentation-violation-or-bus-error-mean-can-i-use-valgrind-or-cuda-memcheck-to-debug-memory-corruption-issues Barry > On Oct 5, 2022, at 7:39 AM, ashish bhole wrote: > > Hi All, > > I am writing a code in Fortran to solve a linear advection equation using PETSc 3.18.0 (Vec and TS). It seems to work fine on my HP elitebook laptop with Fedora 30 OS and GCC 9.3.1. It gives acceptable numerical solutions, but throws the following error at the end, with as well as without parallel computing. The same error also appears with the slightly older version I tried: PETSc 3.14.0. > > The error message gives a hint for error locations, but I am unable to figure out what is wrong. I have attached a snippet for my TS usage lines. I spent some time searching for similar error reports but it was not so fruitful. So I am approaching the PETSc community for help understanding this error. > Thank you. > > ------------------------------------ > [0]PETSC ERROR: PetscTrFreeDefault() called from VecDestroy_Seq() at /home/abhole/lib/petsc-3.18.0/src/vec/vec/impls/seq/bvec2.c:753 > [0]PETSC ERROR: Block [id=2154(800)] at address 0x2309ac0 is corrupted (probably write past end of array) > [0]PETSC ERROR: Block allocated in VecCreate_Seq() at /home/abhole/lib/petsc-3.18.0/src/vec/vec/impls/seq/bvec3.c:34 > [0]PETSC ERROR: --------------------- Error Message -------------------------------------------------------------- > [0]PETSC ERROR: Memory corruption: https://petsc.org/release/faq/#valgrind > [0]PETSC ERROR: Corrupted memory > [0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting. 
> [0]PETSC ERROR: Petsc Release Version 3.18.0, Sep 30, 2022 > [0]PETSC ERROR: ./exe on a arch-linux-c-debug named ischia by abhole Wed Oct 5 13:06:27 2022 > [0]PETSC ERROR: Configure options --with-cc=mpicc --with-cxx=mpicxx --with-fc=mpif90 --download-fblaslapack --download-scalapack --download-mumps --download-superlu --download-ptscotch --with-metis-include=/user/abhole/home/lib/metis-5.1.0/include --with-metis-lib=/user/abhole/home/lib/metis-5.1.0/lib/libmetis.a -lmetis --with-parmetis-include=/user/abhole/home/lib/parmetis-4.0.3/include --with-parmetis-lib=/user/abhole/home/lib/parmetis-4.0.3/lib/libparmetis.a -lparmetis -lmetis --with-hdf5-include=/user/abhole/home/lib/hdf5-1.8.18/include --with-hdf5-lib=/user/abhole/home/lib/hdf5-1.8.18/lib64/libhdf5.a --with-valgrind=1 --with-scalar-type=real --with-precision=double > [0]PETSC ERROR: #1 PetscTrFreeDefault() at /home/abhole/lib/petsc-3.18.0/src/sys/memory/mtr.c:305 > [0]PETSC ERROR: #2 VecDestroy_Seq() at /home/abhole/lib/petsc-3.18.0/src/vec/vec/impls/seq/bvec2.c:753 > [0]PETSC ERROR: #3 VecDestroy() at /home/abhole/lib/petsc-3.18.0/src/vec/vec/interface/vector.c:521 > [0]PETSC ERROR: #4 VecDestroyVecs_Default() at /home/abhole/lib/petsc-3.18.0/src/vec/vec/interface/vector.c:977 > [0]PETSC ERROR: #5 VecDestroyVecs() at /home/abhole/lib/petsc-3.18.0/src/vec/vec/interface/vector.c:606 > [0]PETSC ERROR: #6 TSRKTableauReset() at /home/abhole/lib/petsc-3.18.0/src/ts/impls/explicit/rk/rk.c:1102 > [0]PETSC ERROR: #7 TSReset_RK() at /home/abhole/lib/petsc-3.18.0/src/ts/impls/explicit/rk/rk.c:1109 > [0]PETSC ERROR: #8 TSReset() at /home/abhole/lib/petsc-3.18.0/src/ts/interface/ts.c:2644 > [0]PETSC ERROR: #9 TSDestroy() at /home/abhole/lib/petsc-3.18.0/src/ts/interface/ts.c:2706 > [0]PETSC ERROR: #10 main.F90:159 > ------------------------------------------ > > -- With Regards > Ashish Bhole > -------------- next part -------------- An HTML attachment was scrubbed... URL: From mfadams at lbl.gov Wed Oct 5 10:05:18 2022 From: mfadams at lbl.gov (Mark Adams) Date: Wed, 5 Oct 2022 11:05:18 -0400 Subject: [petsc-users] clarification on extreme eigenvalues from KSPComputeEigenvalues In-Reply-To: References: Message-ID: On Tue, Oct 4, 2022 at 5:20 PM feng wang wrote: > Hi Mark, > > Thanks for your reply. Below is the output if I call KSPComputeEigenvalues > > 0.330475 -0.0485014 > 0.521211 0.417409 > 0.684726 -0.377126 > 0.885941 0.354342 > 0.957845 -0.0508471 > 0.964676 -0.241642 > 1.05921 0.0742963 > 1.82065 -0.0209096 > > I have the following questions: > > - These eigenvalues are sorted according to the magnitudes. so > "lowest" means smallest magnitude and "highest" means largest magnitude in > your previous email? > > Oh, I was talking about "extreme" eigen values (an option). This is all of them. > > - I understand that if the preconditioner is perfect, all the > eigenvalues should be (1,0). Since my preconditioner is not perfect, to > understand its performance, is it correct to say that, I need to keep an > eye on the eigenvalues whose distance to (1,0) are the furthest? > > I'm not sure what you mean by "distance to (1,0)". First, these are the iegenvalues of the sysetm that Krylov project to. They are within the bounds of the true extreme eigenvalues but they are not eigenues of the actuall preconditioned system I just look at the ratio of the highest to lowest. The condition number. This will converge to the true value from below. > > - > - How does petsc decides how many eigenvalues to output in > KSPComputeEigenvalues. 
> > It is all of them for the projected system, which is the size of the number of iterations. > > - I am solving a set of linear systems, sometimes > KSPComputeEigenvalues outputs 8 eigenvalues, sometimes it outputs just 2 > eigenvalues. > - In the output which I showed above, are these the ones with the > smallest magnitude and also the ones with the largest magnitudes? and > what's between are all ignored? If this is the case, which ones are the > "lowest" and which ones are the "highest"? > > These seem to be sorted. You can also ask for "Extreme" eigenvalues and just get these two that you can use for the condition number estimate. That is the most common use. Mark > > - > > Thanks for your help and sorry for so many questions, > Feng > > > > > ------------------------------ > *From:* Mark Adams > *Sent:* 04 October 2022 17:18 > *To:* feng wang > *Cc:* petsc-users at mcs.anl.gov > *Subject:* Re: [petsc-users] clarification on extreme eigenvalues from > KSPComputeEigenvalues > > The extreme eigenvalues are the lowest and highest. > A perfect preconditioner would give all eigenvalues = 1.0 > > Mark > > On Tue, Oct 4, 2022 at 1:03 PM feng wang wrote: > > Dear All, > > I am using the KSPComputeEigenvalues to understand the performance of my > preconditioner, and I am using the right-preconditioned GMRES with ASM. In > the user guide, it says this routine computes the extreme eigenvalues of > the preconditioned operator. If I understand it correctly, these > eigenvalues are the ones furthest away from (1,0)? If the preconditioning > is perfect, all the eigenvalues should be (1,0). > > Thanks, > Feng > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From junming.duan at epfl.ch Wed Oct 5 15:39:32 2022 From: junming.duan at epfl.ch (Duan Junming) Date: Wed, 5 Oct 2022 20:39:32 +0000 Subject: [petsc-users] About Q3 tensor product Hermite element Message-ID: Dear all, I need to use Q3 tensor product Hermite element in 2D (point value, gradient, and mixed derivative at 4 vertices in a cell as unknowns). Is it available in PETSc FEM module now? I found that only Lagrange element is available. If not, what is the correct path to implement Q3 tensor product Hermite element? I think I should create my own petscspace and petscdualspace? Or is there any package that has already provided this? Thanks for any suggestions! -------------- next part -------------- An HTML attachment was scrubbed... URL: From sajidsyed2021 at u.northwestern.edu Wed Oct 5 15:47:35 2022 From: sajidsyed2021 at u.northwestern.edu (Sajid Ali) Date: Wed, 5 Oct 2022 15:47:35 -0500 Subject: [petsc-users] PetscLogView produces nan's instead of timing data when using GPUs Message-ID: Hi PETSc-developers, I'm having trouble with getting performance logs from an application that uses PETSc. There are no issues when I run it on a CPU, but every time a GPU is used there is no timing data and almost all times are replaced by times that are just `nan` (on two different clusters). I am attaching the log files for both cases with this email. Could someone explain what is happening here ? 
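For context, the logging calls amount to roughly the following; this is only a sketch (error handling and application setup omitted), the actual routines are in the utils.h file linked below:

```c
/* Sketch of the init/finalize pattern around the simulation. */
PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));
PetscCall(PetscLogDefaultBegin()); /* start collecting event timings */

/* ... run the simulation ... */

PetscCall(PetscLogView(PETSC_VIEWER_STDOUT_WORLD)); /* the table that contains the nan entries */
PetscCall(PetscFinalize());
```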
In case it helps, here are the routines used to initialize/finalize the application that also handle initializing/finalizing PETSc and printing the PETSc performance logs to PETSC_VIEWER_STDOUT_WORLD : https://github.com/fnalacceleratormodeling/synergia2/blob/devel3/src/synergia/utils/utils.h Thank You, Sajid Ali (he/him) | Research Associate Scientific Computing Division Fermi National Accelerator Laboratory s-sajid-ali.github.io -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: log-gpu Type: application/octet-stream Size: 21553 bytes Desc: not available URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: log-cpu Type: application/octet-stream Size: 15947 bytes Desc: not available URL: From bsmith at petsc.dev Wed Oct 5 16:11:22 2022 From: bsmith at petsc.dev (Barry Smith) Date: Wed, 5 Oct 2022 17:11:22 -0400 Subject: [petsc-users] PetscLogView produces nan's instead of timing data when using GPUs In-Reply-To: References: Message-ID: <84E8ED1B-48DB-4FFA-9EC4-8AEEBCECF068@petsc.dev> It prints Nan to indicate that the time for that event is not known accurately. But the times for the larger events that contain these events are known. So for example the time for KSPSolve is know but not the time for VecNorm. The other numbers in the events, like number of times called etc that are not Nan are correct as displayed. This is done because correctly tracking the times of the individual events requires synchronizations that slow down the entire calculation a bit; for example the time for the KSPSolve will register a longer time then it registers if the smaller events are not timed. To display the times of the smaller events use -log_view_gpu_time also but note this will increase the times of the larger events a bit. Barry > On Oct 5, 2022, at 4:47 PM, Sajid Ali wrote: > > Hi PETSc-developers, > > I'm having trouble with getting performance logs from an application that uses PETSc. There are no issues when I run it on a CPU, but every time a GPU is used there is no timing data and almost all times are replaced by times that are just `nan` (on two different clusters). I am attaching the log files for both cases with this email. Could someone explain what is happening here ? > > In case it helps, here are the routines used to initialize/finalize the application that also handle initializing/finalizing PETSc and printing the PETSc performance logs to PETSC_VIEWER_STDOUT_WORLD : https://github.com/fnalacceleratormodeling/synergia2/blob/devel3/src/synergia/utils/utils.h > > Thank You, > Sajid Ali (he/him) | Research Associate > Scientific Computing Division > Fermi National Accelerator Laboratory > s-sajid-ali.github.io -------------- next part -------------- An HTML attachment was scrubbed... URL: From Eric.Chamberland at giref.ulaval.ca Wed Oct 5 17:13:21 2022 From: Eric.Chamberland at giref.ulaval.ca (Eric Chamberland) Date: Thu, 6 Oct 2022 00:13:21 +0200 Subject: [petsc-users] R: How to use Intel OneApi mpi wrappers on Linux In-Reply-To: References: Message-ID: Hi, fwiw, I tried to compile with ipcx too, without mpi wrappers... However, I had other problems... check here: https://gitlab.com/petsc/petsc/-/issues/1255 Anyone have compiled PETSc with the latest Intel OneAPI release? Can you give a working configure line? Thanks, Eric On 2022-10-03 15:58, Paolo Lampitella wrote: > > Hi Barry, > > thanks for the suggestion. 
I tried this but doesn?t seem to work as > expected. That is, configure actually works, but it is because it is > not seeing the LLVM based compilers, only the intel classical ones. > Yet the variables seem correctly exported. > > Paolo > > *Da: *Barry Smith > *Inviato: *luned? 3 ottobre 2022 15:19 > *A: *Paolo Lampitella > *Cc: *petsc-users at mcs.anl.gov > *Oggetto: *Re: [petsc-users] How to use Intel OneApi mpi wrappers on Linux > > bsmith at petsc-01:~$ mpicc > > This script invokes an appropriate specialized C MPI compiler driver. > > The following ways (priority order) can be used for changing default > > compiler name (gcc): > > 1. Command line option:? -cc= > > 2. Environment variable:?I_MPI_CC?(current value '') > > 3. Environment variable: MPICH_CC (current value '') > > > > So > > export?I_MPI_CC=icx > > export I_MPI_CXX=icpx > > export I_MPI_FC=ifx > > should do the trick. > > > > On Oct 3, 2022, at 5:43 AM, Paolo Lampitella > wrote: > > Dear PETSc users and developers, > > as per the title, I recently installed the base and HPC Intel > OneApi toolkits on a machine running Ubuntu 20.04. > > As you probably know, OneApi comes with the classical compilers > (icc, icpc, ifort) and relative mpi wrappers (mpiicc, mpiicpc, > mpiifort) as well as with the new LLVM based compilers (icx, icpx, > ifx). > > My experience so far with PETSc on Linux has been without troubles > using both gcc compilers and either Mpich or OpenMPI and Intel > classical compilers and MPI. > > However, I have now troubles using the MPI wrappers of the new > LLVM compilers as, in fact, there aren?t dedicated mpi wrappers > for them. Instead, they can be used with certain flags for the > classical wrappers: > > mpiicc -cc=icx > > mpiicpc -cxx=icpx > > mpiifort -fc=ifx > > The problem I have is that I have no idea how to pass them > correctly to the configure and whatever comes after that. > > Admittedly, I am just starting to use the new compilers, so I have > no clue how I would use them in other projects as well. > > I started with an alias in my .bash_aliases (which works for > simple compilation tests from command line) but doesn?t with > configure. > > I also tried adding the flags to the COPTFLAGS, CXXOPTFLAGS and > FOPTFLAGS but didn?t work as well. > > Do you have any experience with the new Intel compilers and, in > case, could you share hot to properly use them with MPI? > > Thanks > > Paolo > -- Eric Chamberland, ing., M. Ing Professionnel de recherche GIREF/Universit? Laval (418) 656-2131 poste 41 22 42 -------------- next part -------------- An HTML attachment was scrubbed... URL: From yc17470 at connect.um.edu.mo Thu Oct 6 00:47:23 2022 From: yc17470 at connect.um.edu.mo (Gong Yujie) Date: Thu, 6 Oct 2022 05:47:23 +0000 Subject: [petsc-users] Vector field ordering question Message-ID: Dear development team, I'm trying to write a code to deal with a multi-field problem. Currently I find that the vector ordering for the field is (a1,a2,a3,b1,b2,b3), here assume a1,a2,a3 belongs to one field and bs for another field. Can I get a point-block ordering for the unknowns as (a1,b1,a2,b2,a3,b3)? I'm using DMPlex for the mesh management and first use PetscFECreateDefault to create the PetscFE object, then use this object with DMAddField to add the field to the DMPlex object. DMCreateGlobalVector is used for generating vectors. I do the discretization myself instead of using PetscFE or PetscFV. My PETSc version is 3.16.0. Do you have a case using this point-block type data structure? 
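For reference, the field setup is essentially the following; a minimal C sketch with illustrative dimensions, component counts and names, not the actual code:

```c
/* Assumes "dm" is the DMPlex; dim = 3 and 3 components per field are illustrative. */
PetscFE feA, feB;
Vec     u;

PetscCall(PetscFECreateDefault(PETSC_COMM_WORLD, 3, 3, PETSC_TRUE, "fieldA_", PETSC_DETERMINE, &feA));
PetscCall(PetscFECreateDefault(PETSC_COMM_WORLD, 3, 3, PETSC_TRUE, "fieldB_", PETSC_DETERMINE, &feB));
PetscCall(DMAddField(dm, NULL, (PetscObject)feA));
PetscCall(DMAddField(dm, NULL, (PetscObject)feB));
PetscCall(DMCreateDS(dm));
PetscCall(DMCreateGlobalVector(dm, &u)); /* I would like u ordered (a1,b1,a2,b2,a3,b3) per point */
PetscCall(PetscFEDestroy(&feA));
PetscCall(PetscFEDestroy(&feB));
```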
Best Regards, Jerry -------------- next part -------------- An HTML attachment was scrubbed... URL: From yangzongze at gmail.com Thu Oct 6 03:16:04 2022 From: yangzongze at gmail.com (Zongze Yang) Date: Thu, 6 Oct 2022 16:16:04 +0800 Subject: [petsc-users] How to show the x window for cmd `make -f ./gmakefile test ...`? Message-ID: Hi, everyone, I am trying to run some test cases with x window, but the x window never showed up with command `make -f ./gmakefile test ...`. It seems a default option `-nox` is set. How to disable this option for `make test`? An example is shown below: ``` z2yang at ws6:~/repos/petsc$ PETSC_ARCH=arch-main-debug make -f ./gmakefile test search="dm_impls_plex_tests-ex20_2d" TIMEOUT=5000 EXTRA_OPTIONS="-post_adapt_dm_view draw:x -draw_pause -1 -options_view -petsc_ci false" Using MAKEFLAGS: -- EXTRA_OPTIONS=-post_adapt_dm_view draw:x -draw_pause -1 -options_view -petsc_ci false TIMEOUT=5000 search=dm_impls_plex_tests-ex20_2d TEST arch-main-debug/tests/counts/dm_impls_plex_tests-ex20_2d.counts ok dm_impls_plex_tests-ex20_2d not ok diff-dm_impls_plex_tests-ex20_2d # Error code: 1 # 12,22c12,26 # < DM Object: Post Adaptation Mesh 1 MPI process # < type: plex # < Post Adaptation Mesh in 2 dimensions: # < Number of 0-cells per rank: 49 # < Number of 1-cells per rank: 120 # < Number of 2-cells per rank: 72 # < Labels: # < celltype: 3 strata with value/size (1 (120), 3 (72), 0 (49)) # < depth: 3 strata with value/size (0 (49), 1 (120), 2 (72)) # < marker: 1 strata with value/size (1 (48)) # < Face Sets: 1 strata with value/size (1 (36)) # --- # > #PETSc Option Table entries: # > -check_pointer_intensity 0 # > -dm_coord_space 0 # > -dm_plex_box_faces 3,3 # > -draw_pause -1 # > -error_output_stdout # > -malloc_dump # > -nox # > -nox_warning # > -options_view # > -petsc_ci false # > -post_adapt_dm_view draw:x # > -pre_adapt_dm_view ascii::ascii_info # > -use_gpu_aware_mpi 0 # > #End of PETSc Option Table entries # FAILED diff-dm_impls_plex_tests-ex20_2d # # To rerun failed tests: # /usr/bin/gmake -f gmakefile test test-fail=1 ``` Thanks, Zongze -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Thu Oct 6 03:19:39 2022 From: knepley at gmail.com (Matthew Knepley) Date: Thu, 6 Oct 2022 09:19:39 +0100 Subject: [petsc-users] How to show the x window for cmd `make -f ./gmakefile test ...`? In-Reply-To: References: Message-ID: On Thu, Oct 6, 2022 at 9:16 AM Zongze Yang wrote: > Hi, everyone, > > I am trying to run some test cases with x window, but the x window never > showed up with command `make -f ./gmakefile test ...`. It seems a default > option `-nox` is set. How to disable this option for `make test`? > Yes, we disable it for tests by default in order to make the CI efficient. You can edit $PETSC_DIR/$PETSC_ARCH/lib/petsc/conf/petscvariables and remove it from PETSC_TEST_OPTIONS, which should be the last line. 
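Based on the options shown in the -options_view output above, that last line should look something like the excerpt below (the exact contents can differ between builds):

```
PETSC_TEST_OPTIONS = -check_pointer_intensity 0 -error_output_stdout -nox -nox_warning -malloc_dump -use_gpu_aware_mpi 0
```

Deleting "-nox -nox_warning" from it lets X windows open when running through the test harness.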
Thanks, Matt > An example is shown below: > ``` > z2yang at ws6:~/repos/petsc$ PETSC_ARCH=arch-main-debug make -f ./gmakefile > test search="dm_impls_plex_tests-ex20_2d" TIMEOUT=5000 > EXTRA_OPTIONS="-post_adapt_dm_view draw:x -draw_pause -1 -options_view > -petsc_ci false" > Using MAKEFLAGS: -- EXTRA_OPTIONS=-post_adapt_dm_view draw:x -draw_pause > -1 -options_view -petsc_ci false TIMEOUT=5000 > search=dm_impls_plex_tests-ex20_2d > TEST > arch-main-debug/tests/counts/dm_impls_plex_tests-ex20_2d.counts > ok dm_impls_plex_tests-ex20_2d > not ok diff-dm_impls_plex_tests-ex20_2d # Error code: 1 > # 12,22c12,26 > # < DM Object: Post Adaptation Mesh 1 MPI process > # < type: plex > # < Post Adaptation Mesh in 2 dimensions: > # < Number of 0-cells per rank: 49 > # < Number of 1-cells per rank: 120 > # < Number of 2-cells per rank: 72 > # < Labels: > # < celltype: 3 strata with value/size (1 (120), 3 (72), 0 (49)) > # < depth: 3 strata with value/size (0 (49), 1 (120), 2 (72)) > # < marker: 1 strata with value/size (1 (48)) > # < Face Sets: 1 strata with value/size (1 (36)) > # --- > # > #PETSc Option Table entries: > # > -check_pointer_intensity 0 > # > -dm_coord_space 0 > # > -dm_plex_box_faces 3,3 > # > -draw_pause -1 > # > -error_output_stdout > # > -malloc_dump > # > -nox > # > -nox_warning > # > -options_view > # > -petsc_ci false > # > -post_adapt_dm_view draw:x > # > -pre_adapt_dm_view ascii::ascii_info > # > -use_gpu_aware_mpi 0 > # > #End of PETSc Option Table entries > > > # FAILED diff-dm_impls_plex_tests-ex20_2d > # > # To rerun failed tests: > # /usr/bin/gmake -f gmakefile test test-fail=1 > ``` > > Thanks, > Zongze > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From yangzongze at gmail.com Thu Oct 6 03:22:02 2022 From: yangzongze at gmail.com (Zongze Yang) Date: Thu, 6 Oct 2022 16:22:02 +0800 Subject: [petsc-users] How to show the x window for cmd `make -f ./gmakefile test ...`? In-Reply-To: References: Message-ID: Thanks! Matthew Knepley ?2022?10?6??? 16:19??? > On Thu, Oct 6, 2022 at 9:16 AM Zongze Yang wrote: > >> Hi, everyone, >> >> I am trying to run some test cases with x window, but the x window never >> showed up with command `make -f ./gmakefile test ...`. It seems a default >> option `-nox` is set. How to disable this option for `make test`? >> > > Yes, we disable it for tests by default in order to make the CI efficient. > You can edit > > $PETSC_DIR/$PETSC_ARCH/lib/petsc/conf/petscvariables > > and remove it from PETSC_TEST_OPTIONS, which should be the last line. 
> > Thanks, > > Matt > > >> An example is shown below: >> ``` >> z2yang at ws6:~/repos/petsc$ PETSC_ARCH=arch-main-debug make -f ./gmakefile >> test search="dm_impls_plex_tests-ex20_2d" TIMEOUT=5000 >> EXTRA_OPTIONS="-post_adapt_dm_view draw:x -draw_pause -1 -options_view >> -petsc_ci false" >> Using MAKEFLAGS: -- EXTRA_OPTIONS=-post_adapt_dm_view draw:x -draw_pause >> -1 -options_view -petsc_ci false TIMEOUT=5000 >> search=dm_impls_plex_tests-ex20_2d >> TEST >> arch-main-debug/tests/counts/dm_impls_plex_tests-ex20_2d.counts >> ok dm_impls_plex_tests-ex20_2d >> not ok diff-dm_impls_plex_tests-ex20_2d # Error code: 1 >> # 12,22c12,26 >> # < DM Object: Post Adaptation Mesh 1 MPI process >> # < type: plex >> # < Post Adaptation Mesh in 2 dimensions: >> # < Number of 0-cells per rank: 49 >> # < Number of 1-cells per rank: 120 >> # < Number of 2-cells per rank: 72 >> # < Labels: >> # < celltype: 3 strata with value/size (1 (120), 3 (72), 0 (49)) >> # < depth: 3 strata with value/size (0 (49), 1 (120), 2 (72)) >> # < marker: 1 strata with value/size (1 (48)) >> # < Face Sets: 1 strata with value/size (1 (36)) >> # --- >> # > #PETSc Option Table entries: >> # > -check_pointer_intensity 0 >> # > -dm_coord_space 0 >> # > -dm_plex_box_faces 3,3 >> # > -draw_pause -1 >> # > -error_output_stdout >> # > -malloc_dump >> # > -nox >> # > -nox_warning >> # > -options_view >> # > -petsc_ci false >> # > -post_adapt_dm_view draw:x >> # > -pre_adapt_dm_view ascii::ascii_info >> # > -use_gpu_aware_mpi 0 >> # > #End of PETSc Option Table entries >> >> >> # FAILED diff-dm_impls_plex_tests-ex20_2d >> # >> # To rerun failed tests: >> # /usr/bin/gmake -f gmakefile test test-fail=1 >> ``` >> >> Thanks, >> Zongze >> > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > > https://www.cse.buffalo.edu/~knepley/ > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Thu Oct 6 03:22:09 2022 From: knepley at gmail.com (Matthew Knepley) Date: Thu, 6 Oct 2022 09:22:09 +0100 Subject: [petsc-users] Vector field ordering question In-Reply-To: References: Message-ID: On Thu, Oct 6, 2022 at 6:47 AM Gong Yujie wrote: > Dear development team, > > I'm trying to write a code to deal with a multi-field problem. Currently I > find that the vector ordering for the field is (a1,a2,a3,b1,b2,b3), here > assume a1,a2,a3 belongs to one field and bs for another field. *Can I get > a point-block ordering for the unknowns as (a1,b1,a2,b2,a3,b3)?* > > I'm using DMPlex for the mesh management and first > use PetscFECreateDefault to create the PetscFE object, then use this object > with DMAddField to add the field to the DMPlex object. DMCreateGlobalVector > is used for generating vectors. I do the discretization myself instead of > using PetscFE or PetscFV. My PETSc version is 3.16.0. Do you have a case > using this point-block type data structure? > PetscFE does point-block ordering by default. How do you decide where a given dof goes (since you discretize yourself)? Thanks, Matt > Best Regards, > Jerry > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... 
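(A small illustrative sketch of the point-block layout asked about above; the code is not from this thread, `dm` is an assumed existing DMPlex, and it targets a recent PETSc with PetscCall(). Two scalar fields with one dof each are placed on every vertex; with the default point-major section layout the vector from DMCreateGlobalVector() then comes out interlaced per point, i.e. (a1, b1, a2, b2, ...), while PetscSectionSetPointMajor(s, PETSC_FALSE) would request the field-major layout (a1, a2, ..., b1, b2, ...).)

```c
#include <petscdmplex.h>

/* Sketch only: two scalar fields, one dof per field on each vertex of dm. */
static PetscErrorCode SetupTwoFieldSection(DM dm)
{
  PetscSection s;
  PetscInt     v, vStart, vEnd;

  PetscFunctionBeginUser;
  PetscCall(DMPlexGetDepthStratum(dm, 0, &vStart, &vEnd));            /* vertices */
  PetscCall(PetscSectionCreate(PetscObjectComm((PetscObject)dm), &s));
  PetscCall(PetscSectionSetNumFields(s, 2));
  PetscCall(PetscSectionSetFieldComponents(s, 0, 1));
  PetscCall(PetscSectionSetFieldComponents(s, 1, 1));
  PetscCall(PetscSectionSetChart(s, vStart, vEnd));
  for (v = vStart; v < vEnd; ++v) {
    PetscCall(PetscSectionSetDof(s, v, 2));          /* total dofs at vertex v */
    PetscCall(PetscSectionSetFieldDof(s, v, 0, 1));  /* field "a"              */
    PetscCall(PetscSectionSetFieldDof(s, v, 1, 1));  /* field "b"              */
  }
  PetscCall(PetscSectionSetUp(s));
  PetscCall(DMSetLocalSection(dm, s));
  PetscCall(PetscSectionDestroy(&s));
  PetscFunctionReturn(0);
}
```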
URL: From paololampitella at hotmail.com Thu Oct 6 05:46:52 2022 From: paololampitella at hotmail.com (Paolo Lampitella) Date: Thu, 6 Oct 2022 10:46:52 +0000 Subject: [petsc-users] R: R: How to use Intel OneApi mpi wrappers on Linux In-Reply-To: References: Message-ID: Hi Eric, With the previous Intel version I was able to configure without mpi wrappers without problems. Using the suggestion by Mark (CFLAGS, FFLAGS, CXXFLAGS) I managed to also use the mpi wrappers. Unfortunately, as you seem to have noticed, things break down on Hypre and that loopopt. I have a lead on a possible solution being to use Autoconf 2.7 or higher, but this is untested. However, in an attempt to clarify the procedure better, I started from scratch and got trapped in the new intel version, which has now deprecated the classical C/C++ compiler, and passing ?-diag-disable=10441? in the C/CXX FLAGS is not working for me. So, as a matter of fact, I am stacked too and had to abandon the intel route for the moment Paolo Inviato da Posta per Windows Da: Eric Chamberland Inviato: gioved? 6 ottobre 2022 00:13 A: Paolo Lampitella; Barry Smith Cc: petsc-users at mcs.anl.gov Oggetto: Re: [petsc-users] R: How to use Intel OneApi mpi wrappers on Linux Hi, fwiw, I tried to compile with ipcx too, without mpi wrappers... However, I had other problems... check here: https://gitlab.com/petsc/petsc/-/issues/1255 Anyone have compiled PETSc with the latest Intel OneAPI release? Can you give a working configure line? Thanks, Eric On 2022-10-03 15:58, Paolo Lampitella wrote: Hi Barry, thanks for the suggestion. I tried this but doesn?t seem to work as expected. That is, configure actually works, but it is because it is not seeing the LLVM based compilers, only the intel classical ones. Yet the variables seem correctly exported. Paolo Da: Barry Smith Inviato: luned? 3 ottobre 2022 15:19 A: Paolo Lampitella Cc: petsc-users at mcs.anl.gov Oggetto: Re: [petsc-users] How to use Intel OneApi mpi wrappers on Linux bsmith at petsc-01:~$ mpicc This script invokes an appropriate specialized C MPI compiler driver. The following ways (priority order) can be used for changing default compiler name (gcc): 1. Command line option: -cc= 2. Environment variable: I_MPI_CC (current value '') 3. Environment variable: MPICH_CC (current value '') So export I_MPI_CC=icx export I_MPI_CXX=icpx export I_MPI_FC=ifx should do the trick. On Oct 3, 2022, at 5:43 AM, Paolo Lampitella > wrote: Dear PETSc users and developers, as per the title, I recently installed the base and HPC Intel OneApi toolkits on a machine running Ubuntu 20.04. As you probably know, OneApi comes with the classical compilers (icc, icpc, ifort) and relative mpi wrappers (mpiicc, mpiicpc, mpiifort) as well as with the new LLVM based compilers (icx, icpx, ifx). My experience so far with PETSc on Linux has been without troubles using both gcc compilers and either Mpich or OpenMPI and Intel classical compilers and MPI. However, I have now troubles using the MPI wrappers of the new LLVM compilers as, in fact, there aren?t dedicated mpi wrappers for them. Instead, they can be used with certain flags for the classical wrappers: mpiicc -cc=icx mpiicpc -cxx=icpx mpiifort -fc=ifx The problem I have is that I have no idea how to pass them correctly to the configure and whatever comes after that. Admittedly, I am just starting to use the new compilers, so I have no clue how I would use them in other projects as well. 
I started with an alias in my .bash_aliases (which works for simple compilation tests from command line) but doesn?t with configure. I also tried adding the flags to the COPTFLAGS, CXXOPTFLAGS and FOPTFLAGS but didn?t work as well. Do you have any experience with the new Intel compilers and, in case, could you share hot to properly use them with MPI? Thanks Paolo -- Eric Chamberland, ing., M. Ing Professionnel de recherche GIREF/Universit? Laval (418) 656-2131 poste 41 22 42 -------------- next part -------------- An HTML attachment was scrubbed... URL: From snailsoar at hotmail.com Thu Oct 6 06:44:31 2022 From: snailsoar at hotmail.com (feng wang) Date: Thu, 6 Oct 2022 11:44:31 +0000 Subject: [petsc-users] clarification on extreme eigenvalues from KSPComputeEigenvalues In-Reply-To: References: Message-ID: Hi Mark, Thanks for your help! It clears many of my doubts. Thanks, Feng ________________________________ From: Mark Adams Sent: 05 October 2022 15:05 To: feng wang Cc: petsc-users at mcs.anl.gov Subject: Re: [petsc-users] clarification on extreme eigenvalues from KSPComputeEigenvalues On Tue, Oct 4, 2022 at 5:20 PM feng wang > wrote: Hi Mark, Thanks for your reply. Below is the output if I call KSPComputeEigenvalues 0.330475 -0.0485014 0.521211 0.417409 0.684726 -0.377126 0.885941 0.354342 0.957845 -0.0508471 0.964676 -0.241642 1.05921 0.0742963 1.82065 -0.0209096 I have the following questions: * These eigenvalues are sorted according to the magnitudes. so "lowest" means smallest magnitude and "highest" means largest magnitude in your previous email? Oh, I was talking about "extreme" eigen values (an option). This is all of them. * I understand that if the preconditioner is perfect, all the eigenvalues should be (1,0). Since my preconditioner is not perfect, to understand its performance, is it correct to say that, I need to keep an eye on the eigenvalues whose distance to (1,0) are the furthest? I'm not sure what you mean by "distance to (1,0)". First, these are the iegenvalues of the sysetm that Krylov project to. They are within the bounds of the true extreme eigenvalues but they are not eigenues of the actuall preconditioned system I just look at the ratio of the highest to lowest. The condition number. This will converge to the true value from below. * * How does petsc decides how many eigenvalues to output in KSPComputeEigenvalues. It is all of them for the projected system, which is the size of the number of iterations. * I am solving a set of linear systems, sometimes KSPComputeEigenvalues outputs 8 eigenvalues, sometimes it outputs just 2 eigenvalues. * In the output which I showed above, are these the ones with the smallest magnitude and also the ones with the largest magnitudes? and what's between are all ignored? If this is the case, which ones are the "lowest" and which ones are the "highest"? These seem to be sorted. You can also ask for "Extreme" eigenvalues and just get these two that you can use for the condition number estimate. That is the most common use. Mark * Thanks for your help and sorry for so many questions, Feng ________________________________ From: Mark Adams > Sent: 04 October 2022 17:18 To: feng wang > Cc: petsc-users at mcs.anl.gov > Subject: Re: [petsc-users] clarification on extreme eigenvalues from KSPComputeEigenvalues The extreme eigenvalues are the lowest and highest. 
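(For reference, a minimal sketch of one way to query these extreme values from a KSP and form the highest/lowest ratio discussed here; the snippet is illustrative rather than code from this thread, and `ksp`, `b`, `x` are assumed to be set up already.)

```c
#include <petscksp.h>

/* Sketch: estimate the condition number of the preconditioned operator. */
static PetscErrorCode EstimateConditionNumber(KSP ksp, Vec b, Vec x)
{
  PetscReal emax, emin;

  PetscFunctionBeginUser;
  PetscCall(KSPSetComputeSingularValues(ksp, PETSC_TRUE));  /* before KSPSolve() */
  PetscCall(KSPSolve(ksp, b, x));
  PetscCall(KSPComputeExtremeSingularValues(ksp, &emax, &emin));
  PetscCall(PetscPrintf(PetscObjectComm((PetscObject)ksp),
                        "sigma_max %g, sigma_min %g, estimated condition number %g\n",
                        (double)emax, (double)emin, (double)(emax / emin)));
  PetscFunctionReturn(0);
}
```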
A perfect preconditioner would give all eigenvalues = 1.0 Mark On Tue, Oct 4, 2022 at 1:03 PM feng wang > wrote: Dear All, I am using the KSPComputeEigenvalues to understand the performance of my preconditioner, and I am using the right-preconditioned GMRES with ASM. In the user guide, it says this routine computes the extreme eigenvalues of the preconditioned operator. If I understand it correctly, these eigenvalues are the ones furthest away from (1,0)? If the preconditioning is perfect, all the eigenvalues should be (1,0). Thanks, Feng -------------- next part -------------- An HTML attachment was scrubbed... URL: From sajidsyed2021 at u.northwestern.edu Thu Oct 6 14:04:04 2022 From: sajidsyed2021 at u.northwestern.edu (Sajid Ali) Date: Thu, 6 Oct 2022 14:04:04 -0500 Subject: [petsc-users] PetscLogView produces nan's instead of timing data when using GPUs In-Reply-To: <84E8ED1B-48DB-4FFA-9EC4-8AEEBCECF068@petsc.dev> References: <84E8ED1B-48DB-4FFA-9EC4-8AEEBCECF068@petsc.dev> Message-ID: Hi Barry, Thanks for the explanation. On Wed, Oct 5, 2022 at 4:11 PM Barry Smith wrote: > > It prints Nan to indicate that the time for that event is not known > accurately. But the times for the larger events that contain these events > are known. So for example the time for KSPSolve is know but not the time > for VecNorm. The other numbers in the events, like number of times called > etc that are not Nan are correct as displayed. > > This is done because correctly tracking the times of the individual > events requires synchronizations that slow down the entire calculation a > bit; for example the time for the KSPSolve will register a longer time then > it registers if the smaller events are not timed. > > To display the times of the smaller events use -log_view_gpu_time also > but note this will increase the times of the larger events a bit. > > Barry > > > On Oct 5, 2022, at 4:47 PM, Sajid Ali > wrote: > > Hi PETSc-developers, > > I'm having trouble with getting performance logs from an application that > uses PETSc. There are no issues when I run it on a CPU, but every time a > GPU is used there is no timing data and almost all times are replaced by > times that are just `nan` (on two different clusters). I am attaching the > log files for both cases with this email. Could someone explain what is > happening here ? > > In case it helps, here are the routines used to initialize/finalize the > application that also handle initializing/finalizing PETSc and printing the > PETSc performance logs to PETSC_VIEWER_STDOUT_WORLD : > https://github.com/fnalacceleratormodeling/synergia2/blob/devel3/src/synergia/utils/utils.h > > Thank You, > Sajid Ali (he/him) | Research Associate > Scientific Computing Division > Fermi National Accelerator Laboratory > s-sajid-ali.github.io > > > > -- Thank You, Sajid Ali (he/him) | Research Associate Scientific Computing Division Fermi National Accelerator Laboratory s-sajid-ali.github.io -------------- next part -------------- An HTML attachment was scrubbed... URL: From sajidsyed2021 at u.northwestern.edu Thu Oct 6 14:31:40 2022 From: sajidsyed2021 at u.northwestern.edu (Sajid Ali) Date: Thu, 6 Oct 2022 14:31:40 -0500 Subject: [petsc-users] Regarding the status of MatSolve on GPUs Message-ID: Hi PETSc-developers, Does PETSc currently provide (either native or third party support) for MatSolve that can be performed entirely on a GPU given a factored matrix? i.e. a direct solver that would store the factors L and U on the device and use the GPU to solve the linear system. 
It does not matter if the GPU is not used for the factorization as we intend to solve the same linear system for 100s of iterations and thus try to prevent GPU->CPU transfers for the MatSolve phase. Currently, I've built PETSc at main (commit 9c433d, 10/03) with superlu-dist at develop, both of which are configured with CUDA. With this, I'm seeing that each call to PCApply/MatSolve involves one GPU->CPU transfer. Is it possible to avoid this? Thank You, Sajid Ali (he/him) | Research Associate Scientific Computing Division Fermi National Accelerator Laboratory s-sajid-ali.github.io -------------- next part -------------- An HTML attachment was scrubbed... URL: From junchao.zhang at gmail.com Thu Oct 6 20:47:23 2022 From: junchao.zhang at gmail.com (Junchao Zhang) Date: Thu, 6 Oct 2022 20:47:23 -0500 Subject: [petsc-users] Regarding the status of MatSolve on GPUs In-Reply-To: References: Message-ID: Hi, Sajid, I will have a look to see what is wrong. Thanks. --Junchao Zhang On Thu, Oct 6, 2022 at 2:32 PM Sajid Ali wrote: > Hi PETSc-developers, > > Does PETSc currently provide (either native or third party support) for > MatSolve that can be performed entirely on a GPU given a factored matrix? > i.e. a direct solver that would store the factors L and U on the device and > use the GPU to solve the linear system. It does not matter if the GPU is > not used for the factorization as we intend to solve the same linear system > for 100s of iterations and thus try to prevent GPU->CPU transfers for the > MatSolve phase. > > Currently, I've built PETSc at main (commit 9c433d, 10/03) with > superlu-dist at develop, both of which are configured with CUDA. With this, > I'm seeing that each call to PCApply/MatSolve involves one GPU->CPU > transfer. Is it possible to avoid this? > > Thank You, > Sajid Ali (he/him) | Research Associate > Scientific Computing Division > Fermi National Accelerator Laboratory > s-sajid-ali.github.io > -------------- next part -------------- An HTML attachment was scrubbed... URL: From snailsoar at hotmail.com Fri Oct 7 11:48:11 2022 From: snailsoar at hotmail.com (feng wang) Date: Fri, 7 Oct 2022 16:48:11 +0000 Subject: [petsc-users] Slepc, shell matrix, parallel, halo exchange In-Reply-To: References: <53363D7B-CCBD-4DAB-924E-1D5D56975828@dsic.upv.es> <76162134-CDE9-42B9-8310-D9DD33D2F12D@dsic.upv.es> Message-ID: Hi Mat, I've tried the suggested approach. The halo cells are not exchanged somehow. Below is how I do it, have I missed anything? I create a ghost vector petsc_dcsv and it is a data member of the class cFdDomain, which is a context of the shell matrix. PetscCall(VecCreateGhostBlock(*A_COMM_WORLD, blocksize, blocksize*nlocal, PETSC_DECIDE ,nghost, ighost, &petsc_dcsv)); blocksize and nv have the same value. nlocal is number of local cells and nghost is number of halo cells. ighost contains the ghost cell index. Below is how I compute a matrix-vector product with a shell matrix PetscErrorCode cFdDomain::mymult_slepc(Mat m ,Vec x, Vec y) { void *ctx; cFdDomain *myctx; PetscErrorCode ierr; MatShellGetContext(m, &ctx); myctx = (cFdDomain*)ctx; //matrix-vector product ierr = myctx->myfunc(x, y); CHKERRQ(ierr); ierr = 0; return ierr; } PetscErrorCode cFdDomain::myfunc(Vec in, Vec out) { //some declaration ierr = VecGetArray(petsc_dcsv,&array_g); CHKERRQ(ierr); ierr = VecGetArrayRead(in, &array); CHKERRQ(ierr); //assign in to petsc_dcsv, only local cells for(iv=0; iv Sent: 21 September 2022 14:36 To: feng wang Cc: Jose E. 
Roman ; petsc-users at mcs.anl.gov Subject: Re: [petsc-users] Slepc, shell matrix, parallel, halo exchange On Wed, Sep 21, 2022 at 10:35 AM feng wang > wrote: Hi Jose, For your 2nd suggestion on halo exchange, I get the idea and roughly know how to do it, but there are some implementation details which I am not quite sure. If I understand it correctly, in MatMult(Mat m ,Vec x, Vec y), Vec x is a normal parallel vector and it does not contain halo values. Suppose I create an auxiliary ghost vector x_g, then I assign the values of x to x_g. The values of the halo for each partition will not be assigned at this stage. But If I call VecGhostUpdateBegin/End(x_g, INSERT_VALUES, SCATTER_FORWARD), this will fill the values of the halo cells of x_g for each partition. Then x_g has local and halo cells assigned correctly and I can use x_g to do my computation. Is this what you mean? Yes Matt Thanks, Feng ________________________________ From: Jose E. Roman > Sent: 21 September 2022 13:07 To: feng wang > Cc: Matthew Knepley >; petsc-users at mcs.anl.gov > Subject: Re: [petsc-users] Slepc, shell matrix, parallel, halo exchange > El 21 sept 2022, a las 14:47, feng wang > escribi?: > > Thanks Jose, I will try this and will come back to this thread if I have any issue. > > Besides, for EPSGetEigenpair, I guess each rank gets its portion of the eigenvector, and I need to put them together afterwards? Eigenvectors are stored in parallel vectors, which are used in subsequent parallel computation in most applications. If for some reason you need to gather them in a single MPI process you can use e.g. VecScatterCreateToZero() > > Thanks, > Feng > > From: Jose E. Roman > > Sent: 21 September 2022 12:34 > To: feng wang > > Cc: Matthew Knepley >; petsc-users at mcs.anl.gov > > Subject: Re: [petsc-users] Slepc, shell matrix, parallel, halo exchange > > If you define the MATOP_CREATE_VECS operation in your shell matrix so that it creates a ghost vector, then all vectors within EPS will be ghost vectors, including those that are received as arguments of MatMult(). Not sure if this will work. > > A simpler solution is that you store a ghost vector in the context of your shell matrix, and then in MatMult() you receive a regular parallel vector x, then update the ghost points using the auxiliary ghost vector, do the computation and store the result in the regular parallel vector y. > > Jose > > > > El 21 sept 2022, a las 14:09, feng wang > escribi?: > > > > Thanks for your reply. > > > > For GMRES, I create a ghost vector and give it to KSPSolve. For Slepc, it only takes the shell matrix for EPSSetOperators. Suppose the shell matrix of the eigensolver defines MatMult(Mat m ,Vec x, Vec y), how does it know Vec x is a ghost vector and how many ghost cells there are? > > > > Thanks, > > Feng > > From: Matthew Knepley > > > Sent: 21 September 2022 11:58 > > To: feng wang > > > Cc: petsc-users at mcs.anl.gov > > > Subject: Re: [petsc-users] Slepc, shell matrix, parallel, halo exchange > > > > On Wed, Sep 21, 2022 at 7:41 AM feng wang > wrote: > > Hello, > > > > I am using Slepc with a shell matrix. The sequential version seems working and now I am trying to make it run in parallel. > > > > The partition of the domain is done, I am not sure how to do the halo exchange in the shell matrix in Slepc. I have a parallel version of matrix-free GMRES in my code with Petsc. 
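(A minimal sketch of the ghost-vector pattern being discussed, with illustrative names only: `xg` stands for a work vector created once with VecCreateGhostBlock() and kept in the shell-matrix context; since only MatMult() is needed, the same routine can back the shell matrix handed to either KSP or EPS.)

```c
#include <petscmat.h>

typedef struct {
  Vec xg;  /* ghosted work vector from VecCreateGhostBlock(), created once */
} ShellCtx;

/* Sketch of a shell-matrix MatMult() that refreshes halo values first. */
static PetscErrorCode MyShellMult(Mat A, Vec x, Vec y)
{
  ShellCtx          *ctx;
  Vec                xl;
  const PetscScalar *xa;

  PetscFunctionBeginUser;
  PetscCall(MatShellGetContext(A, &ctx));
  PetscCall(VecCopy(x, ctx->xg));                 /* fills owned entries only */
  PetscCall(VecGhostUpdateBegin(ctx->xg, INSERT_VALUES, SCATTER_FORWARD));
  PetscCall(VecGhostUpdateEnd(ctx->xg, INSERT_VALUES, SCATTER_FORWARD));
  PetscCall(VecGhostGetLocalForm(ctx->xg, &xl));  /* owned + halo entries     */
  PetscCall(VecGetArrayRead(xl, &xa));
  /* ... compute the owned part of y from xa, which now includes the halo ... */
  PetscCall(VecRestoreArrayRead(xl, &xa));
  PetscCall(VecGhostRestoreLocalForm(ctx->xg, &xl));
  PetscFunctionReturn(0);
}
```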
I was using VecCreateGhostBlock to create vector with ghost cells, and then used VecGhostUpdateBegin/End for the halo exchange in the shell matrix, would this be the same for Slepc? > > > > That will be enough for the MatMult(). You would also have to use a SLEPc EPS that only needed MatMult(). > > > > Thanks, > > > > Matt > > > > Thanks, > > Feng > > > > > > > > > > -- > > What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. > > -- Norbert Wiener > > > > https://www.cse.buffalo.edu/~knepley/ -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From rk3199 at columbia.edu Fri Oct 7 12:40:44 2022 From: rk3199 at columbia.edu (Rob Kudyba) Date: Fri, 7 Oct 2022 13:40:44 -0400 Subject: [petsc-users] suppress CUDA warning & choose MCA parameter for mpirun during make PETSC_ARCH=arch-linux-c-debug check Message-ID: We are on RHEL 8, using modules that we can load/unload various version of packages/libraries, and I have OpenMPI 4.1.1 with CUDA aware loaded along with GDAL 3.3.0, GCC 10.2.0, and cmake 3.22.1 make PETSC_DIR=/path/to/petsc PETSC_ARCH=arch-linux-c-debug check fails with the below errors, Running check examples to verify correct installation Using PETSC_DIR=/path/to/petsc and PETSC_ARCH=arch-linux-c-debug Possible error running C/C++ src/snes/tutorials/ex19 with 1 MPI process See https://petsc.org/release/faq/ -------------------------------------------------------------------------- The library attempted to open the following supporting CUDA libraries, but each of them failed. CUDA-aware support is disabled. libcuda.so.1: cannot open shared object file: No such file or directory libcuda.dylib: cannot open shared object file: No such file or directory /usr/lib64/libcuda.so.1: cannot open shared object file: No such file or directory /usr/lib64/libcuda.dylib: cannot open shared object file: No such file or directory If you are not interested in CUDA-aware support, then run with --mca opal_warn_on_missing_libcuda 0 to suppress this message. If you are interested in CUDA-aware support, then try setting LD_LIBRARY_PATH to the location of libcuda.so.1 to get passed this issue. -------------------------------------------------------------------------- -------------------------------------------------------------------------- WARNING: There was an error initializing an OpenFabrics device. Local host: g117 Local device: mlx5_0 -------------------------------------------------------------------------- lid velocity = 0.0016, prandtl # = 1., grashof # = 1. Number of SNES iterations = 2 Possible error running C/C++ src/snes/tutorials/ex19 with 2 MPI processes See https://petsc.org/release/faq/ The library attempted to open the following supporting CUDA libraries, but each of them failed. CUDA-aware support is disabled. libcuda.so.1: cannot open shared object file: No such file or directory libcuda.dylib: cannot open shared object file: No such file or directory /usr/lib64/libcuda.so.1: cannot open shared object file: No such file or directory /usr/lib64/libcuda.dylib: cannot open shared object file: No such file or directory If you are not interested in CUDA-aware support, then run with --mca opal_warn_on_missing_libcuda 0 to suppress this message. 
If you are interested in CUDA-aware support, then try setting LD_LIBRARY_PATH to the locationof libcuda.so.1 to get passed this issue. WARNING: There was an error initializing an OpenFabrics device. Local host: xxx Local device: mlx5_0 lid velocity = 0.0016, prandtl # = 1., grashof # = 1. Number of SNES iterations = 2 [g117:4162783] 1 more process has sent help message help-mpi-common-cuda.txt / dlopen failed [g117:4162783] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages [g117:4162783] 1 more process has sent help message help-mpi-btl-openib.txt / error in device init Completed test examples Error while running make check gmake[1]: *** [makefile:149: check] Error 1 make: *** [GNUmakefile:17: check] Error 2 Where is $MPI_RUN set? I'd like to be able to pass options such as --mca orte_base_help_aggregate 0 --mca opal_warn_on_missing_libcuda 0 -mca pml ucx --mca btl '^openib' which will help me troubleshoot and hide unneeded warnings. Thanks, Rob -------------- next part -------------- An HTML attachment was scrubbed... URL: From balay at mcs.anl.gov Fri Oct 7 12:52:57 2022 From: balay at mcs.anl.gov (Satish Balay) Date: Fri, 7 Oct 2022 12:52:57 -0500 (CDT) Subject: [petsc-users] suppress CUDA warning & choose MCA parameter for mpirun during make PETSC_ARCH=arch-linux-c-debug check In-Reply-To: References: Message-ID: <39e71ae6-e943-c558-44af-0992089d6151@mcs.anl.gov> you can try make PETSC_DIR=/path/to/petsc PETSC_ARCH=arch-linux-c-debug MPIEXEC="mpiexec -mca orte_base_help_aggregate 0 --mca opal_warn_on_missing_libcuda 0 -mca pml ucx --mca btl '^openib'" Wrt configure - it can be set with --with-mpiexec option - its saved in PETSC_ARCH/lib/petsc/conf/petscvariables Satish On Fri, 7 Oct 2022, Rob Kudyba wrote: > We are on RHEL 8, using modules that we can load/unload various version of > packages/libraries, and I have OpenMPI 4.1.1 with CUDA aware loaded along > with GDAL 3.3.0, GCC 10.2.0, and cmake 3.22.1 > > make PETSC_DIR=/path/to/petsc PETSC_ARCH=arch-linux-c-debug check > fails with the below errors, > Running check examples to verify correct installation > > Using PETSC_DIR=/path/to/petsc and PETSC_ARCH=arch-linux-c-debug > Possible error running C/C++ src/snes/tutorials/ex19 with 1 MPI process > See https://petsc.org/release/faq/ > -------------------------------------------------------------------------- > The library attempted to open the following supporting CUDA libraries, > but each of them failed. CUDA-aware support is disabled. > libcuda.so.1: cannot open shared object file: No such file or directory > libcuda.dylib: cannot open shared object file: No such file or directory > /usr/lib64/libcuda.so.1: cannot open shared object file: No such file or > directory > /usr/lib64/libcuda.dylib: cannot open shared object file: No such file or > directory > If you are not interested in CUDA-aware support, then run with > --mca opal_warn_on_missing_libcuda 0 to suppress this message. If you are > interested > in CUDA-aware support, then try setting LD_LIBRARY_PATH to the location > of libcuda.so.1 to get passed this issue. > -------------------------------------------------------------------------- > -------------------------------------------------------------------------- > WARNING: There was an error initializing an OpenFabrics device. > > Local host: g117 > Local device: mlx5_0 > -------------------------------------------------------------------------- > lid velocity = 0.0016, prandtl # = 1., grashof # = 1. 
> Number of SNES iterations = 2 > Possible error running C/C++ src/snes/tutorials/ex19 with 2 MPI processes > See https://petsc.org/release/faq/ > > The library attempted to open the following supporting CUDA libraries, > but each of them failed. CUDA-aware support is disabled. > libcuda.so.1: cannot open shared object file: No such file or directory > libcuda.dylib: cannot open shared object file: No such file or directory > /usr/lib64/libcuda.so.1: cannot open shared object file: No such file or > directory > /usr/lib64/libcuda.dylib: cannot open shared object file: No such file or > directory > If you are not interested in CUDA-aware support, then run with > --mca opal_warn_on_missing_libcuda 0 to suppress this message. If you are > interested in CUDA-aware support, then try setting LD_LIBRARY_PATH to the > locationof libcuda.so.1 to get passed this issue. > > WARNING: There was an error initializing an OpenFabrics device. > > Local host: xxx > Local device: mlx5_0 > > lid velocity = 0.0016, prandtl # = 1., grashof # = 1. > Number of SNES iterations = 2 > [g117:4162783] 1 more process has sent help message > help-mpi-common-cuda.txt / dlopen failed > [g117:4162783] Set MCA parameter "orte_base_help_aggregate" to 0 to see all > help / error messages > [g117:4162783] 1 more process has sent help message help-mpi-btl-openib.txt > / error in device init > Completed test examples > Error while running make check > gmake[1]: *** [makefile:149: check] Error 1 > make: *** [GNUmakefile:17: check] Error 2 > > Where is $MPI_RUN set? I'd like to be able to pass options such as --mca > orte_base_help_aggregate 0 --mca opal_warn_on_missing_libcuda 0 -mca pml > ucx --mca btl '^openib' which will help me troubleshoot and hide unneeded > warnings. > > Thanks, > Rob > From rk3199 at columbia.edu Fri Oct 7 13:08:24 2022 From: rk3199 at columbia.edu (Rob Kudyba) Date: Fri, 7 Oct 2022 14:08:24 -0400 Subject: [petsc-users] suppress CUDA warning & choose MCA parameter for mpirun during make PETSC_ARCH=arch-linux-c-debug check In-Reply-To: <39e71ae6-e943-c558-44af-0992089d6151@mcs.anl.gov> References: <39e71ae6-e943-c558-44af-0992089d6151@mcs.anl.gov> Message-ID: Thanks for the quick reply. I added these options to make and make check still produce the warnings so I used the command like this: make PETSC_DIR=/path/to/petsc PETSC_ARCH=arch-linux-c-debug MPIEXEC="mpiexec -mca orte_base_help_aggregate 0 --mca opal_warn_on_missing_libcuda 0 -mca pml ucx --mca btl '^openib'" check Running check examples to verify correct installation Using PETSC_DIR=/path/to/petsc and PETSC_ARCH=arch-linux-c-debug C/C++ example src/snes/tutorials/ex19 run successfully with 1 MPI process C/C++ example src/snes/tutorials/ex19 run successfully with 2 MPI processes Completed test examples Could be useful for the FAQ. I'm not trying to use PetSC to compile and linking appears to go awry: [ 58%] Building CXX object CMakeFiles/wtm.dir/src/update_effective_storativity.cpp.o [ 62%] Linking CXX static library libwtm.a [ 62%] Built target wtm [ 66%] Building CXX object CMakeFiles/wtm.x.dir/src/WTM.cpp.o [ 70%] Linking CXX executable wtm.x /usr/bin/ld: cannot find -lpetsc collect2: error: ld returned 1 exit status make[2]: *** [CMakeFiles/wtm.x.dir/build.make:103: wtm.x] Error 1 make[1]: *** [CMakeFiles/Makefile2:269: CMakeFiles/wtm.x.dir/all] Error 2 make: *** [Makefile:136: all] Error 2 Is there an environment variable I'm missing? 
I've seen the suggestion to add it to LD_LIBRARY_PATH which I did with export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:$PETSC_DIR/$PETSC_ARCH/lib and that points to: ls -l /path/to/petsc/arch-linux-c-debug/lib total 83732 lrwxrwxrwx 1 rk3199 user 18 Oct 7 13:56 libpetsc.so -> libpetsc.so.3.18.0 lrwxrwxrwx 1 rk3199 user 18 Oct 7 13:56 libpetsc.so.3.18 -> libpetsc.so.3.18.0 -rwxr-xr-x 1 rk3199 user 85719200 Oct 7 13:56 libpetsc.so.3.18.0 drwxr-xr-x 3 rk3199 user 4096 Oct 6 10:22 petsc drwxr-xr-x 2 rk3199 user 4096 Oct 6 10:23 pkgconfig Anything else to check? On Fri, Oct 7, 2022 at 1:53 PM Satish Balay wrote: > you can try > > make PETSC_DIR=/path/to/petsc PETSC_ARCH=arch-linux-c-debug > MPIEXEC="mpiexec -mca orte_base_help_aggregate 0 --mca > opal_warn_on_missing_libcuda 0 -mca pml ucx --mca btl '^openib'" > > Wrt configure - it can be set with --with-mpiexec option - its saved in > PETSC_ARCH/lib/petsc/conf/petscvariables > > Satish > > On Fri, 7 Oct 2022, Rob Kudyba wrote: > > > We are on RHEL 8, using modules that we can load/unload various version > of > > packages/libraries, and I have OpenMPI 4.1.1 with CUDA aware loaded along > > with GDAL 3.3.0, GCC 10.2.0, and cmake 3.22.1 > > > > make PETSC_DIR=/path/to/petsc PETSC_ARCH=arch-linux-c-debug check > > fails with the below errors, > > Running check examples to verify correct installation > > > > Using PETSC_DIR=/path/to/petsc and PETSC_ARCH=arch-linux-c-debug > > Possible error running C/C++ src/snes/tutorials/ex19 with 1 MPI process > > See https://petsc.org/release/faq/ > > > -------------------------------------------------------------------------- > > The library attempted to open the following supporting CUDA libraries, > > but each of them failed. CUDA-aware support is disabled. > > libcuda.so.1: cannot open shared object file: No such file or directory > > libcuda.dylib: cannot open shared object file: No such file or directory > > /usr/lib64/libcuda.so.1: cannot open shared object file: No such file or > > directory > > /usr/lib64/libcuda.dylib: cannot open shared object file: No such file or > > directory > > If you are not interested in CUDA-aware support, then run with > > --mca opal_warn_on_missing_libcuda 0 to suppress this message. If you > are > > interested > > in CUDA-aware support, then try setting LD_LIBRARY_PATH to the location > > of libcuda.so.1 to get passed this issue. > > > -------------------------------------------------------------------------- > > > -------------------------------------------------------------------------- > > WARNING: There was an error initializing an OpenFabrics device. > > > > Local host: g117 > > Local device: mlx5_0 > > > -------------------------------------------------------------------------- > > lid velocity = 0.0016, prandtl # = 1., grashof # = 1. > > Number of SNES iterations = 2 > > Possible error running C/C++ src/snes/tutorials/ex19 with 2 MPI processes > > See https://petsc.org/release/faq/ > > > > The library attempted to open the following supporting CUDA libraries, > > but each of them failed. CUDA-aware support is disabled. 
> > libcuda.so.1: cannot open shared object file: No such file or directory > > libcuda.dylib: cannot open shared object file: No such file or directory > > /usr/lib64/libcuda.so.1: cannot open shared object file: No such file or > > directory > > /usr/lib64/libcuda.dylib: cannot open shared object file: No such file or > > directory > > If you are not interested in CUDA-aware support, then run with > > --mca opal_warn_on_missing_libcuda 0 to suppress this message. If you > are > > interested in CUDA-aware support, then try setting LD_LIBRARY_PATH to the > > locationof libcuda.so.1 to get passed this issue. > > > > WARNING: There was an error initializing an OpenFabrics device. > > > > Local host: xxx > > Local device: mlx5_0 > > > > lid velocity = 0.0016, prandtl # = 1., grashof # = 1. > > Number of SNES iterations = 2 > > [g117:4162783] 1 more process has sent help message > > help-mpi-common-cuda.txt / dlopen failed > > [g117:4162783] Set MCA parameter "orte_base_help_aggregate" to 0 to see > all > > help / error messages > > [g117:4162783] 1 more process has sent help message > help-mpi-btl-openib.txt > > / error in device init > > Completed test examples > > Error while running make check > > gmake[1]: *** [makefile:149: check] Error 1 > > make: *** [GNUmakefile:17: check] Error 2 > > > > Where is $MPI_RUN set? I'd like to be able to pass options such as --mca > > orte_base_help_aggregate 0 --mca opal_warn_on_missing_libcuda 0 -mca pml > > ucx --mca btl '^openib' which will help me troubleshoot and hide unneeded > > warnings. > > > > Thanks, > > Rob > > > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From junchao.zhang at gmail.com Fri Oct 7 21:06:06 2022 From: junchao.zhang at gmail.com (Junchao Zhang) Date: Fri, 7 Oct 2022 21:06:06 -0500 Subject: [petsc-users] suppress CUDA warning & choose MCA parameter for mpirun during make PETSC_ARCH=arch-linux-c-debug check In-Reply-To: References: <39e71ae6-e943-c558-44af-0992089d6151@mcs.anl.gov> Message-ID: On Fri, Oct 7, 2022 at 1:08 PM Rob Kudyba wrote: > Thanks for the quick reply. I added these options to make and make check > still produce the warnings so I used the command like this: > make PETSC_DIR=/path/to/petsc PETSC_ARCH=arch-linux-c-debug > MPIEXEC="mpiexec -mca orte_base_help_aggregate 0 --mca > opal_warn_on_missing_libcuda 0 -mca pml ucx --mca btl '^openib'" check > Running check examples to verify correct installation > Using PETSC_DIR=/path/to/petsc and PETSC_ARCH=arch-linux-c-debug > C/C++ example src/snes/tutorials/ex19 run successfully with 1 MPI process > C/C++ example src/snes/tutorials/ex19 run successfully with 2 MPI processes > Completed test examples > > Could be useful for the FAQ. > You mentioned you had "OpenMPI 4.1.1 with CUDA aware", so I think a workable mpicc should automatically find cuda libraries. Maybe you unloaded cuda libraries? > I'm not trying to use PetSC to compile and linking appears to go awry: > [ 58%] Building CXX object > CMakeFiles/wtm.dir/src/update_effective_storativity.cpp.o > [ 62%] Linking CXX static library libwtm.a > [ 62%] Built target wtm > [ 66%] Building CXX object CMakeFiles/wtm.x.dir/src/WTM.cpp.o > [ 70%] Linking CXX executable wtm.x > /usr/bin/ld: cannot find -lpetsc > collect2: error: ld returned 1 exit status > make[2]: *** [CMakeFiles/wtm.x.dir/build.make:103: wtm.x] Error 1 > make[1]: *** [CMakeFiles/Makefile2:269: CMakeFiles/wtm.x.dir/all] Error 2 > make: *** [Makefile:136: all] Error 2 > It seems cmake could not find petsc. 
Look at $PETSC_DIR/share/petsc/CMakeLists.txt and try to modify your CMakeLists.txt. > > > Is there an environment variable I'm missing? I've seen the suggestion > > to add it to LD_LIBRARY_PATH which I did with export > LD_LIBRARY_PATH=$LD_LIBRARY_PATH:$PETSC_DIR/$PETSC_ARCH/lib and that > points to: > ls -l /path/to/petsc/arch-linux-c-debug/lib > total 83732 > lrwxrwxrwx 1 rk3199 user 18 Oct 7 13:56 libpetsc.so -> > libpetsc.so.3.18.0 > lrwxrwxrwx 1 rk3199 user 18 Oct 7 13:56 libpetsc.so.3.18 -> > libpetsc.so.3.18.0 > -rwxr-xr-x 1 rk3199 user 85719200 Oct 7 13:56 libpetsc.so.3.18.0 > drwxr-xr-x 3 rk3199 user 4096 Oct 6 10:22 petsc > drwxr-xr-x 2 rk3199 user 4096 Oct 6 10:23 pkgconfig > > Anything else to check? > If modifying CMakeLists.txt does not work, you can try export LIBRARY_PATH=$LIBRARY_PATH:$PETSC_DIR/$PETSC_ARCH/lib LD_LIBRARY_PATHis is for run time, but the error happened at link time, > > On Fri, Oct 7, 2022 at 1:53 PM Satish Balay wrote: > >> you can try >> >> make PETSC_DIR=/path/to/petsc PETSC_ARCH=arch-linux-c-debug >> MPIEXEC="mpiexec -mca orte_base_help_aggregate 0 --mca >> opal_warn_on_missing_libcuda 0 -mca pml ucx --mca btl '^openib'" >> >> Wrt configure - it can be set with --with-mpiexec option - its saved in >> PETSC_ARCH/lib/petsc/conf/petscvariables >> >> Satish >> >> On Fri, 7 Oct 2022, Rob Kudyba wrote: >> >> > We are on RHEL 8, using modules that we can load/unload various version >> of >> > packages/libraries, and I have OpenMPI 4.1.1 with CUDA aware loaded >> along >> > with GDAL 3.3.0, GCC 10.2.0, and cmake 3.22.1 >> > >> > make PETSC_DIR=/path/to/petsc PETSC_ARCH=arch-linux-c-debug check >> > fails with the below errors, >> > Running check examples to verify correct installation >> > >> > Using PETSC_DIR=/path/to/petsc and PETSC_ARCH=arch-linux-c-debug >> > Possible error running C/C++ src/snes/tutorials/ex19 with 1 MPI process >> > See https://petsc.org/release/faq/ >> > >> -------------------------------------------------------------------------- >> > The library attempted to open the following supporting CUDA libraries, >> > but each of them failed. CUDA-aware support is disabled. >> > libcuda.so.1: cannot open shared object file: No such file or directory >> > libcuda.dylib: cannot open shared object file: No such file or directory >> > /usr/lib64/libcuda.so.1: cannot open shared object file: No such file or >> > directory >> > /usr/lib64/libcuda.dylib: cannot open shared object file: No such file >> or >> > directory >> > If you are not interested in CUDA-aware support, then run with >> > --mca opal_warn_on_missing_libcuda 0 to suppress this message. If you >> are >> > interested >> > in CUDA-aware support, then try setting LD_LIBRARY_PATH to the location >> > of libcuda.so.1 to get passed this issue. >> > >> -------------------------------------------------------------------------- >> > >> -------------------------------------------------------------------------- >> > WARNING: There was an error initializing an OpenFabrics device. >> > >> > Local host: g117 >> > Local device: mlx5_0 >> > >> -------------------------------------------------------------------------- >> > lid velocity = 0.0016, prandtl # = 1., grashof # = 1. >> > Number of SNES iterations = 2 >> > Possible error running C/C++ src/snes/tutorials/ex19 with 2 MPI >> processes >> > See https://petsc.org/release/faq/ >> > >> > The library attempted to open the following supporting CUDA libraries, >> > but each of them failed. CUDA-aware support is disabled. 
>> > libcuda.so.1: cannot open shared object file: No such file or directory >> > libcuda.dylib: cannot open shared object file: No such file or directory >> > /usr/lib64/libcuda.so.1: cannot open shared object file: No such file or >> > directory >> > /usr/lib64/libcuda.dylib: cannot open shared object file: No such file >> or >> > directory >> > If you are not interested in CUDA-aware support, then run with >> > --mca opal_warn_on_missing_libcuda 0 to suppress this message. If you >> are >> > interested in CUDA-aware support, then try setting LD_LIBRARY_PATH to >> the >> > locationof libcuda.so.1 to get passed this issue. >> > >> > WARNING: There was an error initializing an OpenFabrics device. >> > >> > Local host: xxx >> > Local device: mlx5_0 >> > >> > lid velocity = 0.0016, prandtl # = 1., grashof # = 1. >> > Number of SNES iterations = 2 >> > [g117:4162783] 1 more process has sent help message >> > help-mpi-common-cuda.txt / dlopen failed >> > [g117:4162783] Set MCA parameter "orte_base_help_aggregate" to 0 to see >> all >> > help / error messages >> > [g117:4162783] 1 more process has sent help message >> help-mpi-btl-openib.txt >> > / error in device init >> > Completed test examples >> > Error while running make check >> > gmake[1]: *** [makefile:149: check] Error 1 >> > make: *** [GNUmakefile:17: check] Error 2 >> > >> > Where is $MPI_RUN set? I'd like to be able to pass options such as --mca >> > orte_base_help_aggregate 0 --mca opal_warn_on_missing_libcuda 0 -mca pml >> > ucx --mca btl '^openib' which will help me troubleshoot and hide >> unneeded >> > warnings. >> > >> > Thanks, >> > Rob >> > >> >> -------------- next part -------------- An HTML attachment was scrubbed... URL: From rk3199 at columbia.edu Fri Oct 7 22:18:25 2022 From: rk3199 at columbia.edu (Rob Kudyba) Date: Fri, 7 Oct 2022 23:18:25 -0400 Subject: [petsc-users] suppress CUDA warning & choose MCA parameter for mpirun during make PETSC_ARCH=arch-linux-c-debug check In-Reply-To: References: <39e71ae6-e943-c558-44af-0992089d6151@mcs.anl.gov> Message-ID: > Thanks for the quick reply. I added these options to make and make check >> still produce the warnings so I used the command like this: >> make PETSC_DIR=/path/to/petsc PETSC_ARCH=arch-linux-c-debug >> MPIEXEC="mpiexec -mca orte_base_help_aggregate 0 --mca >> opal_warn_on_missing_libcuda 0 -mca pml ucx --mca btl '^openib'" check >> Running check examples to verify correct installation >> Using PETSC_DIR=/path/to/petsc and PETSC_ARCH=arch-linux-c-debug >> C/C++ example src/snes/tutorials/ex19 run successfully with 1 MPI process >> C/C++ example src/snes/tutorials/ex19 run successfully with 2 MPI >> processes >> Completed test examples >> >> Could be useful for the FAQ. >> > You mentioned you had "OpenMPI 4.1.1 with CUDA aware", so I think a > workable mpicc should automatically find cuda libraries. Maybe you > unloaded cuda libraries? > Oh let me clarify, OpenMPI is CUDA aware however this code and the node where PET_Sc is compiling does not have a GPU, hence not needed and using the MPIEXEC option worked during the 'check' to suppress the warning. 
I'm not trying to use PetSC to compile and linking appears to go awry: >> [ 58%] Building CXX object >> CMakeFiles/wtm.dir/src/update_effective_storativity.cpp.o >> [ 62%] Linking CXX static library libwtm.a >> [ 62%] Built target wtm >> [ 66%] Building CXX object CMakeFiles/wtm.x.dir/src/WTM.cpp.o >> [ 70%] Linking CXX executable wtm.x >> /usr/bin/ld: cannot find -lpetsc >> collect2: error: ld returned 1 exit status >> make[2]: *** [CMakeFiles/wtm.x.dir/build.make:103: wtm.x] Error 1 >> make[1]: *** [CMakeFiles/Makefile2:269: CMakeFiles/wtm.x.dir/all] Error 2 >> make: *** [Makefile:136: all] Error 2 >> > It seems cmake could not find petsc. Look > at $PETSC_DIR/share/petsc/CMakeLists.txt and try to modify your > CMakeLists.txt. > There is an explicit reference to the path in CMakeLists.txt: # NOTE: You may need to update this path to identify PETSc's location set(ENV{PKG_CONFIG_PATH} "$ENV{PKG_CONFIG_PATH}:/path/to/petsc/arch-linux-cxx-debug/lib/pkgconfig/") pkg_check_modules(PETSC PETSc>=3.17.1 IMPORTED_TARGET REQUIRED) message(STATUS "Found PETSc ${PETSC_VERSION}") add_subdirectory(common/richdem EXCLUDE_FROM_ALL) add_subdirectory(common/fmt EXCLUDE_FROM_ALL) And that exists: ls /path/to/petsc/arch-linux-cxx-debug/lib/pkgconfig/ petsc.pc PETSc.pc Is there an environment variable I'm missing? I've seen the suggestion > > to add it to LD_LIBRARY_PATH which I did with export > LD_LIBRARY_PATH=$LD_LIBRARY_PATH:$PETSC_DIR/$PETSC_ARCH/lib and that > points to: > >> ls -l /path/to/petsc/arch-linux-c-debug/lib >> total 83732 >> lrwxrwxrwx 1 rk3199 user 18 Oct 7 13:56 libpetsc.so -> >> libpetsc.so.3.18.0 >> lrwxrwxrwx 1 rk3199 user 18 Oct 7 13:56 libpetsc.so.3.18 -> >> libpetsc.so.3.18.0 >> -rwxr-xr-x 1 rk3199 user 85719200 Oct 7 13:56 libpetsc.so.3.18.0 >> drwxr-xr-x 3 rk3199 user 4096 Oct 6 10:22 petsc >> drwxr-xr-x 2 rk3199 user 4096 Oct 6 10:23 pkgconfig >> >> Anything else to check? >> > If modifying CMakeLists.txt does not work, you can try export > LIBRARY_PATH=$LIBRARY_PATH:$PETSC_DIR/$PETSC_ARCH/lib > LD_LIBRARY_PATHis is for run time, but the error happened at link time, > Yes that's what I already had. Any other debug that I can provide? > On Fri, Oct 7, 2022 at 1:53 PM Satish Balay wrote: >> >>> you can try >>> >>> make PETSC_DIR=/path/to/petsc PETSC_ARCH=arch-linux-c-debug >>> MPIEXEC="mpiexec -mca orte_base_help_aggregate 0 --mca >>> opal_warn_on_missing_libcuda 0 -mca pml ucx --mca btl '^openib'" >>> >>> Wrt configure - it can be set with --with-mpiexec option - its saved in >>> PETSC_ARCH/lib/petsc/conf/petscvariables >>> >>> Satish >>> >>> On Fri, 7 Oct 2022, Rob Kudyba wrote: >>> >>> > We are on RHEL 8, using modules that we can load/unload various >>> version of >>> > packages/libraries, and I have OpenMPI 4.1.1 with CUDA aware loaded >>> along >>> > with GDAL 3.3.0, GCC 10.2.0, and cmake 3.22.1 >>> > >>> > make PETSC_DIR=/path/to/petsc PETSC_ARCH=arch-linux-c-debug check >>> > fails with the below errors, >>> > Running check examples to verify correct installation >>> > >>> > Using PETSC_DIR=/path/to/petsc and PETSC_ARCH=arch-linux-c-debug >>> > Possible error running C/C++ src/snes/tutorials/ex19 with 1 MPI process >>> > See https://petsc.org/release/faq/ >>> > >>> -------------------------------------------------------------------------- >>> > The library attempted to open the following supporting CUDA libraries, >>> > but each of them failed. CUDA-aware support is disabled. 
>>> > libcuda.so.1: cannot open shared object file: No such file or directory >>> > libcuda.dylib: cannot open shared object file: No such file or >>> directory >>> > /usr/lib64/libcuda.so.1: cannot open shared object file: No such file >>> or >>> > directory >>> > /usr/lib64/libcuda.dylib: cannot open shared object file: No such file >>> or >>> > directory >>> > If you are not interested in CUDA-aware support, then run with >>> > --mca opal_warn_on_missing_libcuda 0 to suppress this message. If you >>> are >>> > interested >>> > in CUDA-aware support, then try setting LD_LIBRARY_PATH to the location >>> > of libcuda.so.1 to get passed this issue. >>> > >>> -------------------------------------------------------------------------- >>> > >>> -------------------------------------------------------------------------- >>> > WARNING: There was an error initializing an OpenFabrics device. >>> > >>> > Local host: g117 >>> > Local device: mlx5_0 >>> > >>> -------------------------------------------------------------------------- >>> > lid velocity = 0.0016, prandtl # = 1., grashof # = 1. >>> > Number of SNES iterations = 2 >>> > Possible error running C/C++ src/snes/tutorials/ex19 with 2 MPI >>> processes >>> > See https://petsc.org/release/faq/ >>> > >>> > The library attempted to open the following supporting CUDA libraries, >>> > but each of them failed. CUDA-aware support is disabled. >>> > libcuda.so.1: cannot open shared object file: No such file or directory >>> > libcuda.dylib: cannot open shared object file: No such file or >>> directory >>> > /usr/lib64/libcuda.so.1: cannot open shared object file: No such file >>> or >>> > directory >>> > /usr/lib64/libcuda.dylib: cannot open shared object file: No such file >>> or >>> > directory >>> > If you are not interested in CUDA-aware support, then run with >>> > --mca opal_warn_on_missing_libcuda 0 to suppress this message. If you >>> are >>> > interested in CUDA-aware support, then try setting LD_LIBRARY_PATH to >>> the >>> > locationof libcuda.so.1 to get passed this issue. >>> > >>> > WARNING: There was an error initializing an OpenFabrics device. >>> > >>> > Local host: xxx >>> > Local device: mlx5_0 >>> > >>> > lid velocity = 0.0016, prandtl # = 1., grashof # = 1. >>> > Number of SNES iterations = 2 >>> > [g117:4162783] 1 more process has sent help message >>> > help-mpi-common-cuda.txt / dlopen failed >>> > [g117:4162783] Set MCA parameter "orte_base_help_aggregate" to 0 to >>> see all >>> > help / error messages >>> > [g117:4162783] 1 more process has sent help message >>> help-mpi-btl-openib.txt >>> > / error in device init >>> > Completed test examples >>> > Error while running make check >>> > gmake[1]: *** [makefile:149: check] Error 1 >>> > make: *** [GNUmakefile:17: check] Error 2 >>> > >>> > Where is $MPI_RUN set? I'd like to be able to pass options such as >>> --mca >>> > orte_base_help_aggregate 0 --mca opal_warn_on_missing_libcuda 0 -mca >>> pml >>> > ucx --mca btl '^openib' which will help me troubleshoot and hide >>> unneeded >>> > warnings. >>> > >>> > Thanks, >>> > Rob >>> > >>> >>> -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From rk3199 at columbia.edu Fri Oct 7 22:45:10 2022 From: rk3199 at columbia.edu (Rob Kudyba) Date: Fri, 7 Oct 2022 23:45:10 -0400 Subject: [petsc-users] suppress CUDA warning & choose MCA parameter for mpirun during make PETSC_ARCH=arch-linux-c-debug check In-Reply-To: References: <39e71ae6-e943-c558-44af-0992089d6151@mcs.anl.gov> Message-ID: The error changes now and at an earlier place, 66% vs 70%: make LDFLAGS="-Wl,--copy-dt-needed-entries" Consolidate compiler generated dependencies of target fmt [ 12%] Built target fmt Consolidate compiler generated dependencies of target richdem [ 37%] Built target richdem Consolidate compiler generated dependencies of target wtm [ 62%] Built target wtm Consolidate compiler generated dependencies of target wtm.x [ 66%] Linking CXX executable wtm.x /usr/bin/ld: libwtm.a(transient_groundwater.cpp.o): undefined reference to symbol 'MPI_Abort' /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libmpi.so.40: error adding symbols: DSO missing from command line collect2: error: ld returned 1 exit status make[2]: *** [CMakeFiles/wtm.x.dir/build.make:103: wtm.x] Error 1 make[1]: *** [CMakeFiles/Makefile2:225: CMakeFiles/wtm.x.dir/all] Error 2 make: *** [Makefile:136: all] Error 2 So perhaps PET_Sc is now being found. Any other suggestions? On Fri, Oct 7, 2022 at 11:18 PM Rob Kudyba wrote: > > Thanks for the quick reply. I added these options to make and make check >>> still produce the warnings so I used the command like this: >>> make PETSC_DIR=/path/to/petsc PETSC_ARCH=arch-linux-c-debug >>> MPIEXEC="mpiexec -mca orte_base_help_aggregate 0 --mca >>> opal_warn_on_missing_libcuda 0 -mca pml ucx --mca btl '^openib'" check >>> Running check examples to verify correct installation >>> Using PETSC_DIR=/path/to/petsc and PETSC_ARCH=arch-linux-c-debug >>> C/C++ example src/snes/tutorials/ex19 run successfully with 1 MPI process >>> C/C++ example src/snes/tutorials/ex19 run successfully with 2 MPI >>> processes >>> Completed test examples >>> >>> Could be useful for the FAQ. >>> >> You mentioned you had "OpenMPI 4.1.1 with CUDA aware", so I think a >> workable mpicc should automatically find cuda libraries. Maybe you >> unloaded cuda libraries? >> > Oh let me clarify, OpenMPI is CUDA aware however this code and the node > where PET_Sc is compiling does not have a GPU, hence not needed and using > the MPIEXEC option worked during the 'check' to suppress the warning. > > I'm not trying to use PetSC to compile and linking appears to go awry: >>> [ 58%] Building CXX object >>> CMakeFiles/wtm.dir/src/update_effective_storativity.cpp.o >>> [ 62%] Linking CXX static library libwtm.a >>> [ 62%] Built target wtm >>> [ 66%] Building CXX object CMakeFiles/wtm.x.dir/src/WTM.cpp.o >>> [ 70%] Linking CXX executable wtm.x >>> /usr/bin/ld: cannot find -lpetsc >>> collect2: error: ld returned 1 exit status >>> make[2]: *** [CMakeFiles/wtm.x.dir/build.make:103: wtm.x] Error 1 >>> make[1]: *** [CMakeFiles/Makefile2:269: CMakeFiles/wtm.x.dir/all] Error 2 >>> make: *** [Makefile:136: all] Error 2 >>> >> It seems cmake could not find petsc. Look >> at $PETSC_DIR/share/petsc/CMakeLists.txt and try to modify your >> CMakeLists.txt. 
>> > > There is an explicit reference to the path in CMakeLists.txt: > # NOTE: You may need to update this path to identify PETSc's location > set(ENV{PKG_CONFIG_PATH} > "$ENV{PKG_CONFIG_PATH}:/path/to/petsc/arch-linux-cxx-debug/lib/pkgconfig/") > pkg_check_modules(PETSC PETSc>=3.17.1 IMPORTED_TARGET REQUIRED) > message(STATUS "Found PETSc ${PETSC_VERSION}") > add_subdirectory(common/richdem EXCLUDE_FROM_ALL) > add_subdirectory(common/fmt EXCLUDE_FROM_ALL) > > And that exists: > ls /path/to/petsc/arch-linux-cxx-debug/lib/pkgconfig/ > petsc.pc PETSc.pc > > Is there an environment variable I'm missing? I've seen the suggestion >> >> to add it to LD_LIBRARY_PATH which I did with export >> LD_LIBRARY_PATH=$LD_LIBRARY_PATH:$PETSC_DIR/$PETSC_ARCH/lib and that >> points to: >> >>> ls -l /path/to/petsc/arch-linux-c-debug/lib >>> total 83732 >>> lrwxrwxrwx 1 rk3199 user 18 Oct 7 13:56 libpetsc.so -> >>> libpetsc.so.3.18.0 >>> lrwxrwxrwx 1 rk3199 user 18 Oct 7 13:56 libpetsc.so.3.18 -> >>> libpetsc.so.3.18.0 >>> -rwxr-xr-x 1 rk3199 user 85719200 Oct 7 13:56 libpetsc.so.3.18.0 >>> drwxr-xr-x 3 rk3199 user 4096 Oct 6 10:22 petsc >>> drwxr-xr-x 2 rk3199 user 4096 Oct 6 10:23 pkgconfig >>> >>> Anything else to check? >>> >> If modifying CMakeLists.txt does not work, you can try export >> LIBRARY_PATH=$LIBRARY_PATH:$PETSC_DIR/$PETSC_ARCH/lib >> LD_LIBRARY_PATHis is for run time, but the error happened at link time, >> > > Yes that's what I already had. Any other debug that I can provide? > > > >> On Fri, Oct 7, 2022 at 1:53 PM Satish Balay wrote: >>> >>>> you can try >>>> >>>> make PETSC_DIR=/path/to/petsc PETSC_ARCH=arch-linux-c-debug >>>> MPIEXEC="mpiexec -mca orte_base_help_aggregate 0 --mca >>>> opal_warn_on_missing_libcuda 0 -mca pml ucx --mca btl '^openib'" >>>> >>>> Wrt configure - it can be set with --with-mpiexec option - its saved in >>>> PETSC_ARCH/lib/petsc/conf/petscvariables >>>> >>>> Satish >>>> >>>> On Fri, 7 Oct 2022, Rob Kudyba wrote: >>>> >>>> > We are on RHEL 8, using modules that we can load/unload various >>>> version of >>>> > packages/libraries, and I have OpenMPI 4.1.1 with CUDA aware loaded >>>> along >>>> > with GDAL 3.3.0, GCC 10.2.0, and cmake 3.22.1 >>>> > >>>> > make PETSC_DIR=/path/to/petsc PETSC_ARCH=arch-linux-c-debug check >>>> > fails with the below errors, >>>> > Running check examples to verify correct installation >>>> > >>>> > Using PETSC_DIR=/path/to/petsc and PETSC_ARCH=arch-linux-c-debug >>>> > Possible error running C/C++ src/snes/tutorials/ex19 with 1 MPI >>>> process >>>> > See https://petsc.org/release/faq/ >>>> > >>>> -------------------------------------------------------------------------- >>>> > The library attempted to open the following supporting CUDA libraries, >>>> > but each of them failed. CUDA-aware support is disabled. >>>> > libcuda.so.1: cannot open shared object file: No such file or >>>> directory >>>> > libcuda.dylib: cannot open shared object file: No such file or >>>> directory >>>> > /usr/lib64/libcuda.so.1: cannot open shared object file: No such file >>>> or >>>> > directory >>>> > /usr/lib64/libcuda.dylib: cannot open shared object file: No such >>>> file or >>>> > directory >>>> > If you are not interested in CUDA-aware support, then run with >>>> > --mca opal_warn_on_missing_libcuda 0 to suppress this message. If >>>> you are >>>> > interested >>>> > in CUDA-aware support, then try setting LD_LIBRARY_PATH to the >>>> location >>>> > of libcuda.so.1 to get passed this issue. 
>>>> > >>>> -------------------------------------------------------------------------- >>>> > >>>> -------------------------------------------------------------------------- >>>> > WARNING: There was an error initializing an OpenFabrics device. >>>> > >>>> > Local host: g117 >>>> > Local device: mlx5_0 >>>> > >>>> -------------------------------------------------------------------------- >>>> > lid velocity = 0.0016, prandtl # = 1., grashof # = 1. >>>> > Number of SNES iterations = 2 >>>> > Possible error running C/C++ src/snes/tutorials/ex19 with 2 MPI >>>> processes >>>> > See https://petsc.org/release/faq/ >>>> > >>>> > The library attempted to open the following supporting CUDA libraries, >>>> > but each of them failed. CUDA-aware support is disabled. >>>> > libcuda.so.1: cannot open shared object file: No such file or >>>> directory >>>> > libcuda.dylib: cannot open shared object file: No such file or >>>> directory >>>> > /usr/lib64/libcuda.so.1: cannot open shared object file: No such file >>>> or >>>> > directory >>>> > /usr/lib64/libcuda.dylib: cannot open shared object file: No such >>>> file or >>>> > directory >>>> > If you are not interested in CUDA-aware support, then run with >>>> > --mca opal_warn_on_missing_libcuda 0 to suppress this message. If >>>> you are >>>> > interested in CUDA-aware support, then try setting LD_LIBRARY_PATH to >>>> the >>>> > locationof libcuda.so.1 to get passed this issue. >>>> > >>>> > WARNING: There was an error initializing an OpenFabrics device. >>>> > >>>> > Local host: xxx >>>> > Local device: mlx5_0 >>>> > >>>> > lid velocity = 0.0016, prandtl # = 1., grashof # = 1. >>>> > Number of SNES iterations = 2 >>>> > [g117:4162783] 1 more process has sent help message >>>> > help-mpi-common-cuda.txt / dlopen failed >>>> > [g117:4162783] Set MCA parameter "orte_base_help_aggregate" to 0 to >>>> see all >>>> > help / error messages >>>> > [g117:4162783] 1 more process has sent help message >>>> help-mpi-btl-openib.txt >>>> > / error in device init >>>> > Completed test examples >>>> > Error while running make check >>>> > gmake[1]: *** [makefile:149: check] Error 1 >>>> > make: *** [GNUmakefile:17: check] Error 2 >>>> > >>>> > Where is $MPI_RUN set? I'd like to be able to pass options such as >>>> --mca >>>> > orte_base_help_aggregate 0 --mca opal_warn_on_missing_libcuda 0 -mca >>>> pml >>>> > ucx --mca btl '^openib' which will help me troubleshoot and hide >>>> unneeded >>>> > warnings. >>>> > >>>> > Thanks, >>>> > Rob >>>> > >>>> >>>> -------------- next part -------------- An HTML attachment was scrubbed... URL: From bsmith at petsc.dev Sat Oct 8 14:19:48 2022 From: bsmith at petsc.dev (Barry Smith) Date: Sat, 8 Oct 2022 15:19:48 -0400 Subject: [petsc-users] suppress CUDA warning & choose MCA parameter for mpirun during make PETSC_ARCH=arch-linux-c-debug check In-Reply-To: References: <39e71ae6-e943-c558-44af-0992089d6151@mcs.anl.gov> Message-ID: <16EE4635-0A03-45AA-92AD-1926907F4B8E@petsc.dev> I hate these kinds of make rules that hide what the compiler is doing (in the name of having less output, I guess) it makes it difficult to figure out what is going wrong. Anyways, either some of the MPI libraries are missing from the link line or they are in the wrong order and thus it is not able to search them properly. 
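As a generic illustration of that failure mode (the file and library names below are made up): "DSO missing from command line" means an object file uses a symbol, here MPI_Abort, that ld can only find in a shared library reached indirectly through another library's dependencies, not in a library named on the command line itself.

  # fails: main.o calls MPI_Abort, but libmpi.so is only reachable as a
  # dependency of libfoo.so, so ld refuses to use it to resolve the symbol
  c++ main.o -lfoo -o app

  # works: the library that actually defines the symbol is listed
  # explicitly, after the objects that use it
  c++ main.o -lfoo -L$MPI_DIR/lib -lmpi -o app

Linking with the mpicxx wrapper avoids this, because the wrapper appends the MPI libraries itself.
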
Here is a bunch of discussions on why that error message can appear https://stackoverflow.com/questions/19901934/libpthread-so-0-error-adding-symbols-dso-missing-from-command-line Barry > On Oct 7, 2022, at 11:45 PM, Rob Kudyba wrote: > > The error changes now and at an earlier place, 66% vs 70%: > make LDFLAGS="-Wl,--copy-dt-needed-entries" > Consolidate compiler generated dependencies of target fmt > [ 12%] Built target fmt > Consolidate compiler generated dependencies of target richdem > [ 37%] Built target richdem > Consolidate compiler generated dependencies of target wtm > [ 62%] Built target wtm > Consolidate compiler generated dependencies of target wtm.x > [ 66%] Linking CXX executable wtm.x > /usr/bin/ld: libwtm.a(transient_groundwater.cpp.o): undefined reference to symbol 'MPI_Abort' > /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libmpi.so.40: error adding symbols: DSO missing from command line > collect2: error: ld returned 1 exit status > make[2]: *** [CMakeFiles/wtm.x.dir/build.make:103: wtm.x] Error 1 > make[1]: *** [CMakeFiles/Makefile2:225: CMakeFiles/wtm.x.dir/all] Error 2 > make: *** [Makefile:136: all] Error 2 > > So perhaps PET_Sc is now being found. Any other suggestions? > > On Fri, Oct 7, 2022 at 11:18 PM Rob Kudyba > wrote: > > Thanks for the quick reply. I added these options to make and make check still produce the warnings so I used the command like this: > make PETSC_DIR=/path/to/petsc PETSC_ARCH=arch-linux-c-debug MPIEXEC="mpiexec -mca orte_base_help_aggregate 0 --mca opal_warn_on_missing_libcuda 0 -mca pml ucx --mca btl '^openib'" check > Running check examples to verify correct installation > Using PETSC_DIR=/path/to/petsc and PETSC_ARCH=arch-linux-c-debug > C/C++ example src/snes/tutorials/ex19 run successfully with 1 MPI process > C/C++ example src/snes/tutorials/ex19 run successfully with 2 MPI processes > Completed test examples > > Could be useful for the FAQ. > You mentioned you had "OpenMPI 4.1.1 with CUDA aware", so I think a workable mpicc should automatically find cuda libraries. Maybe you unloaded cuda libraries? > Oh let me clarify, OpenMPI is CUDA aware however this code and the node where PET_Sc is compiling does not have a GPU, hence not needed and using the MPIEXEC option worked during the 'check' to suppress the warning. > > I'm not trying to use PetSC to compile and linking appears to go awry: > [ 58%] Building CXX object CMakeFiles/wtm.dir/src/update_effective_storativity.cpp.o > [ 62%] Linking CXX static library libwtm.a > [ 62%] Built target wtm > [ 66%] Building CXX object CMakeFiles/wtm.x.dir/src/WTM.cpp.o > [ 70%] Linking CXX executable wtm.x > /usr/bin/ld: cannot find -lpetsc > collect2: error: ld returned 1 exit status > make[2]: *** [CMakeFiles/wtm.x.dir/build.make:103: wtm.x] Error 1 > make[1]: *** [CMakeFiles/Makefile2:269: CMakeFiles/wtm.x.dir/all] Error 2 > make: *** [Makefile:136: all] Error 2 > It seems cmake could not find petsc. Look at $PETSC_DIR/share/petsc/CMakeLists.txt and try to modify your CMakeLists.txt. 
> > There is an explicit reference to the path in CMakeLists.txt: > # NOTE: You may need to update this path to identify PETSc's location > set(ENV{PKG_CONFIG_PATH} "$ENV{PKG_CONFIG_PATH}:/path/to/petsc/arch-linux-cxx-debug/lib/pkgconfig/") > pkg_check_modules(PETSC PETSc>=3.17.1 IMPORTED_TARGET REQUIRED) > message(STATUS "Found PETSc ${PETSC_VERSION}") > add_subdirectory(common/richdem EXCLUDE_FROM_ALL) > add_subdirectory(common/fmt EXCLUDE_FROM_ALL) > > And that exists: > ls /path/to/petsc/arch-linux-cxx-debug/lib/pkgconfig/ > petsc.pc PETSc.pc > > Is there an environment variable I'm missing? I've seen the suggestion to add it to LD_LIBRARY_PATH which I did with export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:$PETSC_DIR/$PETSC_ARCH/lib and that points to: > ls -l /path/to/petsc/arch-linux-c-debug/lib > total 83732 > lrwxrwxrwx 1 rk3199 user 18 Oct 7 13:56 libpetsc.so -> libpetsc.so.3.18.0 > lrwxrwxrwx 1 rk3199 user 18 Oct 7 13:56 libpetsc.so.3.18 -> libpetsc.so.3.18.0 > -rwxr-xr-x 1 rk3199 user 85719200 Oct 7 13:56 libpetsc.so.3.18.0 > drwxr-xr-x 3 rk3199 user 4096 Oct 6 10:22 petsc > drwxr-xr-x 2 rk3199 user 4096 Oct 6 10:23 pkgconfig > > Anything else to check? > If modifying CMakeLists.txt does not work, you can try export LIBRARY_PATH=$LIBRARY_PATH:$PETSC_DIR/$PETSC_ARCH/lib > LD_LIBRARY_PATHis is for run time, but the error happened at link time, > > Yes that's what I already had. Any other debug that I can provide? > > > On Fri, Oct 7, 2022 at 1:53 PM Satish Balay > wrote: > you can try > > make PETSC_DIR=/path/to/petsc PETSC_ARCH=arch-linux-c-debug MPIEXEC="mpiexec -mca orte_base_help_aggregate 0 --mca opal_warn_on_missing_libcuda 0 -mca pml ucx --mca btl '^openib'" > > Wrt configure - it can be set with --with-mpiexec option - its saved in PETSC_ARCH/lib/petsc/conf/petscvariables > > Satish > > On Fri, 7 Oct 2022, Rob Kudyba wrote: > > > We are on RHEL 8, using modules that we can load/unload various version of > > packages/libraries, and I have OpenMPI 4.1.1 with CUDA aware loaded along > > with GDAL 3.3.0, GCC 10.2.0, and cmake 3.22.1 > > > > make PETSC_DIR=/path/to/petsc PETSC_ARCH=arch-linux-c-debug check > > fails with the below errors, > > Running check examples to verify correct installation > > > > Using PETSC_DIR=/path/to/petsc and PETSC_ARCH=arch-linux-c-debug > > Possible error running C/C++ src/snes/tutorials/ex19 with 1 MPI process > > See https://petsc.org/release/faq/ > > -------------------------------------------------------------------------- > > The library attempted to open the following supporting CUDA libraries, > > but each of them failed. CUDA-aware support is disabled. > > libcuda.so.1: cannot open shared object file: No such file or directory > > libcuda.dylib: cannot open shared object file: No such file or directory > > /usr/lib64/libcuda.so.1: cannot open shared object file: No such file or > > directory > > /usr/lib64/libcuda.dylib: cannot open shared object file: No such file or > > directory > > If you are not interested in CUDA-aware support, then run with > > --mca opal_warn_on_missing_libcuda 0 to suppress this message. If you are > > interested > > in CUDA-aware support, then try setting LD_LIBRARY_PATH to the location > > of libcuda.so.1 to get passed this issue. > > -------------------------------------------------------------------------- > > -------------------------------------------------------------------------- > > WARNING: There was an error initializing an OpenFabrics device. 
> > > > Local host: g117 > > Local device: mlx5_0 > > -------------------------------------------------------------------------- > > lid velocity = 0.0016, prandtl # = 1., grashof # = 1. > > Number of SNES iterations = 2 > > Possible error running C/C++ src/snes/tutorials/ex19 with 2 MPI processes > > See https://petsc.org/release/faq/ > > > > The library attempted to open the following supporting CUDA libraries, > > but each of them failed. CUDA-aware support is disabled. > > libcuda.so.1: cannot open shared object file: No such file or directory > > libcuda.dylib: cannot open shared object file: No such file or directory > > /usr/lib64/libcuda.so.1: cannot open shared object file: No such file or > > directory > > /usr/lib64/libcuda.dylib: cannot open shared object file: No such file or > > directory > > If you are not interested in CUDA-aware support, then run with > > --mca opal_warn_on_missing_libcuda 0 to suppress this message. If you are > > interested in CUDA-aware support, then try setting LD_LIBRARY_PATH to the > > locationof libcuda.so.1 to get passed this issue. > > > > WARNING: There was an error initializing an OpenFabrics device. > > > > Local host: xxx > > Local device: mlx5_0 > > > > lid velocity = 0.0016, prandtl # = 1., grashof # = 1. > > Number of SNES iterations = 2 > > [g117:4162783] 1 more process has sent help message > > help-mpi-common-cuda.txt / dlopen failed > > [g117:4162783] Set MCA parameter "orte_base_help_aggregate" to 0 to see all > > help / error messages > > [g117:4162783] 1 more process has sent help message help-mpi-btl-openib.txt > > / error in device init > > Completed test examples > > Error while running make check > > gmake[1]: *** [makefile:149: check] Error 1 > > make: *** [GNUmakefile:17: check] Error 2 > > > > Where is $MPI_RUN set? I'd like to be able to pass options such as --mca > > orte_base_help_aggregate 0 --mca opal_warn_on_missing_libcuda 0 -mca pml > > ucx --mca btl '^openib' which will help me troubleshoot and hide unneeded > > warnings. > > > > Thanks, > > Rob > > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From junchao.zhang at gmail.com Sat Oct 8 14:56:33 2022 From: junchao.zhang at gmail.com (Junchao Zhang) Date: Sat, 8 Oct 2022 14:56:33 -0500 Subject: [petsc-users] suppress CUDA warning & choose MCA parameter for mpirun during make PETSC_ARCH=arch-linux-c-debug check In-Reply-To: References: <39e71ae6-e943-c558-44af-0992089d6151@mcs.anl.gov> Message-ID: Perhaps we can back one step: Use your mpicc to build a "hello world" mpi test, then run it on a compute node (with GPU) to see if it works. If no, then your MPI environment has problems; If yes, then use it to build petsc (turn on petsc's gpu support, --with-cuda --with-cudac=nvcc), and then your code. 
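A test along those lines can be as small as the following sketch (file name, build line and process count are just examples):

  /* hello_mpi.c - minimal MPI sanity check */
  #include <mpi.h>
  #include <stdio.h>

  int main(int argc, char **argv)
  {
    int rank, size;
    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);
    printf("Hello from rank %d of %d\n", rank, size);
    MPI_Finalize();
    return 0;
  }

built and run with, e.g.

  mpicc hello_mpi.c -o hello_mpi
  mpirun -n 2 ./hello_mpi
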
--Junchao Zhang On Fri, Oct 7, 2022 at 10:45 PM Rob Kudyba wrote: > The error changes now and at an earlier place, 66% vs 70%: > make LDFLAGS="-Wl,--copy-dt-needed-entries" > Consolidate compiler generated dependencies of target fmt > [ 12%] Built target fmt > Consolidate compiler generated dependencies of target richdem > [ 37%] Built target richdem > Consolidate compiler generated dependencies of target wtm > [ 62%] Built target wtm > Consolidate compiler generated dependencies of target wtm.x > [ 66%] Linking CXX executable wtm.x > /usr/bin/ld: libwtm.a(transient_groundwater.cpp.o): undefined reference to > symbol 'MPI_Abort' > /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libmpi.so.40: error > adding symbols: DSO missing from command line > collect2: error: ld returned 1 exit status > make[2]: *** [CMakeFiles/wtm.x.dir/build.make:103: wtm.x] Error 1 > make[1]: *** [CMakeFiles/Makefile2:225: CMakeFiles/wtm.x.dir/all] Error 2 > make: *** [Makefile:136: all] Error 2 > > So perhaps PET_Sc is now being found. Any other suggestions? > > On Fri, Oct 7, 2022 at 11:18 PM Rob Kudyba wrote: > >> >> Thanks for the quick reply. I added these options to make and make check >>>> still produce the warnings so I used the command like this: >>>> make PETSC_DIR=/path/to/petsc PETSC_ARCH=arch-linux-c-debug >>>> MPIEXEC="mpiexec -mca orte_base_help_aggregate 0 --mca >>>> opal_warn_on_missing_libcuda 0 -mca pml ucx --mca btl '^openib'" check >>>> Running check examples to verify correct installation >>>> Using PETSC_DIR=/path/to/petsc and PETSC_ARCH=arch-linux-c-debug >>>> C/C++ example src/snes/tutorials/ex19 run successfully with 1 MPI >>>> process >>>> C/C++ example src/snes/tutorials/ex19 run successfully with 2 MPI >>>> processes >>>> Completed test examples >>>> >>>> Could be useful for the FAQ. >>>> >>> You mentioned you had "OpenMPI 4.1.1 with CUDA aware", so I think a >>> workable mpicc should automatically find cuda libraries. Maybe you >>> unloaded cuda libraries? >>> >> Oh let me clarify, OpenMPI is CUDA aware however this code and the node >> where PET_Sc is compiling does not have a GPU, hence not needed and using >> the MPIEXEC option worked during the 'check' to suppress the warning. >> >> I'm not trying to use PetSC to compile and linking appears to go awry: >>>> [ 58%] Building CXX object >>>> CMakeFiles/wtm.dir/src/update_effective_storativity.cpp.o >>>> [ 62%] Linking CXX static library libwtm.a >>>> [ 62%] Built target wtm >>>> [ 66%] Building CXX object CMakeFiles/wtm.x.dir/src/WTM.cpp.o >>>> [ 70%] Linking CXX executable wtm.x >>>> /usr/bin/ld: cannot find -lpetsc >>>> collect2: error: ld returned 1 exit status >>>> make[2]: *** [CMakeFiles/wtm.x.dir/build.make:103: wtm.x] Error 1 >>>> make[1]: *** [CMakeFiles/Makefile2:269: CMakeFiles/wtm.x.dir/all] Error >>>> 2 >>>> make: *** [Makefile:136: all] Error 2 >>>> >>> It seems cmake could not find petsc. Look >>> at $PETSC_DIR/share/petsc/CMakeLists.txt and try to modify your >>> CMakeLists.txt. 
>>> >> >> There is an explicit reference to the path in CMakeLists.txt: >> # NOTE: You may need to update this path to identify PETSc's location >> set(ENV{PKG_CONFIG_PATH} >> "$ENV{PKG_CONFIG_PATH}:/path/to/petsc/arch-linux-cxx-debug/lib/pkgconfig/") >> pkg_check_modules(PETSC PETSc>=3.17.1 IMPORTED_TARGET REQUIRED) >> message(STATUS "Found PETSc ${PETSC_VERSION}") >> add_subdirectory(common/richdem EXCLUDE_FROM_ALL) >> add_subdirectory(common/fmt EXCLUDE_FROM_ALL) >> >> And that exists: >> ls /path/to/petsc/arch-linux-cxx-debug/lib/pkgconfig/ >> petsc.pc PETSc.pc >> >> Is there an environment variable I'm missing? I've seen the suggestion >>> >>> to add it to LD_LIBRARY_PATH which I did with export >>> LD_LIBRARY_PATH=$LD_LIBRARY_PATH:$PETSC_DIR/$PETSC_ARCH/lib and that >>> points to: >>> >>>> ls -l /path/to/petsc/arch-linux-c-debug/lib >>>> total 83732 >>>> lrwxrwxrwx 1 rk3199 user 18 Oct 7 13:56 libpetsc.so -> >>>> libpetsc.so.3.18.0 >>>> lrwxrwxrwx 1 rk3199 user 18 Oct 7 13:56 libpetsc.so.3.18 -> >>>> libpetsc.so.3.18.0 >>>> -rwxr-xr-x 1 rk3199 user 85719200 Oct 7 13:56 libpetsc.so.3.18.0 >>>> drwxr-xr-x 3 rk3199 user 4096 Oct 6 10:22 petsc >>>> drwxr-xr-x 2 rk3199 user 4096 Oct 6 10:23 pkgconfig >>>> >>>> Anything else to check? >>>> >>> If modifying CMakeLists.txt does not work, you can try export >>> LIBRARY_PATH=$LIBRARY_PATH:$PETSC_DIR/$PETSC_ARCH/lib >>> LD_LIBRARY_PATHis is for run time, but the error happened at link time, >>> >> >> Yes that's what I already had. Any other debug that I can provide? >> >> >> >>> On Fri, Oct 7, 2022 at 1:53 PM Satish Balay wrote: >>>> >>>>> you can try >>>>> >>>>> make PETSC_DIR=/path/to/petsc PETSC_ARCH=arch-linux-c-debug >>>>> MPIEXEC="mpiexec -mca orte_base_help_aggregate 0 --mca >>>>> opal_warn_on_missing_libcuda 0 -mca pml ucx --mca btl '^openib'" >>>>> >>>>> Wrt configure - it can be set with --with-mpiexec option - its saved >>>>> in PETSC_ARCH/lib/petsc/conf/petscvariables >>>>> >>>>> Satish >>>>> >>>>> On Fri, 7 Oct 2022, Rob Kudyba wrote: >>>>> >>>>> > We are on RHEL 8, using modules that we can load/unload various >>>>> version of >>>>> > packages/libraries, and I have OpenMPI 4.1.1 with CUDA aware loaded >>>>> along >>>>> > with GDAL 3.3.0, GCC 10.2.0, and cmake 3.22.1 >>>>> > >>>>> > make PETSC_DIR=/path/to/petsc PETSC_ARCH=arch-linux-c-debug check >>>>> > fails with the below errors, >>>>> > Running check examples to verify correct installation >>>>> > >>>>> > Using PETSC_DIR=/path/to/petsc and PETSC_ARCH=arch-linux-c-debug >>>>> > Possible error running C/C++ src/snes/tutorials/ex19 with 1 MPI >>>>> process >>>>> > See https://petsc.org/release/faq/ >>>>> > >>>>> -------------------------------------------------------------------------- >>>>> > The library attempted to open the following supporting CUDA >>>>> libraries, >>>>> > but each of them failed. CUDA-aware support is disabled. >>>>> > libcuda.so.1: cannot open shared object file: No such file or >>>>> directory >>>>> > libcuda.dylib: cannot open shared object file: No such file or >>>>> directory >>>>> > /usr/lib64/libcuda.so.1: cannot open shared object file: No such >>>>> file or >>>>> > directory >>>>> > /usr/lib64/libcuda.dylib: cannot open shared object file: No such >>>>> file or >>>>> > directory >>>>> > If you are not interested in CUDA-aware support, then run with >>>>> > --mca opal_warn_on_missing_libcuda 0 to suppress this message. 
If >>>>> you are >>>>> > interested >>>>> > in CUDA-aware support, then try setting LD_LIBRARY_PATH to the >>>>> location >>>>> > of libcuda.so.1 to get passed this issue. >>>>> > >>>>> -------------------------------------------------------------------------- >>>>> > >>>>> -------------------------------------------------------------------------- >>>>> > WARNING: There was an error initializing an OpenFabrics device. >>>>> > >>>>> > Local host: g117 >>>>> > Local device: mlx5_0 >>>>> > >>>>> -------------------------------------------------------------------------- >>>>> > lid velocity = 0.0016, prandtl # = 1., grashof # = 1. >>>>> > Number of SNES iterations = 2 >>>>> > Possible error running C/C++ src/snes/tutorials/ex19 with 2 MPI >>>>> processes >>>>> > See https://petsc.org/release/faq/ >>>>> > >>>>> > The library attempted to open the following supporting CUDA >>>>> libraries, >>>>> > but each of them failed. CUDA-aware support is disabled. >>>>> > libcuda.so.1: cannot open shared object file: No such file or >>>>> directory >>>>> > libcuda.dylib: cannot open shared object file: No such file or >>>>> directory >>>>> > /usr/lib64/libcuda.so.1: cannot open shared object file: No such >>>>> file or >>>>> > directory >>>>> > /usr/lib64/libcuda.dylib: cannot open shared object file: No such >>>>> file or >>>>> > directory >>>>> > If you are not interested in CUDA-aware support, then run with >>>>> > --mca opal_warn_on_missing_libcuda 0 to suppress this message. If >>>>> you are >>>>> > interested in CUDA-aware support, then try setting LD_LIBRARY_PATH >>>>> to the >>>>> > locationof libcuda.so.1 to get passed this issue. >>>>> > >>>>> > WARNING: There was an error initializing an OpenFabrics device. >>>>> > >>>>> > Local host: xxx >>>>> > Local device: mlx5_0 >>>>> > >>>>> > lid velocity = 0.0016, prandtl # = 1., grashof # = 1. >>>>> > Number of SNES iterations = 2 >>>>> > [g117:4162783] 1 more process has sent help message >>>>> > help-mpi-common-cuda.txt / dlopen failed >>>>> > [g117:4162783] Set MCA parameter "orte_base_help_aggregate" to 0 to >>>>> see all >>>>> > help / error messages >>>>> > [g117:4162783] 1 more process has sent help message >>>>> help-mpi-btl-openib.txt >>>>> > / error in device init >>>>> > Completed test examples >>>>> > Error while running make check >>>>> > gmake[1]: *** [makefile:149: check] Error 1 >>>>> > make: *** [GNUmakefile:17: check] Error 2 >>>>> > >>>>> > Where is $MPI_RUN set? I'd like to be able to pass options such as >>>>> --mca >>>>> > orte_base_help_aggregate 0 --mca opal_warn_on_missing_libcuda 0 -mca >>>>> pml >>>>> > ucx --mca btl '^openib' which will help me troubleshoot and hide >>>>> unneeded >>>>> > warnings. >>>>> > >>>>> > Thanks, >>>>> > Rob >>>>> > >>>>> >>>>> -------------- next part -------------- An HTML attachment was scrubbed... URL: From jed at jedbrown.org Sat Oct 8 17:48:42 2022 From: jed at jedbrown.org (Jed Brown) Date: Sat, 08 Oct 2022 16:48:42 -0600 Subject: [petsc-users] suppress CUDA warning & choose MCA parameter for mpirun during make PETSC_ARCH=arch-linux-c-debug check In-Reply-To: <16EE4635-0A03-45AA-92AD-1926907F4B8E@petsc.dev> References: <39e71ae6-e943-c558-44af-0992089d6151@mcs.anl.gov> <16EE4635-0A03-45AA-92AD-1926907F4B8E@petsc.dev> Message-ID: <878rlqx779.fsf@jedbrown.org> Barry Smith writes: > I hate these kinds of make rules that hide what the compiler is doing (in the name of having less output, I guess) it makes it difficult to figure out what is going wrong. 
You can make VERBOSE=1 with CMake-generated makefiles. From bsmith at petsc.dev Sat Oct 8 18:56:31 2022 From: bsmith at petsc.dev (Barry Smith) Date: Sat, 8 Oct 2022 19:56:31 -0400 Subject: [petsc-users] suppress CUDA warning & choose MCA parameter for mpirun during make PETSC_ARCH=arch-linux-c-debug check In-Reply-To: <878rlqx779.fsf@jedbrown.org> References: <39e71ae6-e943-c558-44af-0992089d6151@mcs.anl.gov> <16EE4635-0A03-45AA-92AD-1926907F4B8E@petsc.dev> <878rlqx779.fsf@jedbrown.org> Message-ID: <599F31B3-BE61-4928-871F-D773289D5497@petsc.dev> True, but when users send reports back to us they will never have used the VERBOSE=1 option, so it requires one more round trip of email to get this additional information. > On Oct 8, 2022, at 6:48 PM, Jed Brown wrote: > > Barry Smith writes: > >> I hate these kinds of make rules that hide what the compiler is doing (in the name of having less output, I guess) it makes it difficult to figure out what is going wrong. > > You can make VERBOSE=1 with CMake-generated makefiles. From rk3199 at columbia.edu Sat Oct 8 21:31:48 2022 From: rk3199 at columbia.edu (Rob Kudyba) Date: Sat, 8 Oct 2022 22:31:48 -0400 Subject: [petsc-users] suppress CUDA warning & choose MCA parameter for mpirun during make PETSC_ARCH=arch-linux-c-debug check In-Reply-To: <599F31B3-BE61-4928-871F-D773289D5497@petsc.dev> References: <39e71ae6-e943-c558-44af-0992089d6151@mcs.anl.gov> <16EE4635-0A03-45AA-92AD-1926907F4B8E@petsc.dev> <878rlqx779.fsf@jedbrown.org> <599F31B3-BE61-4928-871F-D773289D5497@petsc.dev> Message-ID: > > Perhaps we can back one step: > Use your mpicc to build a "hello world" mpi test, then run it on a compute > node (with GPU) to see if it works. > If no, then your MPI environment has problems; > If yes, then use it to build petsc (turn on petsc's gpu support, > --with-cuda --with-cudac=nvcc), and then your code. > --Junchao Zhang OK tried this just to eliminate that the CUDA-capable OpenMPI is a factor: ./configure --with-debugging=0 --with-cmake=true --with-mpi=true --with-mpi-dir=/path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support --with-fc=0 --with-cuda=1 [..] cuda: Version: 11.7 Includes: -I/path/to/cuda11.7/toolkit/11.7.1/include Libraries: -Wl,-rpath,/path/to/cuda11.7/toolkit/11.7.1/lib64 -L/cm/shared/apps/cuda11.7/toolkit/11.7.1/lib64 -L/path/to/cuda11.7/toolkit/11.7.1/lib64/stubs -lcudart -lnvToolsExt -lcufft -lcublas -lcusparse -lcusolver -lcurand -lcuda CUDA SM 75 CUDA underlying compiler: CUDA_CXX="/path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/bin"/mpicxx CUDA underlying compiler flags: CUDA_CXXFLAGS= CUDA underlying linker libraries: CUDA_CXXLIBS= [...] Configure stage complete. 
Now build PETSc libraries with: make PETSC_DIR=/path/to/petsc PETSC_ARCH=arch-linux-c-opt all C++ compiler version: g++ (GCC) 10.2.0 Using C++ compiler to compile PETSc ----------------------------------------- Using C/C++ linker: /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/bin/mpicxx Using C/C++ flags: -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -Wno-lto-type-mismatch -fstack-protector -fvisibility=hidden -g -O0 ----------------------------------------- Using system modules: shared:slurm/20.02.6:DefaultModules:openmpi/gcc/64/4.1.1_cuda_11.0.3_aware:gdal/3.3.0:cmake/3.22.1:cuda11.7/toolkit/11.7.1:openblas/dynamic/0.3.7:gcc/10.2.0 Using mpi.h: # 1 "/path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/include/mpi.h" 1 ----------------------------------------- Using libraries: -Wl,-rpath,/path/to/petsc/arch-linux-cxx-debug/lib -L/path/to/petsc/arch-linux-cxx-debug/lib -lpetsc -lopenblas -lm -lX11 -lquadmath -lstdc++ -ldl ------------------------------------------ Using mpiexec: mpiexec -mca orte_base_help_aggregate 0 -mca pml ucx --mca btl '^openib' ------------------------------------------ Using MAKE: /path/to/petsc/arch-linux-cxx-debug/bin/make Using MAKEFLAGS: -j24 -l48.0 --no-print-directory -- MPIEXEC=mpiexec\ -mca\ orte_base_help_aggregate\ 0\ \ -mca\ pml\ ucx\ --mca\ btl\ '^openib' PETSC_ARCH=arch-linux-cxx-debug PETSC_DIR=/path/to/petsc ========================================== make[3]: Nothing to be done for 'libs'. ========================================= Now to check if the libraries are working do: make PETSC_DIR=/path/to/petsc PETSC_ARCH=arch-linux-cxx-debug check ========================================= [me at xxx petsc]$ make PETSC_DIR=/path/to/petsc PETSC_ARCH=arch-linux-cxx-debug MPIEXEC="mpiexec -mca orte_base_help_aggregate 0 -mca pml ucx --mca btl '^openib'" check Running check examples to verify correct installation Using PETSC_DIR=/path/to/petsc and PETSC_ARCH=arch-linux-cxx-debug C/C++ example src/snes/tutorials/ex19 run successfully with 1 MPI process C/C++ example src/snes/tutorials/ex19 run successfully with 2 MPI processes ./bandwidthTest [CUDA Bandwidth Test] - Starting... Running on... Device 0: Quadro RTX 8000 Quick Mode Host to Device Bandwidth, 1 Device(s) PINNED Memory Transfers Transfer Size (Bytes) Bandwidth(GB/s) 32000000 12.3 Device to Host Bandwidth, 1 Device(s) PINNED Memory Transfers Transfer Size (Bytes) Bandwidth(GB/s) 32000000 13.2 Device to Device Bandwidth, 1 Device(s) PINNED Memory Transfers Transfer Size (Bytes) Bandwidth(GB/s) 32000000 466.2 Result = PASS On Sat, Oct 8, 2022 at 7:56 PM Barry Smith wrote: > > True, but when users send reports back to us they will never have used > the VERBOSE=1 option, so it requires one more round trip of email to get > this additional information. > > > On Oct 8, 2022, at 6:48 PM, Jed Brown wrote: > > > > Barry Smith writes: > > > >> I hate these kinds of make rules that hide what the compiler is doing > (in the name of having less output, I guess) it makes it difficult to > figure out what is going wrong. > > > > You can make VERBOSE=1 with CMake-generated makefiles. > > Anyways, either some of the MPI libraries are missing from the link line > or they are in the wrong order and thus it is not able to search them > properly. 
Here is a bunch of discussions on why that error message can > appear > https://stackoverflow.com/questions/19901934/libpthread-so-0-error-adding-symbols-dso-missing-from-command-line > Still same but more noise and I have been using the suggestion of LDFLAGS="-Wl,--copy-dt-needed-entries" along with make: make[2]: Entering directory '/path/to/WTM/build' cd /path/to/WTM/build && /path/to/cmake/cmake-3.22.1-linux-x86_64/bin/cmake -E cmake_depends "Unix Makefiles" /path/to/WTM /path/to/WTM /path/to/WTM/build /path/to/WTM/build /path/to/WTM/build/CMakeFiles/wtm.x.dir/DependInfo.cmake --color= make[2]: Leaving directory '/path/to/WTM/build' make -f CMakeFiles/wtm.x.dir/build.make CMakeFiles/wtm.x.dir/build make[2]: Entering directory '/path/to/WTM/build' [ 66%] Building CXX object CMakeFiles/wtm.x.dir/src/WTM.cpp.o /cm/local/apps/gcc/10.2.0/bin/c++ -I/path/to/WTM/common/richdem/include -I/path/to/gdal-3.3.0/include -I/path/to/WTM/common/fmt/include -isystem /path/to/petsc/arch-linux-cxx-debug/include -isystem /path/to/petsc/include -isystem -O3 -g -Wall -Wextra -pedantic -Wshadow -Wfloat-conversion -Wall -Wextra -pedantic -Wshadow -DRICHDEM_GIT_HASH=\"xxx\" -DRICHDEM_COMPILE_TIME=\"2022-10-09T02:21:11Z\" -DUSEGDAL -Xpreprocessor -fopenmp /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libmpi.so.40.30.1 -I/path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/include -std=gnu++2a -MD -MT CMakeFiles/wtm.x.dir/src/WTM.cpp.o -MF CMakeFiles/wtm.x.dir/src/WTM.cpp.o.d -o CMakeFiles/wtm.x.dir/src/WTM.cpp.o -c /path/to/WTM/src/WTM.cpp c++: warning: /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libmpi.so.40.30.1: linker input file unused because linking not done [ 70%] Linking CXX executable wtm.x /path/to/cmake/cmake-3.22.1-linux-x86_64/bin/cmake -E cmake_link_script CMakeFiles/wtm.x.dir/link.txt --verbose=1 /cm/local/apps/gcc/10.2.0/bin/c++ -isystem -O3 -g -Wall -Wextra -pedantic -Wshadow CMakeFiles/wtm.x.dir/src/WTM.cpp.o -o wtm.x -Wl,-rpath,/path/to/WTM/build/common/richdem:/path/to/gdal-3.3.0/lib:/path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib:/path/to/petsc/arch-linux-cxx-debug/lib libwtm.a common/richdem/librichdem.so /path/to/gdal-3.3.0/lib/libgdal.so /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0 common/fmt/libfmt.a /path/to/petsc/arch-linux-cxx-debug/lib/libpetsc.so /usr/bin/ld: CMakeFiles/wtm.x.dir/src/WTM.cpp.o: undefined reference to symbol 'ompi_mpi_comm_self' /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libmpi.so.40: error adding symbols: DSO missing from command line collect2: error: ld returned 1 exit status make[2]: *** [CMakeFiles/wtm.x.dir/build.make:103: wtm.x] Error 1 make[2]: Leaving directory '/path/to/WTM/build' make[1]: *** [CMakeFiles/Makefile2:225: CMakeFiles/wtm.x.dir/all] Error 2 make[1]: Leaving directory '/path/to/WTM/build' make: *** [Makefile:136: all] Error 2 Anything stick out? -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Sun Oct 9 07:11:35 2022 From: knepley at gmail.com (Matthew Knepley) Date: Sun, 9 Oct 2022 13:11:35 +0100 Subject: [petsc-users] Slepc, shell matrix, parallel, halo exchange In-Reply-To: References: <53363D7B-CCBD-4DAB-924E-1D5D56975828@dsic.upv.es> <76162134-CDE9-42B9-8310-D9DD33D2F12D@dsic.upv.es> Message-ID: On Fri, Oct 7, 2022 at 5:48 PM feng wang wrote: > Hi Mat, > > I've tried the suggested approach. The halo cells are not exchanged > somehow. Below is how I do it, have I missed anything? 
> > I create a ghost vector *petsc_dcsv* and it is a data member of the class > cFdDomain, which is a context of the shell matrix. > > * PetscCall(VecCreateGhostBlock(*A_COMM_WORLD, blocksize, > blocksize*nlocal, PETSC_DECIDE ,nghost, ighost, &petsc_dcsv));* > > blocksize and nv have the same value. nlocal is number of local cells and > nghost is number of halo cells. ighost contains the ghost cell index. > > Below is how I compute a matrix-vector product with a shell matrix > > * PetscErrorCode cFdDomain::mymult_slepc(Mat m ,Vec x, Vec y)* > * {* > * void *ctx;* > * cFdDomain *myctx;* > * PetscErrorCode ierr;* > > * MatShellGetContext(m, &ctx);* > * myctx = (cFdDomain*)ctx;* > > *//matrix-vector product* > * ierr = myctx->myfunc(x, y); CHKERRQ(ierr);* > > * ierr = 0;* > * return ierr;* > * }* > > > * PetscErrorCode cFdDomain::myfunc(Vec in, Vec out)* > * {* > > *//some declaration * > > * ierr = VecGetArray(petsc_dcsv,&array_g); CHKERRQ(ierr);* > * ierr = VecGetArrayRead(in, &array); CHKERRQ(ierr);* > > * //assign in to petsc_dcsv, only local cells* > * for(iv=0; iv * {* > * for(iq=0; iq * {* > * array_g[iv+nv*iq] = array[iv + nv*iq];* > * }* > * }* > > * ierr = VecRestoreArray(petsc_dcsv,&array_g); CHKERRQ(ierr);* > * ierr = VecRestoreArrayRead(in, &array); CHKERRQ(ierr);* > > * //update halo cells?* > * PetscCall(VecGhostUpdateBegin(petsc_dcsv, INSERT_VALUES, > SCATTER_FORWARD));* > * PetscCall(VecGhostUpdateEnd(petsc_dcsv, INSERT_VALUES, > SCATTER_FORWARD));* > * PetscCall(VecGhostGetLocalForm(petsc_dcsv,&veclocal));* > > *//read in v* > * ierr = VecGetArray(veclocal,&array_ghost); CHKERRQ(ierr);* > * for(iv=0; iv * {* > * for(iq=0; iq * {* > * jq = ilocal[iq];* > * dq[iv][jq] = array_ghost[iv + nv*iq];* > * }* > > * for(iq=nlocal; iq * {* > * jq = ighost_local[iq-nlocal];* > * dq[iv][jq] = array_ghost[iv + nv*iq];* > * }* > * }* > * ierr = VecRestoreArray(veclocal,&array_ghost); CHKERRQ(ierr);* > > > * //some computations * > > > * PetscCall(VecGhostRestoreLocalForm(petsc_dcsv,&veclocal)); * > * }* > > > so I fill the local part of the ghost vector *petsc_dcsv* for each rank > and then call ghost update, and think this will update the halo cells. it > seems not doing that. > I can only think you are misinterpreting the result. There are many examples, such src/vec/tutorials/ex9.c (and ex9f.F) I would start there and try to change that into the communication you want, since it definitely works. I cannot see a problem with the code snippet above. Thanks, Matt > Thanks, > Feng > > ------------------------------ > *From:* Matthew Knepley > *Sent:* 21 September 2022 14:36 > *To:* feng wang > *Cc:* Jose E. Roman ; petsc-users at mcs.anl.gov < > petsc-users at mcs.anl.gov> > *Subject:* Re: [petsc-users] Slepc, shell matrix, parallel, halo exchange > > On Wed, Sep 21, 2022 at 10:35 AM feng wang wrote: > > Hi Jose, > > For your 2nd suggestion on halo exchange, I get the idea and roughly know > how to do it, but there are some implementation details which I am not > quite sure. > > If I understand it correctly, in MatMult(Mat m ,Vec x, Vec y), Vec *x* is > a normal parallel vector and it does not contain halo values. Suppose I > create an auxiliary ghost vector * x_g*, then I assign the values of *x* > to *x_g*. The values of the halo for each partition will not be assigned > at this stage. > > But If I call VecGhostUpdateBegin/End(*x_g*, INSERT_VALUES, > SCATTER_FORWARD), this will fill the values of the halo cells of *x_g *for > each partition. 
Then *x_g* has local and halo cells assigned correctly > and I can use *x_g* to do my computation. Is this what you mean? > > > Yes > > Matt > > > Thanks, > Feng > > ------------------------------ > *From:* Jose E. Roman > *Sent:* 21 September 2022 13:07 > *To:* feng wang > *Cc:* Matthew Knepley ; petsc-users at mcs.anl.gov < > petsc-users at mcs.anl.gov> > *Subject:* Re: [petsc-users] Slepc, shell matrix, parallel, halo exchange > > > > > El 21 sept 2022, a las 14:47, feng wang > escribi?: > > > > Thanks Jose, I will try this and will come back to this thread if I have > any issue. > > > > Besides, for EPSGetEigenpair, I guess each rank gets its portion of the > eigenvector, and I need to put them together afterwards? > > Eigenvectors are stored in parallel vectors, which are used in subsequent > parallel computation in most applications. If for some reason you need to > gather them in a single MPI process you can use e.g. > VecScatterCreateToZero() > > > > > Thanks, > > Feng > > > > From: Jose E. Roman > > Sent: 21 September 2022 12:34 > > To: feng wang > > Cc: Matthew Knepley ; petsc-users at mcs.anl.gov < > petsc-users at mcs.anl.gov> > > Subject: Re: [petsc-users] Slepc, shell matrix, parallel, halo exchange > > > > If you define the MATOP_CREATE_VECS operation in your shell matrix so > that it creates a ghost vector, then all vectors within EPS will be ghost > vectors, including those that are received as arguments of MatMult(). Not > sure if this will work. > > > > A simpler solution is that you store a ghost vector in the context of > your shell matrix, and then in MatMult() you receive a regular parallel > vector x, then update the ghost points using the auxiliary ghost vector, do > the computation and store the result in the regular parallel vector y. > > > > Jose > > > > > > > El 21 sept 2022, a las 14:09, feng wang > escribi?: > > > > > > Thanks for your reply. > > > > > > For GMRES, I create a ghost vector and give it to KSPSolve. For Slepc, > it only takes the shell matrix for EPSSetOperators. Suppose the shell > matrix of the eigensolver defines MatMult(Mat m ,Vec x, Vec y), how does it > know Vec x is a ghost vector and how many ghost cells there are? > > > > > > Thanks, > > > Feng > > > From: Matthew Knepley > > > Sent: 21 September 2022 11:58 > > > To: feng wang > > > Cc: petsc-users at mcs.anl.gov > > > Subject: Re: [petsc-users] Slepc, shell matrix, parallel, halo exchange > > > > > > On Wed, Sep 21, 2022 at 7:41 AM feng wang > wrote: > > > Hello, > > > > > > I am using Slepc with a shell matrix. The sequential version seems > working and now I am trying to make it run in parallel. > > > > > > The partition of the domain is done, I am not sure how to do the halo > exchange in the shell matrix in Slepc. I have a parallel version of > matrix-free GMRES in my code with Petsc. I was using VecCreateGhostBlock to > create vector with ghost cells, and then used VecGhostUpdateBegin/End for > the halo exchange in the shell matrix, would this be the same for Slepc? > > > > > > That will be enough for the MatMult(). You would also have to use a > SLEPc EPS that only needed MatMult(). > > > > > > Thanks, > > > > > > Matt > > > > > > Thanks, > > > Feng > > > > > > > > > > > > > > > -- > > > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. 
> > > -- Norbert Wiener > > > > > > https://www.cse.buffalo.edu/~knepley/ > > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > > https://www.cse.buffalo.edu/~knepley/ > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From junchao.zhang at gmail.com Sun Oct 9 20:31:23 2022 From: junchao.zhang at gmail.com (Junchao Zhang) Date: Sun, 9 Oct 2022 20:31:23 -0500 Subject: [petsc-users] suppress CUDA warning & choose MCA parameter for mpirun during make PETSC_ARCH=arch-linux-c-debug check In-Reply-To: References: <39e71ae6-e943-c558-44af-0992089d6151@mcs.anl.gov> <16EE4635-0A03-45AA-92AD-1926907F4B8E@petsc.dev> <878rlqx779.fsf@jedbrown.org> <599F31B3-BE61-4928-871F-D773289D5497@petsc.dev> Message-ID: In the last link step to generate the executable /cm/local/apps/gcc/10.2.0/bin/c++ -isystem -O3 -g -Wall -Wextra -pedantic -Wshadow CMakeFiles/wtm.x.dir/src/WTM.cpp.o -o wtm.x -Wl,-rpath,/path/to/WTM/build/common/richdem:/path/to/ gdal-3.3.0/lib:/path/to/openmpi-4.1.1_ucx_cuda_11.0.3_ support/lib:/path/to/petsc/arch-linux-cxx-debug/lib libwtm.a common/richdem/librichdem.so /path/to/gdal-3.3.0/lib/libgdal.so /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0 common/fmt/libfmt.a /path/to/petsc/arch-linux-cxx-debug/lib/libpetsc.so I did not find -lmpi to link in the mpi library. You can try to use cmake -DCMAKE_C_COMPILER=/path/to/mpicc -DCMAKE_CXX_COMPILER=/path/to/mpicxx to build your code On Sat, Oct 8, 2022 at 9:32 PM Rob Kudyba wrote: > Perhaps we can back one step: >> Use your mpicc to build a "hello world" mpi test, then run it on a >> compute node (with GPU) to see if it works. >> If no, then your MPI environment has problems; >> If yes, then use it to build petsc (turn on petsc's gpu support, >> --with-cuda --with-cudac=nvcc), and then your code. >> --Junchao Zhang > > OK tried this just to eliminate that the CUDA-capable OpenMPI is a factor: > ./configure --with-debugging=0 --with-cmake=true --with-mpi=true > --with-mpi-dir=/path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support --with-fc=0 > --with-cuda=1 > [..] > cuda: > Version: 11.7 > Includes: -I/path/to/cuda11.7/toolkit/11.7.1/include > Libraries: -Wl,-rpath,/path/to/cuda11.7/toolkit/11.7.1/lib64 > -L/cm/shared/apps/cuda11.7/toolkit/11.7.1/lib64 > -L/path/to/cuda11.7/toolkit/11.7.1/lib64/stubs -lcudart -lnvToolsExt > -lcufft -lcublas -lcusparse -lcusolver -lcurand -lcuda > CUDA SM 75 > CUDA underlying compiler: > CUDA_CXX="/path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/bin"/mpicxx > CUDA underlying compiler flags: CUDA_CXXFLAGS= > CUDA underlying linker libraries: CUDA_CXXLIBS= > [...] > Configure stage complete. 
Now build PETSc libraries with: > make PETSC_DIR=/path/to/petsc PETSC_ARCH=arch-linux-c-opt all > > C++ compiler version: g++ (GCC) 10.2.0 > Using C++ compiler to compile PETSc > ----------------------------------------- > Using C/C++ linker: > /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/bin/mpicxx > Using C/C++ flags: -Wall -Wwrite-strings -Wno-strict-aliasing > -Wno-unknown-pragmas -Wno-lto-type-mismatch -fstack-protector > -fvisibility=hidden -g -O0 > ----------------------------------------- > Using system modules: > shared:slurm/20.02.6:DefaultModules:openmpi/gcc/64/4.1.1_cuda_11.0.3_aware:gdal/3.3.0:cmake/3.22.1:cuda11.7/toolkit/11.7.1:openblas/dynamic/0.3.7:gcc/10.2.0 > Using mpi.h: # 1 > "/path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/include/mpi.h" 1 > ----------------------------------------- > Using libraries: -Wl,-rpath,/path/to/petsc/arch-linux-cxx-debug/lib > -L/path/to/petsc/arch-linux-cxx-debug/lib -lpetsc -lopenblas -lm -lX11 > -lquadmath -lstdc++ -ldl > ------------------------------------------ > Using mpiexec: mpiexec -mca orte_base_help_aggregate 0 -mca pml ucx --mca > btl '^openib' > ------------------------------------------ > Using MAKE: /path/to/petsc/arch-linux-cxx-debug/bin/make > Using MAKEFLAGS: -j24 -l48.0 --no-print-directory -- MPIEXEC=mpiexec\ > -mca\ orte_base_help_aggregate\ 0\ \ -mca\ pml\ ucx\ --mca\ btl\ '^openib' > PETSC_ARCH=arch-linux-cxx-debug PETSC_DIR=/path/to/petsc > ========================================== > make[3]: Nothing to be done for 'libs'. > ========================================= > Now to check if the libraries are working do: > make PETSC_DIR=/path/to/petsc PETSC_ARCH=arch-linux-cxx-debug check > ========================================= > [me at xxx petsc]$ make PETSC_DIR=/path/to/petsc > PETSC_ARCH=arch-linux-cxx-debug MPIEXEC="mpiexec -mca > orte_base_help_aggregate 0 -mca pml ucx --mca btl '^openib'" check > Running check examples to verify correct installation > Using PETSC_DIR=/path/to/petsc and PETSC_ARCH=arch-linux-cxx-debug > C/C++ example src/snes/tutorials/ex19 run successfully with 1 MPI process > C/C++ example src/snes/tutorials/ex19 run successfully with 2 MPI processes > > ./bandwidthTest > [CUDA Bandwidth Test] - Starting... > Running on... > > Device 0: Quadro RTX 8000 > Quick Mode > > Host to Device Bandwidth, 1 Device(s) > PINNED Memory Transfers > Transfer Size (Bytes) Bandwidth(GB/s) > 32000000 12.3 > > Device to Host Bandwidth, 1 Device(s) > PINNED Memory Transfers > Transfer Size (Bytes) Bandwidth(GB/s) > 32000000 13.2 > > Device to Device Bandwidth, 1 Device(s) > PINNED Memory Transfers > Transfer Size (Bytes) Bandwidth(GB/s) > 32000000 466.2 > > Result = PASS > > On Sat, Oct 8, 2022 at 7:56 PM Barry Smith wrote: > >> >> True, but when users send reports back to us they will never have used >> the VERBOSE=1 option, so it requires one more round trip of email to get >> this additional information. >> >> > On Oct 8, 2022, at 6:48 PM, Jed Brown wrote: >> > >> > Barry Smith writes: >> > >> >> I hate these kinds of make rules that hide what the compiler is >> doing (in the name of having less output, I guess) it makes it difficult to >> figure out what is going wrong. >> > >> > You can make VERBOSE=1 with CMake-generated makefiles. >> > > >> Anyways, either some of the MPI libraries are missing from the link line >> or they are in the wrong order and thus it is not able to search them >> properly. 
Here is a bunch of discussions on why that error message can >> appear >> https://stackoverflow.com/questions/19901934/libpthread-so-0-error-adding-symbols-dso-missing-from-command-line >> > > > Still same but more noise and I have been using the suggestion of > LDFLAGS="-Wl,--copy-dt-needed-entries" along with make: > make[2]: Entering directory '/path/to/WTM/build' > cd /path/to/WTM/build && > /path/to/cmake/cmake-3.22.1-linux-x86_64/bin/cmake -E cmake_depends "Unix > Makefiles" /path/to/WTM /path/to/WTM /path/to/WTM/build /path/to/WTM/build > /path/to/WTM/build/CMakeFiles/wtm.x.dir/DependInfo.cmake --color= > make[2]: Leaving directory '/path/to/WTM/build' > make -f CMakeFiles/wtm.x.dir/build.make CMakeFiles/wtm.x.dir/build > make[2]: Entering directory '/path/to/WTM/build' > [ 66%] Building CXX object CMakeFiles/wtm.x.dir/src/WTM.cpp.o > /cm/local/apps/gcc/10.2.0/bin/c++ -I/path/to/WTM/common/richdem/include > -I/path/to/gdal-3.3.0/include -I/path/to/WTM/common/fmt/include -isystem > /path/to/petsc/arch-linux-cxx-debug/include -isystem /path/to/petsc/include > -isystem -O3 -g -Wall -Wextra -pedantic -Wshadow -Wfloat-conversion -Wall > -Wextra -pedantic -Wshadow -DRICHDEM_GIT_HASH=\"xxx\" > -DRICHDEM_COMPILE_TIME=\"2022-10-09T02:21:11Z\" -DUSEGDAL -Xpreprocessor > -fopenmp > /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libmpi.so.40.30.1 > -I/path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/include -std=gnu++2a -MD > -MT CMakeFiles/wtm.x.dir/src/WTM.cpp.o -MF > CMakeFiles/wtm.x.dir/src/WTM.cpp.o.d -o CMakeFiles/wtm.x.dir/src/WTM.cpp.o > -c /path/to/WTM/src/WTM.cpp > c++: warning: > /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libmpi.so.40.30.1: > linker input file unused because linking not done > [ 70%] Linking CXX executable wtm.x > /path/to/cmake/cmake-3.22.1-linux-x86_64/bin/cmake -E cmake_link_script > CMakeFiles/wtm.x.dir/link.txt --verbose=1 > /cm/local/apps/gcc/10.2.0/bin/c++ -isystem -O3 -g -Wall -Wextra -pedantic > -Wshadow CMakeFiles/wtm.x.dir/src/WTM.cpp.o -o wtm.x > -Wl,-rpath,/path/to/WTM/build/common/richdem:/path/to/gdal-3.3.0/lib:/path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib:/path/to/petsc/arch-linux-cxx-debug/lib > libwtm.a common/richdem/librichdem.so /path/to/gdal-3.3.0/lib/libgdal.so > /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0 > common/fmt/libfmt.a /path/to/petsc/arch-linux-cxx-debug/lib/libpetsc.so > /usr/bin/ld: CMakeFiles/wtm.x.dir/src/WTM.cpp.o: undefined reference to > symbol 'ompi_mpi_comm_self' > /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libmpi.so.40: error > adding symbols: DSO missing from command line > collect2: error: ld returned 1 exit status > make[2]: *** [CMakeFiles/wtm.x.dir/build.make:103: wtm.x] Error 1 > make[2]: Leaving directory '/path/to/WTM/build' > make[1]: *** [CMakeFiles/Makefile2:225: CMakeFiles/wtm.x.dir/all] Error 2 > make[1]: Leaving directory '/path/to/WTM/build' > make: *** [Makefile:136: all] Error 2 > > Anything stick out? > -------------- next part -------------- An HTML attachment was scrubbed... 
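Junchao's cmake suggestion above amounts to something like the following, run from a clean build directory since CMake caches the compiler choice (paths are the same placeholders used elsewhere in this thread):

  rm -rf build && mkdir build && cd build
  cmake .. \
    -DCMAKE_C_COMPILER=/path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/bin/mpicc \
    -DCMAKE_CXX_COMPILER=/path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/bin/mpicxx
  make VERBOSE=1

With the wrappers as the compilers, the MPI include paths and libraries are added to every compile and link line automatically.
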
URL: From bsmith at petsc.dev Sun Oct 9 21:04:18 2022 From: bsmith at petsc.dev (Barry Smith) Date: Sun, 9 Oct 2022 22:04:18 -0400 Subject: [petsc-users] PETSc 2023 User Meeting and New Release 3.18 Message-ID: <38C0A700-004E-41FE-9426-0946E0419E62@petsc.dev> We are pleased to announce the - next PETSc users meeting, June 5-7, 2023, in Chicago on the campus of IIT (mark your calendars, further information will be forthcoming soon) and the - release of PETSc version 3.18.0 at https://petsc.org/release/install/download/ . This release includes + far more extensive solvers on GPUs, including an efficient MatSetValuesCOO() for setting matrix values directly on the GPU + improved interfaces to Kokkos, so user GPU code can now be written in Kokkos, CUDA, or HIP. + vastly improved documentation at petsc.org with a more comprehensive user manual and better search capabilities (due to Patrick Sanan) We recommend upgrading to PETSc 3.18.0 soon. As always, please report problems to petsc-maint at mcs.anl.gov and ask questions at petsc-users at mcs.anl.gov A list of the major changes and updates can be found at https://petsc.org/release/docs/changes/318 The final update to petsc-3.17 i.e petsc-3.17.5 is also available This release includes contributions from AdelekeBankole Aidan Hamilton Albert Cowie Alexis Marboeuf Barry Smith Blaise Bourdin Dave May David Andrs David Wells Fande Kong ftrigaux Getnet Betrie Hong Zhang Jacob Faibussowitsch James Wright JDBetteridge Jed Brown Jeremy L Thompson Joe Wallwork Jose Roman Junchao Zhang Justin Chang Kaushik Kulkarni Kerry Key Koki Sagiyama Lawrence Mitchell Lisandro Dalcin Mark Adams Martin Diehl Matthew Knepley Matthew Woehlk Min RK Mr. Hong Zhang Patrick Farrell Patrick Sanan Pierre Jolivet Richard Tran Mills Romain Beucher Satish Balay Scott Kruger Stefano Zampini suyashtn Toby Isaac Todd Munson Umberto Zerbinati Vaclav Hapla Zongze Yang and bug reports/proposed improvements received from Brad Aagaard Abylay Zhumekenov Aidan Hamilton Alfredo J Duarte Gomez Bro H Collins, Eric Benjamin Dudson Ed Bueler Erhan Turan Eric Chamberland Fackler, Philip flw at rzg.mpg.de Glenn Hammond Henrik B?sing Jacob Simon Merson Jesper Lund-Carlson Jin Chen John Snyder Jose E. Roman Kaustubh Khedkar Lisandro Dalcin Lucas Banting Matthew Knepley Nicolas Berlie Nicolas Tardieu Robert Nourgaliev Olivier Jamond Patrick Sanan Pierre Jolivet Rafel Amer Ramon Randall J LeVeque Richard F. Katz Sanjay Govindjee san.temporal at gmail.com Sidarth Narayanan Stefano Zampini TAY wee-beng Victor Eijkhout Victoria Hamtiaux Xiaoye S. Li Xu Hui Yang Liu Ye Changqing Zakariae Jorti -------------- next part -------------- An HTML attachment was scrubbed... 
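As a small, hedged illustration of the MatSetValuesCOO() path mentioned in the release notes above (a serial sketch; the 3x3 matrix and its entries are invented, and per the notes a GPU matrix type such as -mat_type aijcusparse lets the values be set directly on the GPU):

  #include <petscmat.h>

  int main(int argc, char **argv)
  {
    Mat         A;
    PetscInt    oi[] = {0, 1, 2, 2};
    PetscInt    oj[] = {0, 1, 1, 2};
    PetscScalar ov[] = {1.0, 2.0, 3.0, 4.0};

    PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));
    PetscCall(MatCreate(PETSC_COMM_WORLD, &A));
    PetscCall(MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, 3, 3));
    PetscCall(MatSetFromOptions(A)); /* e.g. -mat_type aijcusparse */
    /* give PETSc the sparsity pattern once ... */
    PetscCall(MatSetPreallocationCOO(A, 4, oi, oj));
    /* ... then (re)fill the numerical values */
    PetscCall(MatSetValuesCOO(A, ov, INSERT_VALUES));
    PetscCall(MatDestroy(&A));
    PetscCall(PetscFinalize());
    return 0;
  }
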
URL: From rk3199 at columbia.edu Sun Oct 9 21:28:42 2022 From: rk3199 at columbia.edu (Rob Kudyba) Date: Sun, 9 Oct 2022 22:28:42 -0400 Subject: [petsc-users] suppress CUDA warning & choose MCA parameter for mpirun during make PETSC_ARCH=arch-linux-c-debug check In-Reply-To: References: <39e71ae6-e943-c558-44af-0992089d6151@mcs.anl.gov> <16EE4635-0A03-45AA-92AD-1926907F4B8E@petsc.dev> <878rlqx779.fsf@jedbrown.org> <599F31B3-BE61-4928-871F-D773289D5497@petsc.dev> Message-ID: I did have -DMPI_CXX_COMPILER set, so I added -DCMAKE_C_COMPILER and now get these errors: [ 25%] Linking CXX shared library librichdem.so /lib/../lib64/crt1.o: In function `_start': (.text+0x24): undefined reference to `main' CMakeFiles/richdem.dir/src/random.cpp.o: In function `richdem::rand_engine()': random.cpp:(.text+0x45): undefined reference to `omp_get_thread_num' CMakeFiles/richdem.dir/src/random.cpp.o: In function `richdem::seed_rand(unsigned long)': random.cpp:(.text+0xb6): undefined reference to `GOMP_parallel' CMakeFiles/richdem.dir/src/random.cpp.o: In function `richdem::uniform_rand_int(int, int)': random.cpp:(.text+0x10c): undefined reference to `omp_get_thread_num' CMakeFiles/richdem.dir/src/random.cpp.o: In function `richdem::uniform_rand_real(double, double)': random.cpp:(.text+0x1cb): undefined reference to `omp_get_thread_num' CMakeFiles/richdem.dir/src/random.cpp.o: In function `richdem::normal_rand(double, double)': random.cpp:(.text+0x29e): undefined reference to `omp_get_thread_num' CMakeFiles/richdem.dir/src/random.cpp.o: In function `richdem::seed_rand(unsigned long) [clone ._omp_fn.0]': random.cpp:(.text+0x4a3): undefined reference to `GOMP_critical_start' random.cpp:(.text+0x4b1): undefined reference to `GOMP_critical_end' random.cpp:(.text+0x4c3): undefined reference to `omp_get_thread_num' /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: undefined reference to `PMPI_Comm_rank' /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: undefined reference to `PMPI_Get_address' /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: undefined reference to `PMPI_Comm_get_name' /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: undefined reference to `PMPI_Add_error_string' /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: undefined reference to `PMPI_Type_get_name' /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: undefined reference to `PMPI_Abort' /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: undefined reference to `PMPI_Alloc_mem' /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: undefined reference to `PMPI_Isend' /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: undefined reference to `PMPI_Barrier' /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: undefined reference to `PMPI_Allgather' /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: undefined reference to `PMPI_Reduce' /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: undefined reference to `PMPI_Send' /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: undefined reference to `PMPI_Init' /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: undefined reference to `PMPI_Type_size' /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: undefined reference to 
`PMPI_Accumulate' /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: undefined reference to `PMPI_Add_error_class' /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: undefined reference to `PMPI_Finalize' /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: undefined reference to `PMPI_Allgatherv' /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: undefined reference to `PMPI_Bcast' /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: undefined reference to `PMPI_Recv' /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: undefined reference to `PMPI_Request_free' /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: undefined reference to `PMPI_Allreduce' /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: undefined reference to `ompi_mpi_comm_world' /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: undefined reference to `PMPI_Sendrecv' /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: undefined reference to `PMPI_Add_error_code' /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: undefined reference to `PMPI_Win_get_name' collect2: error: ld returned 1 exit status make[2]: *** [common/richdem/CMakeFiles/richdem.dir/build.make:163: common/richdem/librichdem.so] Error 1 make[1]: *** [CMakeFiles/Makefile2:306: common/richdem/CMakeFiles/richdem.dir/all] Error 2 make: *** [Makefile:136: all] Error 2 I took a guess at using -DOpenMP_libomp_LIBRARY="/path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0" as as otherwise I'd get: CMake Error at /path/to/cmake/cmake-3.22.1-linux-x86_64/share/cmake-3.22/Modules/FindPackageHandleStandardArgs.cmake:230 (message): Could NOT find OpenMP_CXX (missing: OpenMP_libomp_LIBRARY OpenMP_libomp_LIBRARY) (found version "4.5") So perhaps that's the real problem? On Sun, Oct 9, 2022 at 9:31 PM Junchao Zhang wrote: > In the last link step to generate the executable > /cm/local/apps/gcc/10.2.0/bin/c++ -isystem -O3 -g -Wall -Wextra -pedantic > -Wshadow CMakeFiles/wtm.x.dir/src/WTM.cpp.o -o wtm.x > -Wl,-rpath,/path/to/WTM/build/common/richdem:/path/to/ > gdal-3.3.0/lib:/path/to/openmpi-4.1.1_ucx_cuda_11.0.3_ > support/lib:/path/to/petsc/arch-linux-cxx-debug/lib libwtm.a > common/richdem/librichdem.so /path/to/gdal-3.3.0/lib/libgdal.so > /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0 > common/fmt/libfmt.a /path/to/petsc/arch-linux-cxx-debug/lib/libpetsc.so > > I did not find -lmpi to link in the mpi library. You can try to use cmake > -DCMAKE_C_COMPILER=/path/to/mpicc -DCMAKE_CXX_COMPILER=/path/to/mpicxx to > build your code > > On Sat, Oct 8, 2022 at 9:32 PM Rob Kudyba wrote: > >> Perhaps we can back one step: >>> Use your mpicc to build a "hello world" mpi test, then run it on a >>> compute node (with GPU) to see if it works. >>> If no, then your MPI environment has problems; >>> If yes, then use it to build petsc (turn on petsc's gpu support, >>> --with-cuda --with-cudac=nvcc), and then your code. >>> --Junchao Zhang >> >> OK tried this just to eliminate that the CUDA-capable OpenMPI is a factor: >> ./configure --with-debugging=0 --with-cmake=true --with-mpi=true >> --with-mpi-dir=/path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support --with-fc=0 >> --with-cuda=1 >> [..] 
>> cuda: >> Version: 11.7 >> Includes: -I/path/to/cuda11.7/toolkit/11.7.1/include >> Libraries: -Wl,-rpath,/path/to/cuda11.7/toolkit/11.7.1/lib64 >> -L/cm/shared/apps/cuda11.7/toolkit/11.7.1/lib64 >> -L/path/to/cuda11.7/toolkit/11.7.1/lib64/stubs -lcudart -lnvToolsExt >> -lcufft -lcublas -lcusparse -lcusolver -lcurand -lcuda >> CUDA SM 75 >> CUDA underlying compiler: >> CUDA_CXX="/path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/bin"/mpicxx >> CUDA underlying compiler flags: CUDA_CXXFLAGS= >> CUDA underlying linker libraries: CUDA_CXXLIBS= >> [...] >> Configure stage complete. Now build PETSc libraries with: >> make PETSC_DIR=/path/to/petsc PETSC_ARCH=arch-linux-c-opt all >> >> C++ compiler version: g++ (GCC) 10.2.0 >> Using C++ compiler to compile PETSc >> ----------------------------------------- >> Using C/C++ linker: >> /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/bin/mpicxx >> Using C/C++ flags: -Wall -Wwrite-strings -Wno-strict-aliasing >> -Wno-unknown-pragmas -Wno-lto-type-mismatch -fstack-protector >> -fvisibility=hidden -g -O0 >> ----------------------------------------- >> Using system modules: >> shared:slurm/20.02.6:DefaultModules:openmpi/gcc/64/4.1.1_cuda_11.0.3_aware:gdal/3.3.0:cmake/3.22.1:cuda11.7/toolkit/11.7.1:openblas/dynamic/0.3.7:gcc/10.2.0 >> Using mpi.h: # 1 >> "/path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/include/mpi.h" 1 >> ----------------------------------------- >> Using libraries: -Wl,-rpath,/path/to/petsc/arch-linux-cxx-debug/lib >> -L/path/to/petsc/arch-linux-cxx-debug/lib -lpetsc -lopenblas -lm -lX11 >> -lquadmath -lstdc++ -ldl >> ------------------------------------------ >> Using mpiexec: mpiexec -mca orte_base_help_aggregate 0 -mca pml ucx >> --mca btl '^openib' >> ------------------------------------------ >> Using MAKE: /path/to/petsc/arch-linux-cxx-debug/bin/make >> Using MAKEFLAGS: -j24 -l48.0 --no-print-directory -- MPIEXEC=mpiexec\ >> -mca\ orte_base_help_aggregate\ 0\ \ -mca\ pml\ ucx\ --mca\ btl\ '^openib' >> PETSC_ARCH=arch-linux-cxx-debug PETSC_DIR=/path/to/petsc >> ========================================== >> make[3]: Nothing to be done for 'libs'. >> ========================================= >> Now to check if the libraries are working do: >> make PETSC_DIR=/path/to/petsc PETSC_ARCH=arch-linux-cxx-debug check >> ========================================= >> [me at xxx petsc]$ make PETSC_DIR=/path/to/petsc >> PETSC_ARCH=arch-linux-cxx-debug MPIEXEC="mpiexec -mca >> orte_base_help_aggregate 0 -mca pml ucx --mca btl '^openib'" check >> Running check examples to verify correct installation >> Using PETSC_DIR=/path/to/petsc and PETSC_ARCH=arch-linux-cxx-debug >> C/C++ example src/snes/tutorials/ex19 run successfully with 1 MPI process >> C/C++ example src/snes/tutorials/ex19 run successfully with 2 MPI >> processes >> >> ./bandwidthTest >> [CUDA Bandwidth Test] - Starting... >> Running on... 
>> >> Device 0: Quadro RTX 8000 >> Quick Mode >> >> Host to Device Bandwidth, 1 Device(s) >> PINNED Memory Transfers >> Transfer Size (Bytes) Bandwidth(GB/s) >> 32000000 12.3 >> >> Device to Host Bandwidth, 1 Device(s) >> PINNED Memory Transfers >> Transfer Size (Bytes) Bandwidth(GB/s) >> 32000000 13.2 >> >> Device to Device Bandwidth, 1 Device(s) >> PINNED Memory Transfers >> Transfer Size (Bytes) Bandwidth(GB/s) >> 32000000 466.2 >> >> Result = PASS >> >> On Sat, Oct 8, 2022 at 7:56 PM Barry Smith wrote: >> >>> >>> True, but when users send reports back to us they will never have used >>> the VERBOSE=1 option, so it requires one more round trip of email to get >>> this additional information. >>> >>> > On Oct 8, 2022, at 6:48 PM, Jed Brown wrote: >>> > >>> > Barry Smith writes: >>> > >>> >> I hate these kinds of make rules that hide what the compiler is >>> doing (in the name of having less output, I guess) it makes it difficult to >>> figure out what is going wrong. >>> > >>> > You can make VERBOSE=1 with CMake-generated makefiles. >>> >> >> >>> Anyways, either some of the MPI libraries are missing from the link line >>> or they are in the wrong order and thus it is not able to search them >>> properly. Here is a bunch of discussions on why that error message can >>> appear >>> https://stackoverflow.com/questions/19901934/libpthread-so-0-error-adding-symbols-dso-missing-from-command-line >>> >> >> >> Still same but more noise and I have been using the suggestion of >> LDFLAGS="-Wl,--copy-dt-needed-entries" along with make: >> make[2]: Entering directory '/path/to/WTM/build' >> cd /path/to/WTM/build && >> /path/to/cmake/cmake-3.22.1-linux-x86_64/bin/cmake -E cmake_depends "Unix >> Makefiles" /path/to/WTM /path/to/WTM /path/to/WTM/build /path/to/WTM/build >> /path/to/WTM/build/CMakeFiles/wtm.x.dir/DependInfo.cmake --color= >> make[2]: Leaving directory '/path/to/WTM/build' >> make -f CMakeFiles/wtm.x.dir/build.make CMakeFiles/wtm.x.dir/build >> make[2]: Entering directory '/path/to/WTM/build' >> [ 66%] Building CXX object CMakeFiles/wtm.x.dir/src/WTM.cpp.o >> /cm/local/apps/gcc/10.2.0/bin/c++ -I/path/to/WTM/common/richdem/include >> -I/path/to/gdal-3.3.0/include -I/path/to/WTM/common/fmt/include -isystem >> /path/to/petsc/arch-linux-cxx-debug/include -isystem /path/to/petsc/include >> -isystem -O3 -g -Wall -Wextra -pedantic -Wshadow -Wfloat-conversion -Wall >> -Wextra -pedantic -Wshadow -DRICHDEM_GIT_HASH=\"xxx\" >> -DRICHDEM_COMPILE_TIME=\"2022-10-09T02:21:11Z\" -DUSEGDAL -Xpreprocessor >> -fopenmp >> /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libmpi.so.40.30.1 >> -I/path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/include -std=gnu++2a -MD >> -MT CMakeFiles/wtm.x.dir/src/WTM.cpp.o -MF >> CMakeFiles/wtm.x.dir/src/WTM.cpp.o.d -o CMakeFiles/wtm.x.dir/src/WTM.cpp.o >> -c /path/to/WTM/src/WTM.cpp >> c++: warning: >> /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libmpi.so.40.30.1: >> linker input file unused because linking not done >> [ 70%] Linking CXX executable wtm.x >> /path/to/cmake/cmake-3.22.1-linux-x86_64/bin/cmake -E cmake_link_script >> CMakeFiles/wtm.x.dir/link.txt --verbose=1 >> /cm/local/apps/gcc/10.2.0/bin/c++ -isystem -O3 -g -Wall -Wextra -pedantic >> -Wshadow CMakeFiles/wtm.x.dir/src/WTM.cpp.o -o wtm.x >> -Wl,-rpath,/path/to/WTM/build/common/richdem:/path/to/gdal-3.3.0/lib:/path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib:/path/to/petsc/arch-linux-cxx-debug/lib >> libwtm.a common/richdem/librichdem.so /path/to/gdal-3.3.0/lib/libgdal.so >> 
/path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0 >> common/fmt/libfmt.a /path/to/petsc/arch-linux-cxx-debug/lib/libpetsc.so >> /usr/bin/ld: CMakeFiles/wtm.x.dir/src/WTM.cpp.o: undefined reference to >> symbol 'ompi_mpi_comm_self' >> /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libmpi.so.40: error >> adding symbols: DSO missing from command line >> collect2: error: ld returned 1 exit status >> make[2]: *** [CMakeFiles/wtm.x.dir/build.make:103: wtm.x] Error 1 >> make[2]: Leaving directory '/path/to/WTM/build' >> make[1]: *** [CMakeFiles/Makefile2:225: CMakeFiles/wtm.x.dir/all] Error 2 >> make[1]: Leaving directory '/path/to/WTM/build' >> make: *** [Makefile:136: all] Error 2 >> >> Anything stick out? >> > -------------- next part -------------- An HTML attachment was scrubbed... URL: From junchao.zhang at gmail.com Sun Oct 9 22:02:00 2022 From: junchao.zhang at gmail.com (Junchao Zhang) Date: Sun, 9 Oct 2022 22:02:00 -0500 Subject: [petsc-users] suppress CUDA warning & choose MCA parameter for mpirun during make PETSC_ARCH=arch-linux-c-debug check In-Reply-To: References: <39e71ae6-e943-c558-44af-0992089d6151@mcs.anl.gov> <16EE4635-0A03-45AA-92AD-1926907F4B8E@petsc.dev> <878rlqx779.fsf@jedbrown.org> <599F31B3-BE61-4928-871F-D773289D5497@petsc.dev> Message-ID: OK, let's walk back and don't use -DCMAKE_C_COMPILER=/path/to/mpicc libompitrace.so.40.30.0 is not the OpenMP library; it is the tracing library for OpenMPI, https://github.com/open-mpi/ompi/issues/10036 In your previous email, there was /path/to/cmake/cmake-3.22.1-linux-x86_64/bin/cmake -E cmake_link_script CMakeFiles/wtm.x.dir/link.txt --verbose=1 /cm/local/apps/gcc/10.2.0/bin/c++ -isystem -O3 -g -Wall -Wextra -pedantic -Wshadow CMakeFiles/wtm.x.dir/src/WTM.cpp.o -o wtm.x -Wl,-rpath,/path/to/WTM/build/common/richdem:/path/to/ gdal-3.3.0/lib:/path/to/openmpi-4.1.1_ucx_cuda_11.0.3_ support/lib:/path/to/petsc/arch-linux-cxx-debug/lib libwtm.a common/richdem/librichdem.so /path/to/gdal-3.3.0/lib/libgdal.so /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0 common/fmt/libfmt.a /path/to/petsc/arch-linux-cxx-debug/lib/libpetsc.so /usr/bin/ld: CMakeFiles/wtm.x.dir/src/WTM.cpp.o: undefined reference to symbol 'ompi_mpi_comm_self' /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libmpi.so.40: error adding symbols: DSO missing from command line Let's try to add -lmpi (or /path/to/openmpi-4.1.1_ucx_ cuda_11.0.3_support/lib/libmpi.so) manually to see if it links /cm/local/apps/gcc/10.2.0/bin/c++ -isystem -O3 -g -Wall -Wextra -pedantic -Wshadow CMakeFiles/wtm.x.dir/src/WTM.cpp.o -o wtm.x -Wl,-rpath,/path/to/WTM/build/common/richdem:/path/to/ gdal-3.3.0/lib:/path/to/openmpi-4.1.1_ucx_cuda_11.0.3_ support/lib:/path/to/petsc/arch-linux-cxx-debug/lib libwtm.a common/richdem/librichdem.so /path/to/gdal-3.3.0/lib/libgdal.so /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0 common/fmt/libfmt.a /path/to/petsc/arch-linux-cxx-debug/lib/libpetsc.so -lmpi On Sun, Oct 9, 2022 at 9:28 PM Rob Kudyba wrote: > I did have -DMPI_CXX_COMPILER set, so I added -DCMAKE_C_COMPILER and now > get these errors: > > [ 25%] Linking CXX shared library librichdem.so > /lib/../lib64/crt1.o: In function `_start': > (.text+0x24): undefined reference to `main' > CMakeFiles/richdem.dir/src/random.cpp.o: In function > `richdem::rand_engine()': > random.cpp:(.text+0x45): undefined reference to `omp_get_thread_num' > CMakeFiles/richdem.dir/src/random.cpp.o: In function > `richdem::seed_rand(unsigned 
long)': > random.cpp:(.text+0xb6): undefined reference to `GOMP_parallel' > CMakeFiles/richdem.dir/src/random.cpp.o: In function > `richdem::uniform_rand_int(int, int)': > random.cpp:(.text+0x10c): undefined reference to `omp_get_thread_num' > CMakeFiles/richdem.dir/src/random.cpp.o: In function > `richdem::uniform_rand_real(double, double)': > random.cpp:(.text+0x1cb): undefined reference to `omp_get_thread_num' > CMakeFiles/richdem.dir/src/random.cpp.o: In function > `richdem::normal_rand(double, double)': > random.cpp:(.text+0x29e): undefined reference to `omp_get_thread_num' > CMakeFiles/richdem.dir/src/random.cpp.o: In function > `richdem::seed_rand(unsigned long) [clone ._omp_fn.0]': > random.cpp:(.text+0x4a3): undefined reference to `GOMP_critical_start' > random.cpp:(.text+0x4b1): undefined reference to `GOMP_critical_end' > random.cpp:(.text+0x4c3): undefined reference to `omp_get_thread_num' > /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: > undefined reference to `PMPI_Comm_rank' > /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: > undefined reference to `PMPI_Get_address' > /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: > undefined reference to `PMPI_Comm_get_name' > /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: > undefined reference to `PMPI_Add_error_string' > /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: > undefined reference to `PMPI_Type_get_name' > /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: > undefined reference to `PMPI_Abort' > /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: > undefined reference to `PMPI_Alloc_mem' > /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: > undefined reference to `PMPI_Isend' > /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: > undefined reference to `PMPI_Barrier' > /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: > undefined reference to `PMPI_Allgather' > /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: > undefined reference to `PMPI_Reduce' > /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: > undefined reference to `PMPI_Send' > /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: > undefined reference to `PMPI_Init' > /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: > undefined reference to `PMPI_Type_size' > /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: > undefined reference to `PMPI_Accumulate' > /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: > undefined reference to `PMPI_Add_error_class' > /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: > undefined reference to `PMPI_Finalize' > /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: > undefined reference to `PMPI_Allgatherv' > /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: > undefined reference to `PMPI_Bcast' > /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: > undefined reference to `PMPI_Recv' > /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: > undefined reference to `PMPI_Request_free' > /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: > undefined reference to `PMPI_Allreduce' > 
/path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: > undefined reference to `ompi_mpi_comm_world' > /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: > undefined reference to `PMPI_Sendrecv' > /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: > undefined reference to `PMPI_Add_error_code' > /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: > undefined reference to `PMPI_Win_get_name' > collect2: error: ld returned 1 exit status > make[2]: *** [common/richdem/CMakeFiles/richdem.dir/build.make:163: > common/richdem/librichdem.so] Error 1 > make[1]: *** [CMakeFiles/Makefile2:306: > common/richdem/CMakeFiles/richdem.dir/all] Error 2 > make: *** [Makefile:136: all] Error 2 > > I took a guess at using -DOpenMP_libomp_LIBRARY="/path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0" > as > as otherwise I'd get: > CMake Error at > /path/to/cmake/cmake-3.22.1-linux-x86_64/share/cmake-3.22/Modules/FindPackageHandleStandardArgs.cmake:230 > (message): > Could NOT find OpenMP_CXX (missing: OpenMP_libomp_LIBRARY > OpenMP_libomp_LIBRARY) (found version "4.5") > > So perhaps that's the real problem? > > On Sun, Oct 9, 2022 at 9:31 PM Junchao Zhang > wrote: > >> In the last link step to generate the executable >> /cm/local/apps/gcc/10.2.0/bin/c++ -isystem -O3 -g -Wall -Wextra >> -pedantic -Wshadow CMakeFiles/wtm.x.dir/src/WTM.cpp.o -o wtm.x >> -Wl,-rpath,/path/to/WTM/build/common/richdem:/path/to/ >> gdal-3.3.0/lib:/path/to/openmpi-4.1.1_ucx_cuda_11.0.3_ >> support/lib:/path/to/petsc/arch-linux-cxx-debug/lib libwtm.a >> common/richdem/librichdem.so /path/to/gdal-3.3.0/lib/libgdal.so >> /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0 >> common/fmt/libfmt.a /path/to/petsc/arch-linux-cxx-debug/lib/libpetsc.so >> >> I did not find -lmpi to link in the mpi library. You can try to use cmake >> -DCMAKE_C_COMPILER=/path/to/mpicc -DCMAKE_CXX_COMPILER=/path/to/mpicxx to >> build your code >> >> On Sat, Oct 8, 2022 at 9:32 PM Rob Kudyba wrote: >> >>> Perhaps we can back one step: >>>> Use your mpicc to build a "hello world" mpi test, then run it on a >>>> compute node (with GPU) to see if it works. >>>> If no, then your MPI environment has problems; >>>> If yes, then use it to build petsc (turn on petsc's gpu support, >>>> --with-cuda --with-cudac=nvcc), and then your code. >>>> --Junchao Zhang >>> >>> OK tried this just to eliminate that the CUDA-capable OpenMPI is a >>> factor: >>> ./configure --with-debugging=0 --with-cmake=true --with-mpi=true >>> --with-mpi-dir=/path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support --with-fc=0 >>> --with-cuda=1 >>> [..] >>> cuda: >>> Version: 11.7 >>> Includes: -I/path/to/cuda11.7/toolkit/11.7.1/include >>> Libraries: -Wl,-rpath,/path/to/cuda11.7/toolkit/11.7.1/lib64 >>> -L/cm/shared/apps/cuda11.7/toolkit/11.7.1/lib64 >>> -L/path/to/cuda11.7/toolkit/11.7.1/lib64/stubs -lcudart -lnvToolsExt >>> -lcufft -lcublas -lcusparse -lcusolver -lcurand -lcuda >>> CUDA SM 75 >>> CUDA underlying compiler: >>> CUDA_CXX="/path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/bin"/mpicxx >>> CUDA underlying compiler flags: CUDA_CXXFLAGS= >>> CUDA underlying linker libraries: CUDA_CXXLIBS= >>> [...] >>> Configure stage complete. 
Now build PETSc libraries with: >>> make PETSC_DIR=/path/to/petsc PETSC_ARCH=arch-linux-c-opt all >>> >>> C++ compiler version: g++ (GCC) 10.2.0 >>> Using C++ compiler to compile PETSc >>> ----------------------------------------- >>> Using C/C++ linker: >>> /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/bin/mpicxx >>> Using C/C++ flags: -Wall -Wwrite-strings -Wno-strict-aliasing >>> -Wno-unknown-pragmas -Wno-lto-type-mismatch -fstack-protector >>> -fvisibility=hidden -g -O0 >>> ----------------------------------------- >>> Using system modules: >>> shared:slurm/20.02.6:DefaultModules:openmpi/gcc/64/4.1.1_cuda_11.0.3_aware:gdal/3.3.0:cmake/3.22.1:cuda11.7/toolkit/11.7.1:openblas/dynamic/0.3.7:gcc/10.2.0 >>> Using mpi.h: # 1 >>> "/path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/include/mpi.h" 1 >>> ----------------------------------------- >>> Using libraries: -Wl,-rpath,/path/to/petsc/arch-linux-cxx-debug/lib >>> -L/path/to/petsc/arch-linux-cxx-debug/lib -lpetsc -lopenblas -lm -lX11 >>> -lquadmath -lstdc++ -ldl >>> ------------------------------------------ >>> Using mpiexec: mpiexec -mca orte_base_help_aggregate 0 -mca pml ucx >>> --mca btl '^openib' >>> ------------------------------------------ >>> Using MAKE: /path/to/petsc/arch-linux-cxx-debug/bin/make >>> Using MAKEFLAGS: -j24 -l48.0 --no-print-directory -- MPIEXEC=mpiexec\ >>> -mca\ orte_base_help_aggregate\ 0\ \ -mca\ pml\ ucx\ --mca\ btl\ '^openib' >>> PETSC_ARCH=arch-linux-cxx-debug PETSC_DIR=/path/to/petsc >>> ========================================== >>> make[3]: Nothing to be done for 'libs'. >>> ========================================= >>> Now to check if the libraries are working do: >>> make PETSC_DIR=/path/to/petsc PETSC_ARCH=arch-linux-cxx-debug check >>> ========================================= >>> [me at xxx petsc]$ make PETSC_DIR=/path/to/petsc >>> PETSC_ARCH=arch-linux-cxx-debug MPIEXEC="mpiexec -mca >>> orte_base_help_aggregate 0 -mca pml ucx --mca btl '^openib'" check >>> Running check examples to verify correct installation >>> Using PETSC_DIR=/path/to/petsc and PETSC_ARCH=arch-linux-cxx-debug >>> C/C++ example src/snes/tutorials/ex19 run successfully with 1 MPI process >>> C/C++ example src/snes/tutorials/ex19 run successfully with 2 MPI >>> processes >>> >>> ./bandwidthTest >>> [CUDA Bandwidth Test] - Starting... >>> Running on... >>> >>> Device 0: Quadro RTX 8000 >>> Quick Mode >>> >>> Host to Device Bandwidth, 1 Device(s) >>> PINNED Memory Transfers >>> Transfer Size (Bytes) Bandwidth(GB/s) >>> 32000000 12.3 >>> >>> Device to Host Bandwidth, 1 Device(s) >>> PINNED Memory Transfers >>> Transfer Size (Bytes) Bandwidth(GB/s) >>> 32000000 13.2 >>> >>> Device to Device Bandwidth, 1 Device(s) >>> PINNED Memory Transfers >>> Transfer Size (Bytes) Bandwidth(GB/s) >>> 32000000 466.2 >>> >>> Result = PASS >>> >>> On Sat, Oct 8, 2022 at 7:56 PM Barry Smith wrote: >>> >>>> >>>> True, but when users send reports back to us they will never have >>>> used the VERBOSE=1 option, so it requires one more round trip of email to >>>> get this additional information. >>>> >>>> > On Oct 8, 2022, at 6:48 PM, Jed Brown wrote: >>>> > >>>> > Barry Smith writes: >>>> > >>>> >> I hate these kinds of make rules that hide what the compiler is >>>> doing (in the name of having less output, I guess) it makes it difficult to >>>> figure out what is going wrong. >>>> > >>>> > You can make VERBOSE=1 with CMake-generated makefiles. 
>>>> >>> >>> >>>> Anyways, either some of the MPI libraries are missing from the link >>>> line or they are in the wrong order and thus it is not able to search them >>>> properly. Here is a bunch of discussions on why that error message can >>>> appear >>>> https://stackoverflow.com/questions/19901934/libpthread-so-0-error-adding-symbols-dso-missing-from-command-line >>>> >>> >>> >>> Still same but more noise and I have been using the suggestion of >>> LDFLAGS="-Wl,--copy-dt-needed-entries" along with make: >>> make[2]: Entering directory '/path/to/WTM/build' >>> cd /path/to/WTM/build && >>> /path/to/cmake/cmake-3.22.1-linux-x86_64/bin/cmake -E cmake_depends "Unix >>> Makefiles" /path/to/WTM /path/to/WTM /path/to/WTM/build /path/to/WTM/build >>> /path/to/WTM/build/CMakeFiles/wtm.x.dir/DependInfo.cmake --color= >>> make[2]: Leaving directory '/path/to/WTM/build' >>> make -f CMakeFiles/wtm.x.dir/build.make CMakeFiles/wtm.x.dir/build >>> make[2]: Entering directory '/path/to/WTM/build' >>> [ 66%] Building CXX object CMakeFiles/wtm.x.dir/src/WTM.cpp.o >>> /cm/local/apps/gcc/10.2.0/bin/c++ -I/path/to/WTM/common/richdem/include >>> -I/path/to/gdal-3.3.0/include -I/path/to/WTM/common/fmt/include -isystem >>> /path/to/petsc/arch-linux-cxx-debug/include -isystem /path/to/petsc/include >>> -isystem -O3 -g -Wall -Wextra -pedantic -Wshadow -Wfloat-conversion -Wall >>> -Wextra -pedantic -Wshadow -DRICHDEM_GIT_HASH=\"xxx\" >>> -DRICHDEM_COMPILE_TIME=\"2022-10-09T02:21:11Z\" -DUSEGDAL -Xpreprocessor >>> -fopenmp >>> /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libmpi.so.40.30.1 >>> -I/path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/include -std=gnu++2a -MD >>> -MT CMakeFiles/wtm.x.dir/src/WTM.cpp.o -MF >>> CMakeFiles/wtm.x.dir/src/WTM.cpp.o.d -o CMakeFiles/wtm.x.dir/src/WTM.cpp.o >>> -c /path/to/WTM/src/WTM.cpp >>> c++: warning: >>> /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libmpi.so.40.30.1: >>> linker input file unused because linking not done >>> [ 70%] Linking CXX executable wtm.x >>> /path/to/cmake/cmake-3.22.1-linux-x86_64/bin/cmake -E cmake_link_script >>> CMakeFiles/wtm.x.dir/link.txt --verbose=1 >>> /cm/local/apps/gcc/10.2.0/bin/c++ -isystem -O3 -g -Wall -Wextra >>> -pedantic -Wshadow CMakeFiles/wtm.x.dir/src/WTM.cpp.o -o wtm.x >>> -Wl,-rpath,/path/to/WTM/build/common/richdem:/path/to/gdal-3.3.0/lib:/path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib:/path/to/petsc/arch-linux-cxx-debug/lib >>> libwtm.a common/richdem/librichdem.so /path/to/gdal-3.3.0/lib/libgdal.so >>> /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0 >>> common/fmt/libfmt.a /path/to/petsc/arch-linux-cxx-debug/lib/libpetsc.so >>> /usr/bin/ld: CMakeFiles/wtm.x.dir/src/WTM.cpp.o: undefined reference to >>> symbol 'ompi_mpi_comm_self' >>> /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libmpi.so.40: error >>> adding symbols: DSO missing from command line >>> collect2: error: ld returned 1 exit status >>> make[2]: *** [CMakeFiles/wtm.x.dir/build.make:103: wtm.x] Error 1 >>> make[2]: Leaving directory '/path/to/WTM/build' >>> make[1]: *** [CMakeFiles/Makefile2:225: CMakeFiles/wtm.x.dir/all] Error 2 >>> make[1]: Leaving directory '/path/to/WTM/build' >>> make: *** [Makefile:136: all] Error 2 >>> >>> Anything stick out? >>> >> -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From rk3199 at columbia.edu Mon Oct 10 08:12:53 2022 From: rk3199 at columbia.edu (Rob Kudyba) Date: Mon, 10 Oct 2022 09:12:53 -0400 Subject: [petsc-users] suppress CUDA warning & choose MCA parameter for mpirun during make PETSC_ARCH=arch-linux-c-debug check In-Reply-To: References: <39e71ae6-e943-c558-44af-0992089d6151@mcs.anl.gov> <16EE4635-0A03-45AA-92AD-1926907F4B8E@petsc.dev> <878rlqx779.fsf@jedbrown.org> <599F31B3-BE61-4928-871F-D773289D5497@petsc.dev> Message-ID: > > OK, let's walk back and don't use -DCMAKE_C_COMPILER=/path/to/mpicc > Will do > libompitrace.so.40.30.0 is not the OpenMP library; it is the tracing > library for OpenMPI, https://github.com/open-mpi/ompi/issues/10036 > Does that mean I should remove this option in the cmake command? > In your previous email, there was > > /path/to/cmake/cmake-3.22.1-linux-x86_64/bin/cmake -E cmake_link_script > CMakeFiles/wtm.x.dir/link.txt --verbose=1 > /cm/local/apps/gcc/10.2.0/bin/c++ -isystem -O3 -g -Wall -Wextra -pedantic > -Wshadow CMakeFiles/wtm.x.dir/src/WTM.cpp.o -o wtm.x > -Wl,-rpath,/path/to/WTM/build/common/richdem:/path/to/ > gdal-3.3.0/lib:/path/to/openmpi-4.1.1_ucx_cuda_11.0.3_ > support/lib:/path/to/petsc/arch-linux-cxx-debug/lib libwtm.a > common/richdem/librichdem.so /path/to/gdal-3.3.0/lib/libgdal.so > /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0 > common/fmt/libfmt.a /path/to/petsc/arch-linux-cxx-debug/lib/libpetsc.so > /usr/bin/ld: CMakeFiles/wtm.x.dir/src/WTM.cpp.o: undefined reference to > symbol 'ompi_mpi_comm_self' > /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libmpi.so.40: error > adding symbols: DSO missing from command line > > > Let's try to add -lmpi (or /path/to/openmpi-4.1.1_ucx_ > cuda_11.0.3_support/lib/libmpi.so) manually to see if it links > > /cm/local/apps/gcc/10.2.0/bin/c++ -isystem -O3 -g -Wall -Wextra -pedantic > -Wshadow CMakeFiles/wtm.x.dir/src/WTM.cpp.o -o wtm.x > -Wl,-rpath,/path/to/WTM/build/common/richdem:/path/to/ > gdal-3.3.0/lib:/path/to/openmpi-4.1.1_ucx_cuda_11.0.3_ > support/lib:/path/to/petsc/arch-linux-cxx-debug/lib libwtm.a > common/richdem/librichdem.so /path/to/gdal-3.3.0/lib/libgdal.so > /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0 > common/fmt/libfmt.a /path/to/petsc/arch-linux-cxx-debug/lib/libpetsc.so > -lmpi > so just adding that to the make command? 
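On the OpenMP question: if I understand CMake's FindOpenMP correctly, OpenMP_libomp_LIBRARY is only asked for when it decides a standalone libomp is needed (as with Clang); with GCC the runtime is just -fopenmp/libgomp, so I suspect I should not be setting that variable by hand at all. As a note to myself, a minimal sketch of what I think richdem's CMakeLists.txt would want instead -- purely a guess on my part, the target name richdem is read off the build output and I have not checked the actual file:

  find_package(OpenMP REQUIRED)
  # The OpenMP::OpenMP_CXX imported target should add the right OpenMP
  # flags and runtime library to both the compile and link steps, which
  # ought to clear the undefined GOMP_parallel / omp_get_thread_num
  # references seen when librichdem.so is linked.
  target_link_libraries(richdem PUBLIC OpenMP::OpenMP_CXX)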
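And for the MPI symbols, rather than splicing libmpi into the compiler flags or the make command, I assume the CMake-native fix is something along these lines in WTM's CMakeLists.txt -- again only a sketch: wtm.x is the executable target name as it appears in the link output, and the MPI::MPI_CXX imported target should exist in a CMake as new as our 3.22:

  find_package(MPI REQUIRED COMPONENTS CXX)
  # MPI::MPI_CXX carries the MPI include directories and the libmpi link
  # entry, which should resolve the undefined 'ompi_mpi_comm_self' /
  # "DSO missing from command line" error when wtm.x is linked.
  target_link_libraries(wtm.x PRIVATE MPI::MPI_CXX)

One other thing I notice, in case it matters: the failing compile/link lines all contain a bare -isystem with no directory after it, so the next flag may be getting swallowed as its argument -- that could also explain the strange crt1.o "undefined reference to `main'" when librichdem.so is linked, since -shared would not be taking effect. I have not tried the CMake change yet, though; for now I just appended -lmpi to the make invocation.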
Still seeing linking errors: make VERBOSE=1 LDFLAGS="-Wl,--copy-dt-needed-entries" -lmpi /path/to/cmake/cmake-3.22.1-linux-x86_64/bin/cmake -S/path/to/WTM -B/path/to/WTM/build --check-build-system CMakeFiles/Makefile.cmake 0 /path/to/cmake/cmake-3.22.1-linux-x86_64/bin/cmake -E cmake_progress_start /path/to/WTM/build/CMakeFiles /path/to/WTM/build//CMakeFiles/progress.marks make -f CMakeFiles/Makefile2 all make[1]: Entering directory '/path/to/WTM/build' make -f common/fmt/CMakeFiles/fmt.dir/build.make common/fmt/CMakeFiles/fmt.dir/depend make[2]: Entering directory '/path/to/WTM/build' cd /path/to/WTM/build && /path/to/cmake/cmake-3.22.1-linux-x86_64/bin/cmake -E cmake_depends "Unix Makefiles" /path/to/WTM /path/to/WTM/common/fmt /path/to/WTM/build /path/to/WTM/build/common/fmt /path/to/WTM/build/common/fmt/CMakeFiles/fmt.dir/DependInfo.cmake --color= make[2]: Leaving directory '/path/to/WTM/build' make -f common/fmt/CMakeFiles/fmt.dir/build.make common/fmt/CMakeFiles/fmt.dir/build make[2]: Entering directory '/path/to/WTM/build' [ 4%] Building CXX object common/fmt/CMakeFiles/fmt.dir/src/format.cc.o cd /path/to/WTM/build/common/fmt && /cm/local/apps/gcc/10.2.0/bin/c++ -I/path/to/WTM/common/fmt/include -isystem -std=gnu++11 -MD -MT common/fmt/CMakeFiles/fmt.dir/src/format.cc.o -MF CMakeFiles/fmt.dir/src/format.cc.o.d -o CMakeFiles/fmt.dir/src/format.cc.o -c /path/to/WTM/common/fmt/src/format.cc [ 8%] Building CXX object common/fmt/CMakeFiles/fmt.dir/src/os.cc.o cd /path/to/WTM/build/common/fmt && /cm/local/apps/gcc/10.2.0/bin/c++ -I/path/to/WTM/common/fmt/include -isystem -std=gnu++11 -MD -MT common/fmt/CMakeFiles/fmt.dir/src/os.cc.o -MF CMakeFiles/fmt.dir/src/os.cc.o.d -o CMakeFiles/fmt.dir/src/os.cc.o -c /path/to/WTM/common/fmt/src/os.cc [ 12%] Linking CXX static library libfmt.a cd /path/to/WTM/build/common/fmt && /path/to/cmake/cmake-3.22.1-linux-x86_64/bin/cmake -P CMakeFiles/fmt.dir/cmake_clean_target.cmake cd /path/to/WTM/build/common/fmt && /path/to/cmake/cmake-3.22.1-linux-x86_64/bin/cmake -E cmake_link_script CMakeFiles/fmt.dir/link.txt --verbose=1 /usr/bin/ar qc libfmt.a CMakeFiles/fmt.dir/src/format.cc.o CMakeFiles/fmt.dir/src/os.cc.o /usr/bin/ranlib libfmt.a make[2]: Leaving directory '/path/to/WTM/build' [ 12%] Built target fmt make -f common/richdem/CMakeFiles/richdem.dir/build.make common/richdem/CMakeFiles/richdem.dir/depend make[2]: Entering directory '/path/to/WTM/build' cd /path/to/WTM/build && /path/to/cmake/cmake-3.22.1-linux-x86_64/bin/cmake -E cmake_depends "Unix Makefiles" /path/to/WTM /path/to/WTM/common/richdem /path/to/WTM/build /path/to/WTM/build/common/richdem /path/to/WTM/build/common/richdem/CMakeFiles/richdem.dir/DependInfo.cmake --color= make[2]: Leaving directory '/path/to/WTM/build' make -f common/richdem/CMakeFiles/richdem.dir/build.make common/richdem/CMakeFiles/richdem.dir/build make[2]: Entering directory '/path/to/WTM/build' [ 16%] Building CXX object common/richdem/CMakeFiles/richdem.dir/src/richdem.cpp.o cd /path/to/WTM/build/common/richdem && /cm/local/apps/gcc/10.2.0/bin/c++ -Drichdem_EXPORTS -I/path/to/WTM/common/richdem/include -I/path/to/gdal-3.3.0/include -isystem -fPIC -DRICHDEM_GIT_HASH=\"3313b290725509d694da1fba83d0f32cca68cc70\" -DRICHDEM_COMPILE_TIME=\"2022-10-10T13:09:39Z\" -DUSEGDAL -Xpreprocessor -fopenmp /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libmpi.so.40.30.1 -I/path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/include -std=gnu++17 -MD -MT common/richdem/CMakeFiles/richdem.dir/src/richdem.cpp.o -MF 
CMakeFiles/richdem.dir/src/richdem.cpp.o.d -o CMakeFiles/richdem.dir/src/richdem.cpp.o -c /path/to/WTM/common/richdem/src/richdem.cpp c++: warning: /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libmpi.so.40.30.1: linker input file unused because linking not done [ 20%] Building CXX object common/richdem/CMakeFiles/richdem.dir/src/random.cpp.o cd /path/to/WTM/build/common/richdem && /cm/local/apps/gcc/10.2.0/bin/c++ -Drichdem_EXPORTS -I/path/to/WTM/common/richdem/include -I/path/to/gdal-3.3.0/include -isystem -fPIC -DRICHDEM_GIT_HASH=\"3313b290725509d694da1fba83d0f32cca68cc70\" -DRICHDEM_COMPILE_TIME=\"2022-10-10T13:09:39Z\" -DUSEGDAL -Xpreprocessor -fopenmp /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libmpi.so.40.30.1 -I/path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/include -std=gnu++17 -MD -MT common/richdem/CMakeFiles/richdem.dir/src/random.cpp.o -MF CMakeFiles/richdem.dir/src/random.cpp.o.d -o CMakeFiles/richdem.dir/src/random.cpp.o -c /path/to/WTM/common/richdem/src/random.cpp c++: warning: /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libmpi.so.40.30.1: linker input file unused because linking not done [ 25%] Building CXX object common/richdem/CMakeFiles/richdem.dir/src/gdal.cpp.o cd /path/to/WTM/build/common/richdem && /cm/local/apps/gcc/10.2.0/bin/c++ -Drichdem_EXPORTS -I/path/to/WTM/common/richdem/include -I/path/to/gdal-3.3.0/include -isystem -fPIC -DRICHDEM_GIT_HASH=\"3313b290725509d694da1fba83d0f32cca68cc70\" -DRICHDEM_COMPILE_TIME=\"2022-10-10T13:09:39Z\" -DUSEGDAL -Xpreprocessor -fopenmp /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libmpi.so.40.30.1 -I/path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/include -std=gnu++17 -MD -MT common/richdem/CMakeFiles/richdem.dir/src/gdal.cpp.o -MF CMakeFiles/richdem.dir/src/gdal.cpp.o.d -o CMakeFiles/richdem.dir/src/gdal.cpp.o -c /path/to/WTM/common/richdem/src/gdal.cpp c++: warning: /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libmpi.so.40.30.1: linker input file unused because linking not done [ 29%] Building CXX object common/richdem/CMakeFiles/richdem.dir/src/terrain_generation/terrain_generation.cpp.o cd /path/to/WTM/build/common/richdem && /cm/local/apps/gcc/10.2.0/bin/c++ -Drichdem_EXPORTS -I/path/to/WTM/common/richdem/include -I/path/to/gdal-3.3.0/include -isystem -fPIC -DRICHDEM_GIT_HASH=\"3313b290725509d694da1fba83d0f32cca68cc70\" -DRICHDEM_COMPILE_TIME=\"2022-10-10T13:09:39Z\" -DUSEGDAL -Xpreprocessor -fopenmp /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libmpi.so.40.30.1 -I/path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/include -std=gnu++17 -MD -MT common/richdem/CMakeFiles/richdem.dir/src/terrain_generation/terrain_generation.cpp.o -MF CMakeFiles/richdem.dir/src/terrain_generation/terrain_generation.cpp.o.d -o CMakeFiles/richdem.dir/src/terrain_generation/terrain_generation.cpp.o -c /path/to/WTM/common/richdem/src/terrain_generation/terrain_generation.cpp c++: warning: /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libmpi.so.40.30.1: linker input file unused because linking not done [ 33%] Building CXX object common/richdem/CMakeFiles/richdem.dir/src/terrain_generation/PerlinNoise.cpp.o cd /path/to/WTM/build/common/richdem && /cm/local/apps/gcc/10.2.0/bin/c++ -Drichdem_EXPORTS -I/path/to/WTM/common/richdem/include -I/path/to/gdal-3.3.0/include -isystem -fPIC -DRICHDEM_GIT_HASH=\"3313b290725509d694da1fba83d0f32cca68cc70\" -DRICHDEM_COMPILE_TIME=\"2022-10-10T13:09:39Z\" -DUSEGDAL -Xpreprocessor -fopenmp /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libmpi.so.40.30.1 
-I/path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/include -std=gnu++17 -MD -MT common/richdem/CMakeFiles/richdem.dir/src/terrain_generation/PerlinNoise.cpp.o -MF CMakeFiles/richdem.dir/src/terrain_generation/PerlinNoise.cpp.o.d -o CMakeFiles/richdem.dir/src/terrain_generation/PerlinNoise.cpp.o -c /path/to/WTM/common/richdem/src/terrain_generation/PerlinNoise.cpp c++: warning: /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libmpi.so.40.30.1: linker input file unused because linking not done [ 37%] Linking CXX shared library librichdem.so cd /path/to/WTM/build/common/richdem && /path/to/cmake/cmake-3.22.1-linux-x86_64/bin/cmake -E cmake_link_script CMakeFiles/richdem.dir/link.txt --verbose=1 /cm/local/apps/gcc/10.2.0/bin/c++ -fPIC -isystem -shared -Wl,-soname,librichdem.so -o librichdem.so CMakeFiles/richdem.dir/src/richdem.cpp.o CMakeFiles/richdem.dir/src/random.cpp.o CMakeFiles/richdem.dir/src/gdal.cpp.o CMakeFiles/richdem.dir/src/terrain_generation/terrain_generation.cpp.o CMakeFiles/richdem.dir/src/terrain_generation/PerlinNoise.cpp.o -Wl,-rpath,/path/to/gdal-3.3.0/lib:/path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib: /path/to/gdal-3.3.0/lib/libgdal.so /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0 /lib/../lib64/crt1.o: In function `_start': (.text+0x24): undefined reference to `main' CMakeFiles/richdem.dir/src/random.cpp.o: In function `richdem::rand_engine()': random.cpp:(.text+0x45): undefined reference to `omp_get_thread_num' CMakeFiles/richdem.dir/src/random.cpp.o: In function `richdem::seed_rand(unsigned long)': random.cpp:(.text+0xb6): undefined reference to `GOMP_parallel' CMakeFiles/richdem.dir/src/random.cpp.o: In function `richdem::uniform_rand_int(int, int)': random.cpp:(.text+0x10c): undefined reference to `omp_get_thread_num' CMakeFiles/richdem.dir/src/random.cpp.o: In function `richdem::uniform_rand_real(double, double)': random.cpp:(.text+0x1cb): undefined reference to `omp_get_thread_num' CMakeFiles/richdem.dir/src/random.cpp.o: In function `richdem::normal_rand(double, double)': random.cpp:(.text+0x29e): undefined reference to `omp_get_thread_num' CMakeFiles/richdem.dir/src/random.cpp.o: In function `richdem::seed_rand(unsigned long) [clone ._omp_fn.0]': random.cpp:(.text+0x4a3): undefined reference to `GOMP_critical_start' random.cpp:(.text+0x4b1): undefined reference to `GOMP_critical_end' random.cpp:(.text+0x4c3): undefined reference to `omp_get_thread_num' /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: undefined reference to `PMPI_Comm_rank' /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: undefined reference to `PMPI_Get_address' /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: undefined reference to `PMPI_Comm_get_name' /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: undefined reference to `PMPI_Add_error_string' /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: undefined reference to `PMPI_Type_get_name' /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: undefined reference to `PMPI_Abort' /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: undefined reference to `PMPI_Alloc_mem' /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: undefined reference to `PMPI_Isend' /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: undefined reference to `PMPI_Barrier' 
/path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: undefined reference to `PMPI_Allgather' /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: undefined reference to `PMPI_Reduce' /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: undefined reference to `PMPI_Send' /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: undefined reference to `PMPI_Init' /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: undefined reference to `PMPI_Type_size' /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: undefined reference to `PMPI_Accumulate' /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: undefined reference to `PMPI_Add_error_class' /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: undefined reference to `PMPI_Finalize' /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: undefined reference to `PMPI_Allgatherv' /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: undefined reference to `PMPI_Bcast' /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: undefined reference to `PMPI_Recv' /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: undefined reference to `PMPI_Request_free' /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: undefined reference to `PMPI_Allreduce' /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: undefined reference to `ompi_mpi_comm_world' /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: undefined reference to `PMPI_Sendrecv' /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: undefined reference to `PMPI_Add_error_code' /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: undefined reference to `PMPI_Win_get_name' collect2: error: ld returned 1 exit status make[2]: *** [common/richdem/CMakeFiles/richdem.dir/build.make:163: common/richdem/librichdem.so] Error 1 make[2]: Leaving directory '/path/to/WTM/build' make[1]: *** [CMakeFiles/Makefile2:306: common/richdem/CMakeFiles/richdem.dir/all] Error 2 make[1]: Leaving directory '/path/to/WTM/build' make: *** [Makefile:136: all] Error 2 > On Sun, Oct 9, 2022 at 9:28 PM Rob Kudyba wrote: > >> I did have -DMPI_CXX_COMPILER set, so I added -DCMAKE_C_COMPILER and now >> get these errors: >> >> [ 25%] Linking CXX shared library librichdem.so >> /lib/../lib64/crt1.o: In function `_start': >> (.text+0x24): undefined reference to `main' >> CMakeFiles/richdem.dir/src/random.cpp.o: In function >> `richdem::rand_engine()': >> random.cpp:(.text+0x45): undefined reference to `omp_get_thread_num' >> CMakeFiles/richdem.dir/src/random.cpp.o: In function >> `richdem::seed_rand(unsigned long)': >> random.cpp:(.text+0xb6): undefined reference to `GOMP_parallel' >> CMakeFiles/richdem.dir/src/random.cpp.o: In function >> `richdem::uniform_rand_int(int, int)': >> random.cpp:(.text+0x10c): undefined reference to `omp_get_thread_num' >> CMakeFiles/richdem.dir/src/random.cpp.o: In function >> `richdem::uniform_rand_real(double, double)': >> random.cpp:(.text+0x1cb): undefined reference to `omp_get_thread_num' >> CMakeFiles/richdem.dir/src/random.cpp.o: In function >> `richdem::normal_rand(double, double)': >> random.cpp:(.text+0x29e): undefined reference to `omp_get_thread_num' >> CMakeFiles/richdem.dir/src/random.cpp.o: In function >> 
`richdem::seed_rand(unsigned long) [clone ._omp_fn.0]': >> random.cpp:(.text+0x4a3): undefined reference to `GOMP_critical_start' >> random.cpp:(.text+0x4b1): undefined reference to `GOMP_critical_end' >> random.cpp:(.text+0x4c3): undefined reference to `omp_get_thread_num' >> /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: >> undefined reference to `PMPI_Comm_rank' >> /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: >> undefined reference to `PMPI_Get_address' >> /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: >> undefined reference to `PMPI_Comm_get_name' >> /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: >> undefined reference to `PMPI_Add_error_string' >> /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: >> undefined reference to `PMPI_Type_get_name' >> /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: >> undefined reference to `PMPI_Abort' >> /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: >> undefined reference to `PMPI_Alloc_mem' >> /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: >> undefined reference to `PMPI_Isend' >> /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: >> undefined reference to `PMPI_Barrier' >> /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: >> undefined reference to `PMPI_Allgather' >> /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: >> undefined reference to `PMPI_Reduce' >> /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: >> undefined reference to `PMPI_Send' >> /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: >> undefined reference to `PMPI_Init' >> /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: >> undefined reference to `PMPI_Type_size' >> /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: >> undefined reference to `PMPI_Accumulate' >> /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: >> undefined reference to `PMPI_Add_error_class' >> /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: >> undefined reference to `PMPI_Finalize' >> /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: >> undefined reference to `PMPI_Allgatherv' >> /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: >> undefined reference to `PMPI_Bcast' >> /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: >> undefined reference to `PMPI_Recv' >> /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: >> undefined reference to `PMPI_Request_free' >> /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: >> undefined reference to `PMPI_Allreduce' >> /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: >> undefined reference to `ompi_mpi_comm_world' >> /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: >> undefined reference to `PMPI_Sendrecv' >> /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: >> undefined reference to `PMPI_Add_error_code' >> /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: >> undefined reference to `PMPI_Win_get_name' >> collect2: error: ld returned 1 exit status >> make[2]: *** 
[common/richdem/CMakeFiles/richdem.dir/build.make:163: >> common/richdem/librichdem.so] Error 1 >> make[1]: *** [CMakeFiles/Makefile2:306: >> common/richdem/CMakeFiles/richdem.dir/all] Error 2 >> make: *** [Makefile:136: all] Error 2 >> >> I took a guess at using -DOpenMP_libomp_LIBRARY="/path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0" >> as >> as otherwise I'd get: >> CMake Error at >> /path/to/cmake/cmake-3.22.1-linux-x86_64/share/cmake-3.22/Modules/FindPackageHandleStandardArgs.cmake:230 >> (message): >> Could NOT find OpenMP_CXX (missing: OpenMP_libomp_LIBRARY >> OpenMP_libomp_LIBRARY) (found version "4.5") >> >> So perhaps that's the real problem? >> >> On Sun, Oct 9, 2022 at 9:31 PM Junchao Zhang >> wrote: >> >>> In the last link step to generate the executable >>> /cm/local/apps/gcc/10.2.0/bin/c++ -isystem -O3 -g -Wall -Wextra >>> -pedantic -Wshadow CMakeFiles/wtm.x.dir/src/WTM.cpp.o -o wtm.x >>> -Wl,-rpath,/path/to/WTM/build/common/richdem:/path/to/ >>> gdal-3.3.0/lib:/path/to/openmpi-4.1.1_ucx_cuda_11.0.3_ >>> support/lib:/path/to/petsc/arch-linux-cxx-debug/lib libwtm.a >>> common/richdem/librichdem.so /path/to/gdal-3.3.0/lib/libgdal.so >>> /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0 >>> common/fmt/libfmt.a /path/to/petsc/arch-linux-cxx-debug/lib/libpetsc.so >>> >>> I did not find -lmpi to link in the mpi library. You can try to use cmake >>> -DCMAKE_C_COMPILER=/path/to/mpicc -DCMAKE_CXX_COMPILER=/path/to/mpicxx to >>> build your code >>> >>> On Sat, Oct 8, 2022 at 9:32 PM Rob Kudyba wrote: >>> >>>> Perhaps we can back one step: >>>>> Use your mpicc to build a "hello world" mpi test, then run it on a >>>>> compute node (with GPU) to see if it works. >>>>> If no, then your MPI environment has problems; >>>>> If yes, then use it to build petsc (turn on petsc's gpu support, >>>>> --with-cuda --with-cudac=nvcc), and then your code. >>>>> --Junchao Zhang >>>> >>>> OK tried this just to eliminate that the CUDA-capable OpenMPI is a >>>> factor: >>>> ./configure --with-debugging=0 --with-cmake=true --with-mpi=true >>>> --with-mpi-dir=/path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support --with-fc=0 >>>> --with-cuda=1 >>>> [..] >>>> cuda: >>>> Version: 11.7 >>>> Includes: -I/path/to/cuda11.7/toolkit/11.7.1/include >>>> Libraries: -Wl,-rpath,/path/to/cuda11.7/toolkit/11.7.1/lib64 >>>> -L/cm/shared/apps/cuda11.7/toolkit/11.7.1/lib64 >>>> -L/path/to/cuda11.7/toolkit/11.7.1/lib64/stubs -lcudart -lnvToolsExt >>>> -lcufft -lcublas -lcusparse -lcusolver -lcurand -lcuda >>>> CUDA SM 75 >>>> CUDA underlying compiler: >>>> CUDA_CXX="/path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/bin"/mpicxx >>>> CUDA underlying compiler flags: CUDA_CXXFLAGS= >>>> CUDA underlying linker libraries: CUDA_CXXLIBS= >>>> [...] >>>> Configure stage complete. 
Now build PETSc libraries with: >>>> make PETSC_DIR=/path/to/petsc PETSC_ARCH=arch-linux-c-opt all >>>> >>>> C++ compiler version: g++ (GCC) 10.2.0 >>>> Using C++ compiler to compile PETSc >>>> ----------------------------------------- >>>> Using C/C++ linker: >>>> /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/bin/mpicxx >>>> Using C/C++ flags: -Wall -Wwrite-strings -Wno-strict-aliasing >>>> -Wno-unknown-pragmas -Wno-lto-type-mismatch -fstack-protector >>>> -fvisibility=hidden -g -O0 >>>> ----------------------------------------- >>>> Using system modules: >>>> shared:slurm/20.02.6:DefaultModules:openmpi/gcc/64/4.1.1_cuda_11.0.3_aware:gdal/3.3.0:cmake/3.22.1:cuda11.7/toolkit/11.7.1:openblas/dynamic/0.3.7:gcc/10.2.0 >>>> Using mpi.h: # 1 >>>> "/path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/include/mpi.h" 1 >>>> ----------------------------------------- >>>> Using libraries: -Wl,-rpath,/path/to/petsc/arch-linux-cxx-debug/lib >>>> -L/path/to/petsc/arch-linux-cxx-debug/lib -lpetsc -lopenblas -lm -lX11 >>>> -lquadmath -lstdc++ -ldl >>>> ------------------------------------------ >>>> Using mpiexec: mpiexec -mca orte_base_help_aggregate 0 -mca pml ucx >>>> --mca btl '^openib' >>>> ------------------------------------------ >>>> Using MAKE: /path/to/petsc/arch-linux-cxx-debug/bin/make >>>> Using MAKEFLAGS: -j24 -l48.0 --no-print-directory -- MPIEXEC=mpiexec\ >>>> -mca\ orte_base_help_aggregate\ 0\ \ -mca\ pml\ ucx\ --mca\ btl\ '^openib' >>>> PETSC_ARCH=arch-linux-cxx-debug PETSC_DIR=/path/to/petsc >>>> ========================================== >>>> make[3]: Nothing to be done for 'libs'. >>>> ========================================= >>>> Now to check if the libraries are working do: >>>> make PETSC_DIR=/path/to/petsc PETSC_ARCH=arch-linux-cxx-debug check >>>> ========================================= >>>> [me at xxx petsc]$ make PETSC_DIR=/path/to/petsc >>>> PETSC_ARCH=arch-linux-cxx-debug MPIEXEC="mpiexec -mca >>>> orte_base_help_aggregate 0 -mca pml ucx --mca btl '^openib'" check >>>> Running check examples to verify correct installation >>>> Using PETSC_DIR=/path/to/petsc and PETSC_ARCH=arch-linux-cxx-debug >>>> C/C++ example src/snes/tutorials/ex19 run successfully with 1 MPI >>>> process >>>> C/C++ example src/snes/tutorials/ex19 run successfully with 2 MPI >>>> processes >>>> >>>> ./bandwidthTest >>>> [CUDA Bandwidth Test] - Starting... >>>> Running on... >>>> >>>> Device 0: Quadro RTX 8000 >>>> Quick Mode >>>> >>>> Host to Device Bandwidth, 1 Device(s) >>>> PINNED Memory Transfers >>>> Transfer Size (Bytes) Bandwidth(GB/s) >>>> 32000000 12.3 >>>> >>>> Device to Host Bandwidth, 1 Device(s) >>>> PINNED Memory Transfers >>>> Transfer Size (Bytes) Bandwidth(GB/s) >>>> 32000000 13.2 >>>> >>>> Device to Device Bandwidth, 1 Device(s) >>>> PINNED Memory Transfers >>>> Transfer Size (Bytes) Bandwidth(GB/s) >>>> 32000000 466.2 >>>> >>>> Result = PASS >>>> >>>> On Sat, Oct 8, 2022 at 7:56 PM Barry Smith wrote: >>>> >>>>> >>>>> True, but when users send reports back to us they will never have >>>>> used the VERBOSE=1 option, so it requires one more round trip of email to >>>>> get this additional information. >>>>> >>>>> > On Oct 8, 2022, at 6:48 PM, Jed Brown wrote: >>>>> > >>>>> > Barry Smith writes: >>>>> > >>>>> >> I hate these kinds of make rules that hide what the compiler is >>>>> doing (in the name of having less output, I guess) it makes it difficult to >>>>> figure out what is going wrong. >>>>> > >>>>> > You can make VERBOSE=1 with CMake-generated makefiles. 
>>>>> >>>> >>>> >>>>> Anyways, either some of the MPI libraries are missing from the link >>>>> line or they are in the wrong order and thus it is not able to search them >>>>> properly. Here is a bunch of discussions on why that error message can >>>>> appear >>>>> https://stackoverflow.com/questions/19901934/libpthread-so-0-error-adding-symbols-dso-missing-from-command-line >>>>> >>>> >>>> >>>> Still same but more noise and I have been using the suggestion of >>>> LDFLAGS="-Wl,--copy-dt-needed-entries" along with make: >>>> make[2]: Entering directory '/path/to/WTM/build' >>>> cd /path/to/WTM/build && >>>> /path/to/cmake/cmake-3.22.1-linux-x86_64/bin/cmake -E cmake_depends "Unix >>>> Makefiles" /path/to/WTM /path/to/WTM /path/to/WTM/build /path/to/WTM/build >>>> /path/to/WTM/build/CMakeFiles/wtm.x.dir/DependInfo.cmake --color= >>>> make[2]: Leaving directory '/path/to/WTM/build' >>>> make -f CMakeFiles/wtm.x.dir/build.make CMakeFiles/wtm.x.dir/build >>>> make[2]: Entering directory '/path/to/WTM/build' >>>> [ 66%] Building CXX object CMakeFiles/wtm.x.dir/src/WTM.cpp.o >>>> /cm/local/apps/gcc/10.2.0/bin/c++ >>>> -I/path/to/WTM/common/richdem/include -I/path/to/gdal-3.3.0/include >>>> -I/path/to/WTM/common/fmt/include -isystem >>>> /path/to/petsc/arch-linux-cxx-debug/include -isystem /path/to/petsc/include >>>> -isystem -O3 -g -Wall -Wextra -pedantic -Wshadow -Wfloat-conversion -Wall >>>> -Wextra -pedantic -Wshadow -DRICHDEM_GIT_HASH=\"xxx\" >>>> -DRICHDEM_COMPILE_TIME=\"2022-10-09T02:21:11Z\" -DUSEGDAL -Xpreprocessor >>>> -fopenmp >>>> /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libmpi.so.40.30.1 >>>> -I/path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/include -std=gnu++2a -MD >>>> -MT CMakeFiles/wtm.x.dir/src/WTM.cpp.o -MF >>>> CMakeFiles/wtm.x.dir/src/WTM.cpp.o.d -o CMakeFiles/wtm.x.dir/src/WTM.cpp.o >>>> -c /path/to/WTM/src/WTM.cpp >>>> c++: warning: >>>> /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libmpi.so.40.30.1: >>>> linker input file unused because linking not done >>>> [ 70%] Linking CXX executable wtm.x >>>> /path/to/cmake/cmake-3.22.1-linux-x86_64/bin/cmake -E cmake_link_script >>>> CMakeFiles/wtm.x.dir/link.txt --verbose=1 >>>> /cm/local/apps/gcc/10.2.0/bin/c++ -isystem -O3 -g -Wall -Wextra >>>> -pedantic -Wshadow CMakeFiles/wtm.x.dir/src/WTM.cpp.o -o wtm.x >>>> -Wl,-rpath,/path/to/WTM/build/common/richdem:/path/to/gdal-3.3.0/lib:/path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib:/path/to/petsc/arch-linux-cxx-debug/lib >>>> libwtm.a common/richdem/librichdem.so /path/to/gdal-3.3.0/lib/libgdal.so >>>> /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0 >>>> common/fmt/libfmt.a /path/to/petsc/arch-linux-cxx-debug/lib/libpetsc.so >>>> /usr/bin/ld: CMakeFiles/wtm.x.dir/src/WTM.cpp.o: undefined reference to >>>> symbol 'ompi_mpi_comm_self' >>>> /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libmpi.so.40: error >>>> adding symbols: DSO missing from command line >>>> collect2: error: ld returned 1 exit status >>>> make[2]: *** [CMakeFiles/wtm.x.dir/build.make:103: wtm.x] Error 1 >>>> make[2]: Leaving directory '/path/to/WTM/build' >>>> make[1]: *** [CMakeFiles/Makefile2:225: CMakeFiles/wtm.x.dir/all] Error >>>> 2 >>>> make[1]: Leaving directory '/path/to/WTM/build' >>>> make: *** [Makefile:136: all] Error 2 >>>> >>>> Anything stick out? >>>> >>> -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From rk3199 at columbia.edu Mon Oct 10 08:32:08 2022 From: rk3199 at columbia.edu (Rob Kudyba) Date: Mon, 10 Oct 2022 09:32:08 -0400 Subject: [petsc-users] suppress CUDA warning & choose MCA parameter for mpirun during make PETSC_ARCH=arch-linux-c-debug check In-Reply-To: References: <39e71ae6-e943-c558-44af-0992089d6151@mcs.anl.gov> <16EE4635-0A03-45AA-92AD-1926907F4B8E@petsc.dev> <878rlqx779.fsf@jedbrown.org> <599F31B3-BE61-4928-871F-D773289D5497@petsc.dev> Message-ID: OK so I missed the OpenMP vs OpenMPI with incorrectly setting -DOpenMP_libomp_LIBRARY="/path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib//libompitrace.so.40.30.0 So I changed it to point to /cm/local/apps/gcc/10.2.0/lib/libgomp.so.1.0.0 -- Found PETSc 3.18.0 CMake Error at /path/to/cmake/cmake-3.22.1-linux-x86_64/share/cmake-3.22/Modules/FindPackageHandleStandardArgs.cmake:230 (message): Could NOT find OpenMP_CXX (missing: OpenMP_libomp_LIBRARY OpenMP_libomp_LIBRARY) (found version "4.5") Call Stack (most recent call first): /path/to/cmake/cmake-3.22.1-linux-x86_64/share/cmake-3.22/Modules/FindPackageHandleStandardArgs.cmake:594 (_FPHSA_FAILURE_MESSAGE) /path/to/cmake/cmake-3.22.1-linux-x86_64/share/cmake-3.22/Modules/FindOpenMP.cmake:544 (find_package_handle_standard_args) common/richdem/CMakeLists.txt:12 (find_package) Perhaps I need to reach out to the richdem maintainer? On Mon, Oct 10, 2022 at 9:12 AM Rob Kudyba wrote: > OK, let's walk back and don't use -DCMAKE_C_COMPILER=/path/to/mpicc >> > Will do > > >> libompitrace.so.40.30.0 is not the OpenMP library; it is the tracing >> library for OpenMPI, https://github.com/open-mpi/ompi/issues/10036 >> > Does that mean I should remove this option in the cmake command? > > >> In your previous email, there was >> >> /path/to/cmake/cmake-3.22.1-linux-x86_64/bin/cmake -E cmake_link_script >> CMakeFiles/wtm.x.dir/link.txt --verbose=1 >> /cm/local/apps/gcc/10.2.0/bin/c++ -isystem -O3 -g -Wall -Wextra >> -pedantic -Wshadow CMakeFiles/wtm.x.dir/src/WTM.cpp.o -o wtm.x >> -Wl,-rpath,/path/to/WTM/build/common/richdem:/path/to/ >> gdal-3.3.0/lib:/path/to/openmpi-4.1.1_ucx_cuda_11.0.3_ >> support/lib:/path/to/petsc/arch-linux-cxx-debug/lib libwtm.a >> common/richdem/librichdem.so /path/to/gdal-3.3.0/lib/libgdal.so >> /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0 >> common/fmt/libfmt.a /path/to/petsc/arch-linux-cxx-debug/lib/libpetsc.so >> /usr/bin/ld: CMakeFiles/wtm.x.dir/src/WTM.cpp.o: undefined reference to >> symbol 'ompi_mpi_comm_self' >> /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libmpi.so.40: error >> adding symbols: DSO missing from command line >> >> >> Let's try to add -lmpi (or /path/to/openmpi-4.1.1_ucx_ >> cuda_11.0.3_support/lib/libmpi.so) manually to see if it links >> >> /cm/local/apps/gcc/10.2.0/bin/c++ -isystem -O3 -g -Wall -Wextra >> -pedantic -Wshadow CMakeFiles/wtm.x.dir/src/WTM.cpp.o -o wtm.x >> -Wl,-rpath,/path/to/WTM/build/common/richdem:/path/to/ >> gdal-3.3.0/lib:/path/to/openmpi-4.1.1_ucx_cuda_11.0.3_ >> support/lib:/path/to/petsc/arch-linux-cxx-debug/lib libwtm.a >> common/richdem/librichdem.so /path/to/gdal-3.3.0/lib/libgdal.so >> /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0 >> common/fmt/libfmt.a /path/to/petsc/arch-linux-cxx-debug/lib/libpetsc.so >> -lmpi >> > > so just adding that to the make command? 
Sttil seeing linking errors: > > make VERBOSE=1 LDFLAGS="-Wl,--copy-dt-needed-entries" -lmpi > /path/to/cmake/cmake-3.22.1-linux-x86_64/bin/cmake -S/path/to/WTM > -B/path/to/WTM/build --check-build-system CMakeFiles/Makefile.cmake 0 > /path/to/cmake/cmake-3.22.1-linux-x86_64/bin/cmake -E cmake_progress_start > /path/to/WTM/build/CMakeFiles /path/to/WTM/build//CMakeFiles/progress.marks > make -f CMakeFiles/Makefile2 all > make[1]: Entering directory '/path/to/WTM/build' > make -f common/fmt/CMakeFiles/fmt.dir/build.make > common/fmt/CMakeFiles/fmt.dir/depend > make[2]: Entering directory '/path/to/WTM/build' > cd /path/to/WTM/build && > /path/to/cmake/cmake-3.22.1-linux-x86_64/bin/cmake -E cmake_depends "Unix > Makefiles" /path/to/WTM /path/to/WTM/common/fmt /path/to/WTM/build > /path/to/WTM/build/common/fmt > /path/to/WTM/build/common/fmt/CMakeFiles/fmt.dir/DependInfo.cmake --color= > make[2]: Leaving directory '/path/to/WTM/build' > make -f common/fmt/CMakeFiles/fmt.dir/build.make > common/fmt/CMakeFiles/fmt.dir/build > make[2]: Entering directory '/path/to/WTM/build' > [ 4%] Building CXX object common/fmt/CMakeFiles/fmt.dir/src/format.cc.o > cd /path/to/WTM/build/common/fmt && /cm/local/apps/gcc/10.2.0/bin/c++ > -I/path/to/WTM/common/fmt/include -isystem -std=gnu++11 -MD -MT > common/fmt/CMakeFiles/fmt.dir/src/format.cc.o -MF > CMakeFiles/fmt.dir/src/format.cc.o.d -o CMakeFiles/fmt.dir/src/format.cc.o > -c /path/to/WTM/common/fmt/src/format.cc > [ 8%] Building CXX object common/fmt/CMakeFiles/fmt.dir/src/os.cc.o > cd /path/to/WTM/build/common/fmt && /cm/local/apps/gcc/10.2.0/bin/c++ > -I/path/to/WTM/common/fmt/include -isystem -std=gnu++11 -MD -MT > common/fmt/CMakeFiles/fmt.dir/src/os.cc.o -MF > CMakeFiles/fmt.dir/src/os.cc.o.d -o CMakeFiles/fmt.dir/src/os.cc.o -c > /path/to/WTM/common/fmt/src/os.cc > [ 12%] Linking CXX static library libfmt.a > cd /path/to/WTM/build/common/fmt && > /path/to/cmake/cmake-3.22.1-linux-x86_64/bin/cmake -P > CMakeFiles/fmt.dir/cmake_clean_target.cmake > cd /path/to/WTM/build/common/fmt && > /path/to/cmake/cmake-3.22.1-linux-x86_64/bin/cmake -E cmake_link_script > CMakeFiles/fmt.dir/link.txt --verbose=1 > /usr/bin/ar qc libfmt.a CMakeFiles/fmt.dir/src/format.cc.o > CMakeFiles/fmt.dir/src/os.cc.o > /usr/bin/ranlib libfmt.a > make[2]: Leaving directory '/path/to/WTM/build' > [ 12%] Built target fmt > make -f common/richdem/CMakeFiles/richdem.dir/build.make > common/richdem/CMakeFiles/richdem.dir/depend > make[2]: Entering directory '/path/to/WTM/build' > cd /path/to/WTM/build && > /path/to/cmake/cmake-3.22.1-linux-x86_64/bin/cmake -E cmake_depends "Unix > Makefiles" /path/to/WTM /path/to/WTM/common/richdem /path/to/WTM/build > /path/to/WTM/build/common/richdem > /path/to/WTM/build/common/richdem/CMakeFiles/richdem.dir/DependInfo.cmake > --color= > make[2]: Leaving directory '/path/to/WTM/build' > make -f common/richdem/CMakeFiles/richdem.dir/build.make > common/richdem/CMakeFiles/richdem.dir/build > make[2]: Entering directory '/path/to/WTM/build' > [ 16%] Building CXX object > common/richdem/CMakeFiles/richdem.dir/src/richdem.cpp.o > cd /path/to/WTM/build/common/richdem && /cm/local/apps/gcc/10.2.0/bin/c++ > -Drichdem_EXPORTS -I/path/to/WTM/common/richdem/include > -I/path/to/gdal-3.3.0/include -isystem -fPIC > -DRICHDEM_GIT_HASH=\"3313b290725509d694da1fba83d0f32cca68cc70\" > -DRICHDEM_COMPILE_TIME=\"2022-10-10T13:09:39Z\" -DUSEGDAL -Xpreprocessor > -fopenmp > /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libmpi.so.40.30.1 > 
-I/path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/include -std=gnu++17 -MD > -MT common/richdem/CMakeFiles/richdem.dir/src/richdem.cpp.o -MF > CMakeFiles/richdem.dir/src/richdem.cpp.o.d -o > CMakeFiles/richdem.dir/src/richdem.cpp.o -c > /path/to/WTM/common/richdem/src/richdem.cpp > c++: warning: > /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libmpi.so.40.30.1: > linker input file unused because linking not done > [ 20%] Building CXX object > common/richdem/CMakeFiles/richdem.dir/src/random.cpp.o > cd /path/to/WTM/build/common/richdem && /cm/local/apps/gcc/10.2.0/bin/c++ > -Drichdem_EXPORTS -I/path/to/WTM/common/richdem/include > -I/path/to/gdal-3.3.0/include -isystem -fPIC > -DRICHDEM_GIT_HASH=\"3313b290725509d694da1fba83d0f32cca68cc70\" > -DRICHDEM_COMPILE_TIME=\"2022-10-10T13:09:39Z\" -DUSEGDAL -Xpreprocessor > -fopenmp > /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libmpi.so.40.30.1 > -I/path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/include -std=gnu++17 -MD > -MT common/richdem/CMakeFiles/richdem.dir/src/random.cpp.o -MF > CMakeFiles/richdem.dir/src/random.cpp.o.d -o > CMakeFiles/richdem.dir/src/random.cpp.o -c > /path/to/WTM/common/richdem/src/random.cpp > c++: warning: > /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libmpi.so.40.30.1: > linker input file unused because linking not done > [ 25%] Building CXX object > common/richdem/CMakeFiles/richdem.dir/src/gdal.cpp.o > cd /path/to/WTM/build/common/richdem && /cm/local/apps/gcc/10.2.0/bin/c++ > -Drichdem_EXPORTS -I/path/to/WTM/common/richdem/include > -I/path/to/gdal-3.3.0/include -isystem -fPIC > -DRICHDEM_GIT_HASH=\"3313b290725509d694da1fba83d0f32cca68cc70\" > -DRICHDEM_COMPILE_TIME=\"2022-10-10T13:09:39Z\" -DUSEGDAL -Xpreprocessor > -fopenmp > /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libmpi.so.40.30.1 > -I/path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/include -std=gnu++17 -MD > -MT common/richdem/CMakeFiles/richdem.dir/src/gdal.cpp.o -MF > CMakeFiles/richdem.dir/src/gdal.cpp.o.d -o > CMakeFiles/richdem.dir/src/gdal.cpp.o -c > /path/to/WTM/common/richdem/src/gdal.cpp > c++: warning: > /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libmpi.so.40.30.1: > linker input file unused because linking not done > [ 29%] Building CXX object > common/richdem/CMakeFiles/richdem.dir/src/terrain_generation/terrain_generation.cpp.o > cd /path/to/WTM/build/common/richdem && /cm/local/apps/gcc/10.2.0/bin/c++ > -Drichdem_EXPORTS -I/path/to/WTM/common/richdem/include > -I/path/to/gdal-3.3.0/include -isystem -fPIC > -DRICHDEM_GIT_HASH=\"3313b290725509d694da1fba83d0f32cca68cc70\" > -DRICHDEM_COMPILE_TIME=\"2022-10-10T13:09:39Z\" -DUSEGDAL -Xpreprocessor > -fopenmp > /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libmpi.so.40.30.1 > -I/path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/include -std=gnu++17 -MD > -MT > common/richdem/CMakeFiles/richdem.dir/src/terrain_generation/terrain_generation.cpp.o > -MF > CMakeFiles/richdem.dir/src/terrain_generation/terrain_generation.cpp.o.d -o > CMakeFiles/richdem.dir/src/terrain_generation/terrain_generation.cpp.o -c > /path/to/WTM/common/richdem/src/terrain_generation/terrain_generation.cpp > c++: warning: > /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libmpi.so.40.30.1: > linker input file unused because linking not done > [ 33%] Building CXX object > common/richdem/CMakeFiles/richdem.dir/src/terrain_generation/PerlinNoise.cpp.o > cd /path/to/WTM/build/common/richdem && /cm/local/apps/gcc/10.2.0/bin/c++ > -Drichdem_EXPORTS -I/path/to/WTM/common/richdem/include > 
-I/path/to/gdal-3.3.0/include -isystem -fPIC > -DRICHDEM_GIT_HASH=\"3313b290725509d694da1fba83d0f32cca68cc70\" > -DRICHDEM_COMPILE_TIME=\"2022-10-10T13:09:39Z\" -DUSEGDAL -Xpreprocessor > -fopenmp > /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libmpi.so.40.30.1 > -I/path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/include -std=gnu++17 -MD > -MT > common/richdem/CMakeFiles/richdem.dir/src/terrain_generation/PerlinNoise.cpp.o > -MF CMakeFiles/richdem.dir/src/terrain_generation/PerlinNoise.cpp.o.d -o > CMakeFiles/richdem.dir/src/terrain_generation/PerlinNoise.cpp.o -c > /path/to/WTM/common/richdem/src/terrain_generation/PerlinNoise.cpp > c++: warning: > /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libmpi.so.40.30.1: > linker input file unused because linking not done > [ 37%] Linking CXX shared library librichdem.so > cd /path/to/WTM/build/common/richdem && > /path/to/cmake/cmake-3.22.1-linux-x86_64/bin/cmake -E cmake_link_script > CMakeFiles/richdem.dir/link.txt --verbose=1 > /cm/local/apps/gcc/10.2.0/bin/c++ -fPIC -isystem -shared > -Wl,-soname,librichdem.so -o librichdem.so > CMakeFiles/richdem.dir/src/richdem.cpp.o > CMakeFiles/richdem.dir/src/random.cpp.o > CMakeFiles/richdem.dir/src/gdal.cpp.o > CMakeFiles/richdem.dir/src/terrain_generation/terrain_generation.cpp.o > CMakeFiles/richdem.dir/src/terrain_generation/PerlinNoise.cpp.o > -Wl,-rpath,/path/to/gdal-3.3.0/lib:/path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib: > /path/to/gdal-3.3.0/lib/libgdal.so > /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0 > /lib/../lib64/crt1.o: In function `_start': > (.text+0x24): undefined reference to `main' > CMakeFiles/richdem.dir/src/random.cpp.o: In function > `richdem::rand_engine()': > random.cpp:(.text+0x45): undefined reference to `omp_get_thread_num' > CMakeFiles/richdem.dir/src/random.cpp.o: In function > `richdem::seed_rand(unsigned long)': > random.cpp:(.text+0xb6): undefined reference to `GOMP_parallel' > CMakeFiles/richdem.dir/src/random.cpp.o: In function > `richdem::uniform_rand_int(int, int)': > random.cpp:(.text+0x10c): undefined reference to `omp_get_thread_num' > CMakeFiles/richdem.dir/src/random.cpp.o: In function > `richdem::uniform_rand_real(double, double)': > random.cpp:(.text+0x1cb): undefined reference to `omp_get_thread_num' > CMakeFiles/richdem.dir/src/random.cpp.o: In function > `richdem::normal_rand(double, double)': > random.cpp:(.text+0x29e): undefined reference to `omp_get_thread_num' > CMakeFiles/richdem.dir/src/random.cpp.o: In function > `richdem::seed_rand(unsigned long) [clone ._omp_fn.0]': > random.cpp:(.text+0x4a3): undefined reference to `GOMP_critical_start' > random.cpp:(.text+0x4b1): undefined reference to `GOMP_critical_end' > random.cpp:(.text+0x4c3): undefined reference to `omp_get_thread_num' > /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: > undefined reference to `PMPI_Comm_rank' > /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: > undefined reference to `PMPI_Get_address' > /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: > undefined reference to `PMPI_Comm_get_name' > /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: > undefined reference to `PMPI_Add_error_string' > /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: > undefined reference to `PMPI_Type_get_name' > /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: > undefined reference to `PMPI_Abort' > 
/path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: > undefined reference to `PMPI_Alloc_mem' > /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: > undefined reference to `PMPI_Isend' > /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: > undefined reference to `PMPI_Barrier' > /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: > undefined reference to `PMPI_Allgather' > /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: > undefined reference to `PMPI_Reduce' > /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: > undefined reference to `PMPI_Send' > /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: > undefined reference to `PMPI_Init' > /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: > undefined reference to `PMPI_Type_size' > /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: > undefined reference to `PMPI_Accumulate' > /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: > undefined reference to `PMPI_Add_error_class' > /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: > undefined reference to `PMPI_Finalize' > /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: > undefined reference to `PMPI_Allgatherv' > /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: > undefined reference to `PMPI_Bcast' > /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: > undefined reference to `PMPI_Recv' > /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: > undefined reference to `PMPI_Request_free' > /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: > undefined reference to `PMPI_Allreduce' > /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: > undefined reference to `ompi_mpi_comm_world' > /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: > undefined reference to `PMPI_Sendrecv' > /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: > undefined reference to `PMPI_Add_error_code' > /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: > undefined reference to `PMPI_Win_get_name' > collect2: error: ld returned 1 exit status > make[2]: *** [common/richdem/CMakeFiles/richdem.dir/build.make:163: > common/richdem/librichdem.so] Error 1 > make[2]: Leaving directory '/path/to/WTM/build' > make[1]: *** [CMakeFiles/Makefile2:306: > common/richdem/CMakeFiles/richdem.dir/all] Error 2 > make[1]: Leaving directory '/path/to/WTM/build' > make: *** [Makefile:136: all] Error 2 > > > >> On Sun, Oct 9, 2022 at 9:28 PM Rob Kudyba wrote: >> >>> I did have -DMPI_CXX_COMPILER set, so I added -DCMAKE_C_COMPILER and >>> now get these errors: >>> >>> [ 25%] Linking CXX shared library librichdem.so >>> /lib/../lib64/crt1.o: In function `_start': >>> (.text+0x24): undefined reference to `main' >>> CMakeFiles/richdem.dir/src/random.cpp.o: In function >>> `richdem::rand_engine()': >>> random.cpp:(.text+0x45): undefined reference to `omp_get_thread_num' >>> CMakeFiles/richdem.dir/src/random.cpp.o: In function >>> `richdem::seed_rand(unsigned long)': >>> random.cpp:(.text+0xb6): undefined reference to `GOMP_parallel' >>> CMakeFiles/richdem.dir/src/random.cpp.o: In function >>> `richdem::uniform_rand_int(int, int)': >>> random.cpp:(.text+0x10c): 
undefined reference to `omp_get_thread_num' >>> CMakeFiles/richdem.dir/src/random.cpp.o: In function >>> `richdem::uniform_rand_real(double, double)': >>> random.cpp:(.text+0x1cb): undefined reference to `omp_get_thread_num' >>> CMakeFiles/richdem.dir/src/random.cpp.o: In function >>> `richdem::normal_rand(double, double)': >>> random.cpp:(.text+0x29e): undefined reference to `omp_get_thread_num' >>> CMakeFiles/richdem.dir/src/random.cpp.o: In function >>> `richdem::seed_rand(unsigned long) [clone ._omp_fn.0]': >>> random.cpp:(.text+0x4a3): undefined reference to `GOMP_critical_start' >>> random.cpp:(.text+0x4b1): undefined reference to `GOMP_critical_end' >>> random.cpp:(.text+0x4c3): undefined reference to `omp_get_thread_num' >>> /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: >>> undefined reference to `PMPI_Comm_rank' >>> /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: >>> undefined reference to `PMPI_Get_address' >>> /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: >>> undefined reference to `PMPI_Comm_get_name' >>> /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: >>> undefined reference to `PMPI_Add_error_string' >>> /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: >>> undefined reference to `PMPI_Type_get_name' >>> /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: >>> undefined reference to `PMPI_Abort' >>> /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: >>> undefined reference to `PMPI_Alloc_mem' >>> /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: >>> undefined reference to `PMPI_Isend' >>> /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: >>> undefined reference to `PMPI_Barrier' >>> /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: >>> undefined reference to `PMPI_Allgather' >>> /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: >>> undefined reference to `PMPI_Reduce' >>> /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: >>> undefined reference to `PMPI_Send' >>> /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: >>> undefined reference to `PMPI_Init' >>> /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: >>> undefined reference to `PMPI_Type_size' >>> /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: >>> undefined reference to `PMPI_Accumulate' >>> /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: >>> undefined reference to `PMPI_Add_error_class' >>> /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: >>> undefined reference to `PMPI_Finalize' >>> /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: >>> undefined reference to `PMPI_Allgatherv' >>> /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: >>> undefined reference to `PMPI_Bcast' >>> /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: >>> undefined reference to `PMPI_Recv' >>> /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: >>> undefined reference to `PMPI_Request_free' >>> /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: >>> undefined reference to `PMPI_Allreduce' >>> /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: >>> 
undefined reference to `ompi_mpi_comm_world' >>> /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: >>> undefined reference to `PMPI_Sendrecv' >>> /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: >>> undefined reference to `PMPI_Add_error_code' >>> /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: >>> undefined reference to `PMPI_Win_get_name' >>> collect2: error: ld returned 1 exit status >>> make[2]: *** [common/richdem/CMakeFiles/richdem.dir/build.make:163: >>> common/richdem/librichdem.so] Error 1 >>> make[1]: *** [CMakeFiles/Makefile2:306: >>> common/richdem/CMakeFiles/richdem.dir/all] Error 2 >>> make: *** [Makefile:136: all] Error 2 >>> >>> I took a guess at using -DOpenMP_libomp_LIBRARY="/path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0" >>> as >>> as otherwise I'd get: >>> CMake Error at >>> /path/to/cmake/cmake-3.22.1-linux-x86_64/share/cmake-3.22/Modules/FindPackageHandleStandardArgs.cmake:230 >>> (message): >>> Could NOT find OpenMP_CXX (missing: OpenMP_libomp_LIBRARY >>> OpenMP_libomp_LIBRARY) (found version "4.5") >>> >>> So perhaps that's the real problem? >>> >>> On Sun, Oct 9, 2022 at 9:31 PM Junchao Zhang >>> wrote: >>> >>>> In the last link step to generate the executable >>>> /cm/local/apps/gcc/10.2.0/bin/c++ -isystem -O3 -g -Wall -Wextra >>>> -pedantic -Wshadow CMakeFiles/wtm.x.dir/src/WTM.cpp.o -o wtm.x >>>> -Wl,-rpath,/path/to/WTM/build/common/richdem:/path/to/ >>>> gdal-3.3.0/lib:/path/to/openmpi-4.1.1_ucx_cuda_11.0.3_ >>>> support/lib:/path/to/petsc/arch-linux-cxx-debug/lib libwtm.a >>>> common/richdem/librichdem.so /path/to/gdal-3.3.0/lib/libgdal.so >>>> /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0 >>>> common/fmt/libfmt.a /path/to/petsc/arch-linux-cxx-debug/lib/libpetsc.so >>>> >>>> I did not find -lmpi to link in the mpi library. You can try to use cmake >>>> -DCMAKE_C_COMPILER=/path/to/mpicc -DCMAKE_CXX_COMPILER=/path/to/mpicxx to >>>> build your code >>>> >>>> On Sat, Oct 8, 2022 at 9:32 PM Rob Kudyba wrote: >>>> >>>>> Perhaps we can back one step: >>>>>> Use your mpicc to build a "hello world" mpi test, then run it on a >>>>>> compute node (with GPU) to see if it works. >>>>>> If no, then your MPI environment has problems; >>>>>> If yes, then use it to build petsc (turn on petsc's gpu support, >>>>>> --with-cuda --with-cudac=nvcc), and then your code. >>>>>> --Junchao Zhang >>>>> >>>>> OK tried this just to eliminate that the CUDA-capable OpenMPI is a >>>>> factor: >>>>> ./configure --with-debugging=0 --with-cmake=true --with-mpi=true >>>>> --with-mpi-dir=/path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support --with-fc=0 >>>>> --with-cuda=1 >>>>> [..] >>>>> cuda: >>>>> Version: 11.7 >>>>> Includes: -I/path/to/cuda11.7/toolkit/11.7.1/include >>>>> Libraries: -Wl,-rpath,/path/to/cuda11.7/toolkit/11.7.1/lib64 >>>>> -L/cm/shared/apps/cuda11.7/toolkit/11.7.1/lib64 >>>>> -L/path/to/cuda11.7/toolkit/11.7.1/lib64/stubs -lcudart -lnvToolsExt >>>>> -lcufft -lcublas -lcusparse -lcusolver -lcurand -lcuda >>>>> CUDA SM 75 >>>>> CUDA underlying compiler: >>>>> CUDA_CXX="/path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/bin"/mpicxx >>>>> CUDA underlying compiler flags: CUDA_CXXFLAGS= >>>>> CUDA underlying linker libraries: CUDA_CXXLIBS= >>>>> [...] >>>>> Configure stage complete. 
Now build PETSc libraries with: >>>>> make PETSC_DIR=/path/to/petsc PETSC_ARCH=arch-linux-c-opt all >>>>> >>>>> C++ compiler version: g++ (GCC) 10.2.0 >>>>> Using C++ compiler to compile PETSc >>>>> ----------------------------------------- >>>>> Using C/C++ linker: >>>>> /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/bin/mpicxx >>>>> Using C/C++ flags: -Wall -Wwrite-strings -Wno-strict-aliasing >>>>> -Wno-unknown-pragmas -Wno-lto-type-mismatch -fstack-protector >>>>> -fvisibility=hidden -g -O0 >>>>> ----------------------------------------- >>>>> Using system modules: >>>>> shared:slurm/20.02.6:DefaultModules:openmpi/gcc/64/4.1.1_cuda_11.0.3_aware:gdal/3.3.0:cmake/3.22.1:cuda11.7/toolkit/11.7.1:openblas/dynamic/0.3.7:gcc/10.2.0 >>>>> Using mpi.h: # 1 >>>>> "/path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/include/mpi.h" 1 >>>>> ----------------------------------------- >>>>> Using libraries: -Wl,-rpath,/path/to/petsc/arch-linux-cxx-debug/lib >>>>> -L/path/to/petsc/arch-linux-cxx-debug/lib -lpetsc -lopenblas -lm -lX11 >>>>> -lquadmath -lstdc++ -ldl >>>>> ------------------------------------------ >>>>> Using mpiexec: mpiexec -mca orte_base_help_aggregate 0 -mca pml ucx >>>>> --mca btl '^openib' >>>>> ------------------------------------------ >>>>> Using MAKE: /path/to/petsc/arch-linux-cxx-debug/bin/make >>>>> Using MAKEFLAGS: -j24 -l48.0 --no-print-directory -- MPIEXEC=mpiexec\ >>>>> -mca\ orte_base_help_aggregate\ 0\ \ -mca\ pml\ ucx\ --mca\ btl\ '^openib' >>>>> PETSC_ARCH=arch-linux-cxx-debug PETSC_DIR=/path/to/petsc >>>>> ========================================== >>>>> make[3]: Nothing to be done for 'libs'. >>>>> ========================================= >>>>> Now to check if the libraries are working do: >>>>> make PETSC_DIR=/path/to/petsc PETSC_ARCH=arch-linux-cxx-debug check >>>>> ========================================= >>>>> [me at xxx petsc]$ make PETSC_DIR=/path/to/petsc >>>>> PETSC_ARCH=arch-linux-cxx-debug MPIEXEC="mpiexec -mca >>>>> orte_base_help_aggregate 0 -mca pml ucx --mca btl '^openib'" check >>>>> Running check examples to verify correct installation >>>>> Using PETSC_DIR=/path/to/petsc and PETSC_ARCH=arch-linux-cxx-debug >>>>> C/C++ example src/snes/tutorials/ex19 run successfully with 1 MPI >>>>> process >>>>> C/C++ example src/snes/tutorials/ex19 run successfully with 2 MPI >>>>> processes >>>>> >>>>> ./bandwidthTest >>>>> [CUDA Bandwidth Test] - Starting... >>>>> Running on... >>>>> >>>>> Device 0: Quadro RTX 8000 >>>>> Quick Mode >>>>> >>>>> Host to Device Bandwidth, 1 Device(s) >>>>> PINNED Memory Transfers >>>>> Transfer Size (Bytes) Bandwidth(GB/s) >>>>> 32000000 12.3 >>>>> >>>>> Device to Host Bandwidth, 1 Device(s) >>>>> PINNED Memory Transfers >>>>> Transfer Size (Bytes) Bandwidth(GB/s) >>>>> 32000000 13.2 >>>>> >>>>> Device to Device Bandwidth, 1 Device(s) >>>>> PINNED Memory Transfers >>>>> Transfer Size (Bytes) Bandwidth(GB/s) >>>>> 32000000 466.2 >>>>> >>>>> Result = PASS >>>>> >>>>> On Sat, Oct 8, 2022 at 7:56 PM Barry Smith wrote: >>>>> >>>>>> >>>>>> True, but when users send reports back to us they will never have >>>>>> used the VERBOSE=1 option, so it requires one more round trip of email to >>>>>> get this additional information. >>>>>> >>>>>> > On Oct 8, 2022, at 6:48 PM, Jed Brown wrote: >>>>>> > >>>>>> > Barry Smith writes: >>>>>> > >>>>>> >> I hate these kinds of make rules that hide what the compiler is >>>>>> doing (in the name of having less output, I guess) it makes it difficult to >>>>>> figure out what is going wrong. 
>>>>>> > >>>>>> > You can make VERBOSE=1 with CMake-generated makefiles. >>>>>> >>>>> >>>>> >>>>>> Anyways, either some of the MPI libraries are missing from the link >>>>>> line or they are in the wrong order and thus it is not able to search them >>>>>> properly. Here is a bunch of discussions on why that error message can >>>>>> appear >>>>>> https://stackoverflow.com/questions/19901934/libpthread-so-0-error-adding-symbols-dso-missing-from-command-line >>>>>> >>>>> >>>>> >>>>> Still same but more noise and I have been using the suggestion of >>>>> LDFLAGS="-Wl,--copy-dt-needed-entries" along with make: >>>>> make[2]: Entering directory '/path/to/WTM/build' >>>>> cd /path/to/WTM/build && >>>>> /path/to/cmake/cmake-3.22.1-linux-x86_64/bin/cmake -E cmake_depends "Unix >>>>> Makefiles" /path/to/WTM /path/to/WTM /path/to/WTM/build /path/to/WTM/build >>>>> /path/to/WTM/build/CMakeFiles/wtm.x.dir/DependInfo.cmake --color= >>>>> make[2]: Leaving directory '/path/to/WTM/build' >>>>> make -f CMakeFiles/wtm.x.dir/build.make CMakeFiles/wtm.x.dir/build >>>>> make[2]: Entering directory '/path/to/WTM/build' >>>>> [ 66%] Building CXX object CMakeFiles/wtm.x.dir/src/WTM.cpp.o >>>>> /cm/local/apps/gcc/10.2.0/bin/c++ >>>>> -I/path/to/WTM/common/richdem/include -I/path/to/gdal-3.3.0/include >>>>> -I/path/to/WTM/common/fmt/include -isystem >>>>> /path/to/petsc/arch-linux-cxx-debug/include -isystem /path/to/petsc/include >>>>> -isystem -O3 -g -Wall -Wextra -pedantic -Wshadow -Wfloat-conversion -Wall >>>>> -Wextra -pedantic -Wshadow -DRICHDEM_GIT_HASH=\"xxx\" >>>>> -DRICHDEM_COMPILE_TIME=\"2022-10-09T02:21:11Z\" -DUSEGDAL -Xpreprocessor >>>>> -fopenmp >>>>> /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libmpi.so.40.30.1 >>>>> -I/path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/include -std=gnu++2a -MD >>>>> -MT CMakeFiles/wtm.x.dir/src/WTM.cpp.o -MF >>>>> CMakeFiles/wtm.x.dir/src/WTM.cpp.o.d -o CMakeFiles/wtm.x.dir/src/WTM.cpp.o >>>>> -c /path/to/WTM/src/WTM.cpp >>>>> c++: warning: >>>>> /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libmpi.so.40.30.1: >>>>> linker input file unused because linking not done >>>>> [ 70%] Linking CXX executable wtm.x >>>>> /path/to/cmake/cmake-3.22.1-linux-x86_64/bin/cmake -E >>>>> cmake_link_script CMakeFiles/wtm.x.dir/link.txt --verbose=1 >>>>> /cm/local/apps/gcc/10.2.0/bin/c++ -isystem -O3 -g -Wall -Wextra >>>>> -pedantic -Wshadow CMakeFiles/wtm.x.dir/src/WTM.cpp.o -o wtm.x >>>>> -Wl,-rpath,/path/to/WTM/build/common/richdem:/path/to/gdal-3.3.0/lib:/path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib:/path/to/petsc/arch-linux-cxx-debug/lib >>>>> libwtm.a common/richdem/librichdem.so /path/to/gdal-3.3.0/lib/libgdal.so >>>>> /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0 >>>>> common/fmt/libfmt.a /path/to/petsc/arch-linux-cxx-debug/lib/libpetsc.so >>>>> /usr/bin/ld: CMakeFiles/wtm.x.dir/src/WTM.cpp.o: undefined reference >>>>> to symbol 'ompi_mpi_comm_self' >>>>> /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libmpi.so.40: error >>>>> adding symbols: DSO missing from command line >>>>> collect2: error: ld returned 1 exit status >>>>> make[2]: *** [CMakeFiles/wtm.x.dir/build.make:103: wtm.x] Error 1 >>>>> make[2]: Leaving directory '/path/to/WTM/build' >>>>> make[1]: *** [CMakeFiles/Makefile2:225: CMakeFiles/wtm.x.dir/all] >>>>> Error 2 >>>>> make[1]: Leaving directory '/path/to/WTM/build' >>>>> make: *** [Makefile:136: all] Error 2 >>>>> >>>>> Anything stick out? 
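A note on the OpenMP detection failure reported near the top of this message: the wording "missing: OpenMP_libomp_LIBRARY" means FindOpenMP is looking for a library named "libomp", which is LLVM's OpenMP runtime; GCC 10 provides OpenMP through -fopenmp and libgomp instead. Rather than overriding OpenMP_libomp_LIBRARY, the documented FindOpenMP cache variables can be pointed at the GCC runtime, roughly as in the sketch below (illustrative only -- the libgomp path reuses the one mentioned above, the project's other -D options stay as before, and any stale CMakeCache.txt from earlier attempts should be removed before reconfiguring):

  cmake -S /path/to/WTM -B /path/to/WTM/build \
    -DOpenMP_CXX_FLAGS="-fopenmp" \
    -DOpenMP_CXX_LIB_NAMES="gomp" \
    -DOpenMP_gomp_LIBRARY=/cm/local/apps/gcc/10.2.0/lib/libgomp.so.1.0.0

With a working g++ toolchain, find_package(OpenMP) normally succeeds without any overrides, so a persistent failure here usually points at a stale cache entry or at the broken include flags visible in the compile lines further down.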
>>>>> >>>> -------------- next part -------------- An HTML attachment was scrubbed... URL: From junchao.zhang at gmail.com Mon Oct 10 08:52:42 2022 From: junchao.zhang at gmail.com (Junchao Zhang) Date: Mon, 10 Oct 2022 08:52:42 -0500 Subject: [petsc-users] suppress CUDA warning & choose MCA parameter for mpirun during make PETSC_ARCH=arch-linux-c-debug check In-Reply-To: References: <39e71ae6-e943-c558-44af-0992089d6151@mcs.anl.gov> <16EE4635-0A03-45AA-92AD-1926907F4B8E@petsc.dev> <878rlqx779.fsf@jedbrown.org> <599F31B3-BE61-4928-871F-D773289D5497@petsc.dev> Message-ID: On Mon, Oct 10, 2022 at 8:13 AM Rob Kudyba wrote: > OK, let's walk back and don't use -DCMAKE_C_COMPILER=/path/to/mpicc >> > Will do > > >> libompitrace.so.40.30.0 is not the OpenMP library; it is the tracing >> library for OpenMPI, https://github.com/open-mpi/ompi/issues/10036 >> > Does that mean I should remove this option in the cmake command? > > >> In your previous email, there was >> >> /path/to/cmake/cmake-3.22.1-linux-x86_64/bin/cmake -E cmake_link_script >> CMakeFiles/wtm.x.dir/link.txt --verbose=1 >> /cm/local/apps/gcc/10.2.0/bin/c++ -isystem -O3 -g -Wall -Wextra >> -pedantic -Wshadow CMakeFiles/wtm.x.dir/src/WTM.cpp.o -o wtm.x >> -Wl,-rpath,/path/to/WTM/build/common/richdem:/path/to/ >> gdal-3.3.0/lib:/path/to/openmpi-4.1.1_ucx_cuda_11.0.3_ >> support/lib:/path/to/petsc/arch-linux-cxx-debug/lib libwtm.a >> common/richdem/librichdem.so /path/to/gdal-3.3.0/lib/libgdal.so >> /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0 >> common/fmt/libfmt.a /path/to/petsc/arch-linux-cxx-debug/lib/libpetsc.so >> /usr/bin/ld: CMakeFiles/wtm.x.dir/src/WTM.cpp.o: undefined reference to >> symbol 'ompi_mpi_comm_self' >> /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libmpi.so.40: error >> adding symbols: DSO missing from command line >> >> >> Let's try to add -lmpi (or /path/to/openmpi-4.1.1_ucx_ >> cuda_11.0.3_support/lib/libmpi.so) manually to see if it links >> >> /cm/local/apps/gcc/10.2.0/bin/c++ -isystem -O3 -g -Wall -Wextra >> -pedantic -Wshadow CMakeFiles/wtm.x.dir/src/WTM.cpp.o -o wtm.x >> -Wl,-rpath,/path/to/WTM/build/common/richdem:/path/to/ >> gdal-3.3.0/lib:/path/to/openmpi-4.1.1_ucx_cuda_11.0.3_ >> support/lib:/path/to/petsc/arch-linux-cxx-debug/lib libwtm.a >> common/richdem/librichdem.so /path/to/gdal-3.3.0/lib/libgdal.so >> /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0 >> common/fmt/libfmt.a /path/to/petsc/arch-linux-cxx-debug/lib/libpetsc.so >> -lmpi >> > > so just adding that to the make command? 
Sttil seeing linking errors: > > make VERBOSE=1 LDFLAGS="-Wl,--copy-dt-needed-entries" -lmpi > make VERBOSE=1 LDFLAGS="-Wl,--copy-dt-needed-entries -lmpi" /path/to/cmake/cmake-3.22.1-linux-x86_64/bin/cmake -S/path/to/WTM > -B/path/to/WTM/build --check-build-system CMakeFiles/Makefile.cmake 0 > /path/to/cmake/cmake-3.22.1-linux-x86_64/bin/cmake -E cmake_progress_start > /path/to/WTM/build/CMakeFiles /path/to/WTM/build//CMakeFiles/progress.marks > make -f CMakeFiles/Makefile2 all > make[1]: Entering directory '/path/to/WTM/build' > make -f common/fmt/CMakeFiles/fmt.dir/build.make > common/fmt/CMakeFiles/fmt.dir/depend > make[2]: Entering directory '/path/to/WTM/build' > cd /path/to/WTM/build && > /path/to/cmake/cmake-3.22.1-linux-x86_64/bin/cmake -E cmake_depends "Unix > Makefiles" /path/to/WTM /path/to/WTM/common/fmt /path/to/WTM/build > /path/to/WTM/build/common/fmt > /path/to/WTM/build/common/fmt/CMakeFiles/fmt.dir/DependInfo.cmake --color= > make[2]: Leaving directory '/path/to/WTM/build' > make -f common/fmt/CMakeFiles/fmt.dir/build.make > common/fmt/CMakeFiles/fmt.dir/build > make[2]: Entering directory '/path/to/WTM/build' > [ 4%] Building CXX object common/fmt/CMakeFiles/fmt.dir/src/format.cc.o > cd /path/to/WTM/build/common/fmt && /cm/local/apps/gcc/10.2.0/bin/c++ > -I/path/to/WTM/common/fmt/include -isystem -std=gnu++11 -MD -MT > common/fmt/CMakeFiles/fmt.dir/src/format.cc.o -MF > CMakeFiles/fmt.dir/src/format.cc.o.d -o CMakeFiles/fmt.dir/src/format.cc.o > -c /path/to/WTM/common/fmt/src/format.cc > [ 8%] Building CXX object common/fmt/CMakeFiles/fmt.dir/src/os.cc.o > cd /path/to/WTM/build/common/fmt && /cm/local/apps/gcc/10.2.0/bin/c++ > -I/path/to/WTM/common/fmt/include -isystem -std=gnu++11 -MD -MT > common/fmt/CMakeFiles/fmt.dir/src/os.cc.o -MF > CMakeFiles/fmt.dir/src/os.cc.o.d -o CMakeFiles/fmt.dir/src/os.cc.o -c > /path/to/WTM/common/fmt/src/os.cc > [ 12%] Linking CXX static library libfmt.a > cd /path/to/WTM/build/common/fmt && > /path/to/cmake/cmake-3.22.1-linux-x86_64/bin/cmake -P > CMakeFiles/fmt.dir/cmake_clean_target.cmake > cd /path/to/WTM/build/common/fmt && > /path/to/cmake/cmake-3.22.1-linux-x86_64/bin/cmake -E cmake_link_script > CMakeFiles/fmt.dir/link.txt --verbose=1 > /usr/bin/ar qc libfmt.a CMakeFiles/fmt.dir/src/format.cc.o > CMakeFiles/fmt.dir/src/os.cc.o > /usr/bin/ranlib libfmt.a > make[2]: Leaving directory '/path/to/WTM/build' > [ 12%] Built target fmt > make -f common/richdem/CMakeFiles/richdem.dir/build.make > common/richdem/CMakeFiles/richdem.dir/depend > make[2]: Entering directory '/path/to/WTM/build' > cd /path/to/WTM/build && > /path/to/cmake/cmake-3.22.1-linux-x86_64/bin/cmake -E cmake_depends "Unix > Makefiles" /path/to/WTM /path/to/WTM/common/richdem /path/to/WTM/build > /path/to/WTM/build/common/richdem > /path/to/WTM/build/common/richdem/CMakeFiles/richdem.dir/DependInfo.cmake > --color= > make[2]: Leaving directory '/path/to/WTM/build' > make -f common/richdem/CMakeFiles/richdem.dir/build.make > common/richdem/CMakeFiles/richdem.dir/build > make[2]: Entering directory '/path/to/WTM/build' > [ 16%] Building CXX object > common/richdem/CMakeFiles/richdem.dir/src/richdem.cpp.o > cd /path/to/WTM/build/common/richdem && /cm/local/apps/gcc/10.2.0/bin/c++ > -Drichdem_EXPORTS -I/path/to/WTM/common/richdem/include > -I/path/to/gdal-3.3.0/include -isystem -fPIC > -DRICHDEM_GIT_HASH=\"3313b290725509d694da1fba83d0f32cca68cc70\" > -DRICHDEM_COMPILE_TIME=\"2022-10-10T13:09:39Z\" -DUSEGDAL -Xpreprocessor > -fopenmp > 
/path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libmpi.so.40.30.1 > -I/path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/include -std=gnu++17 -MD > -MT common/richdem/CMakeFiles/richdem.dir/src/richdem.cpp.o -MF > CMakeFiles/richdem.dir/src/richdem.cpp.o.d -o > CMakeFiles/richdem.dir/src/richdem.cpp.o -c > /path/to/WTM/common/richdem/src/richdem.cpp > c++: warning: > /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libmpi.so.40.30.1: > linker input file unused because linking not done > [ 20%] Building CXX object > common/richdem/CMakeFiles/richdem.dir/src/random.cpp.o > cd /path/to/WTM/build/common/richdem && /cm/local/apps/gcc/10.2.0/bin/c++ > -Drichdem_EXPORTS -I/path/to/WTM/common/richdem/include > -I/path/to/gdal-3.3.0/include -isystem -fPIC > -DRICHDEM_GIT_HASH=\"3313b290725509d694da1fba83d0f32cca68cc70\" > -DRICHDEM_COMPILE_TIME=\"2022-10-10T13:09:39Z\" -DUSEGDAL -Xpreprocessor > -fopenmp > /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libmpi.so.40.30.1 > -I/path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/include -std=gnu++17 -MD > -MT common/richdem/CMakeFiles/richdem.dir/src/random.cpp.o -MF > CMakeFiles/richdem.dir/src/random.cpp.o.d -o > CMakeFiles/richdem.dir/src/random.cpp.o -c > /path/to/WTM/common/richdem/src/random.cpp > c++: warning: > /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libmpi.so.40.30.1: > linker input file unused because linking not done > [ 25%] Building CXX object > common/richdem/CMakeFiles/richdem.dir/src/gdal.cpp.o > cd /path/to/WTM/build/common/richdem && /cm/local/apps/gcc/10.2.0/bin/c++ > -Drichdem_EXPORTS -I/path/to/WTM/common/richdem/include > -I/path/to/gdal-3.3.0/include -isystem -fPIC > -DRICHDEM_GIT_HASH=\"3313b290725509d694da1fba83d0f32cca68cc70\" > -DRICHDEM_COMPILE_TIME=\"2022-10-10T13:09:39Z\" -DUSEGDAL -Xpreprocessor > -fopenmp > /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libmpi.so.40.30.1 > -I/path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/include -std=gnu++17 -MD > -MT common/richdem/CMakeFiles/richdem.dir/src/gdal.cpp.o -MF > CMakeFiles/richdem.dir/src/gdal.cpp.o.d -o > CMakeFiles/richdem.dir/src/gdal.cpp.o -c > /path/to/WTM/common/richdem/src/gdal.cpp > c++: warning: > /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libmpi.so.40.30.1: > linker input file unused because linking not done > [ 29%] Building CXX object > common/richdem/CMakeFiles/richdem.dir/src/terrain_generation/terrain_generation.cpp.o > cd /path/to/WTM/build/common/richdem && /cm/local/apps/gcc/10.2.0/bin/c++ > -Drichdem_EXPORTS -I/path/to/WTM/common/richdem/include > -I/path/to/gdal-3.3.0/include -isystem -fPIC > -DRICHDEM_GIT_HASH=\"3313b290725509d694da1fba83d0f32cca68cc70\" > -DRICHDEM_COMPILE_TIME=\"2022-10-10T13:09:39Z\" -DUSEGDAL -Xpreprocessor > -fopenmp > /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libmpi.so.40.30.1 > -I/path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/include -std=gnu++17 -MD > -MT > common/richdem/CMakeFiles/richdem.dir/src/terrain_generation/terrain_generation.cpp.o > -MF > CMakeFiles/richdem.dir/src/terrain_generation/terrain_generation.cpp.o.d -o > CMakeFiles/richdem.dir/src/terrain_generation/terrain_generation.cpp.o -c > /path/to/WTM/common/richdem/src/terrain_generation/terrain_generation.cpp > c++: warning: > /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libmpi.so.40.30.1: > linker input file unused because linking not done > [ 33%] Building CXX object > common/richdem/CMakeFiles/richdem.dir/src/terrain_generation/PerlinNoise.cpp.o > cd /path/to/WTM/build/common/richdem && /cm/local/apps/gcc/10.2.0/bin/c++ > 
-Drichdem_EXPORTS -I/path/to/WTM/common/richdem/include > -I/path/to/gdal-3.3.0/include -isystem -fPIC > -DRICHDEM_GIT_HASH=\"3313b290725509d694da1fba83d0f32cca68cc70\" > -DRICHDEM_COMPILE_TIME=\"2022-10-10T13:09:39Z\" -DUSEGDAL -Xpreprocessor > -fopenmp > /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libmpi.so.40.30.1 > -I/path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/include -std=gnu++17 -MD > -MT > common/richdem/CMakeFiles/richdem.dir/src/terrain_generation/PerlinNoise.cpp.o > -MF CMakeFiles/richdem.dir/src/terrain_generation/PerlinNoise.cpp.o.d -o > CMakeFiles/richdem.dir/src/terrain_generation/PerlinNoise.cpp.o -c > /path/to/WTM/common/richdem/src/terrain_generation/PerlinNoise.cpp > c++: warning: > /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libmpi.so.40.30.1: > linker input file unused because linking not done > [ 37%] Linking CXX shared library librichdem.so > cd /path/to/WTM/build/common/richdem && > /path/to/cmake/cmake-3.22.1-linux-x86_64/bin/cmake -E cmake_link_script > CMakeFiles/richdem.dir/link.txt --verbose=1 > /cm/local/apps/gcc/10.2.0/bin/c++ -fPIC -isystem -shared > -Wl,-soname,librichdem.so -o librichdem.so > CMakeFiles/richdem.dir/src/richdem.cpp.o > CMakeFiles/richdem.dir/src/random.cpp.o > CMakeFiles/richdem.dir/src/gdal.cpp.o > CMakeFiles/richdem.dir/src/terrain_generation/terrain_generation.cpp.o > CMakeFiles/richdem.dir/src/terrain_generation/PerlinNoise.cpp.o > -Wl,-rpath,/path/to/gdal-3.3.0/lib:/path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib: > /path/to/gdal-3.3.0/lib/libgdal.so > /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0 > /lib/../lib64/crt1.o: In function `_start': > (.text+0x24): undefined reference to `main' > CMakeFiles/richdem.dir/src/random.cpp.o: In function > `richdem::rand_engine()': > random.cpp:(.text+0x45): undefined reference to `omp_get_thread_num' > CMakeFiles/richdem.dir/src/random.cpp.o: In function > `richdem::seed_rand(unsigned long)': > random.cpp:(.text+0xb6): undefined reference to `GOMP_parallel' > CMakeFiles/richdem.dir/src/random.cpp.o: In function > `richdem::uniform_rand_int(int, int)': > random.cpp:(.text+0x10c): undefined reference to `omp_get_thread_num' > CMakeFiles/richdem.dir/src/random.cpp.o: In function > `richdem::uniform_rand_real(double, double)': > random.cpp:(.text+0x1cb): undefined reference to `omp_get_thread_num' > CMakeFiles/richdem.dir/src/random.cpp.o: In function > `richdem::normal_rand(double, double)': > random.cpp:(.text+0x29e): undefined reference to `omp_get_thread_num' > CMakeFiles/richdem.dir/src/random.cpp.o: In function > `richdem::seed_rand(unsigned long) [clone ._omp_fn.0]': > random.cpp:(.text+0x4a3): undefined reference to `GOMP_critical_start' > random.cpp:(.text+0x4b1): undefined reference to `GOMP_critical_end' > random.cpp:(.text+0x4c3): undefined reference to `omp_get_thread_num' > /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: > undefined reference to `PMPI_Comm_rank' > /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: > undefined reference to `PMPI_Get_address' > /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: > undefined reference to `PMPI_Comm_get_name' > /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: > undefined reference to `PMPI_Add_error_string' > /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: > undefined reference to `PMPI_Type_get_name' > 
/path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: > undefined reference to `PMPI_Abort' > /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: > undefined reference to `PMPI_Alloc_mem' > /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: > undefined reference to `PMPI_Isend' > /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: > undefined reference to `PMPI_Barrier' > /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: > undefined reference to `PMPI_Allgather' > /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: > undefined reference to `PMPI_Reduce' > /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: > undefined reference to `PMPI_Send' > /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: > undefined reference to `PMPI_Init' > /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: > undefined reference to `PMPI_Type_size' > /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: > undefined reference to `PMPI_Accumulate' > /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: > undefined reference to `PMPI_Add_error_class' > /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: > undefined reference to `PMPI_Finalize' > /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: > undefined reference to `PMPI_Allgatherv' > /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: > undefined reference to `PMPI_Bcast' > /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: > undefined reference to `PMPI_Recv' > /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: > undefined reference to `PMPI_Request_free' > /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: > undefined reference to `PMPI_Allreduce' > /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: > undefined reference to `ompi_mpi_comm_world' > /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: > undefined reference to `PMPI_Sendrecv' > /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: > undefined reference to `PMPI_Add_error_code' > /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: > undefined reference to `PMPI_Win_get_name' > collect2: error: ld returned 1 exit status > make[2]: *** [common/richdem/CMakeFiles/richdem.dir/build.make:163: > common/richdem/librichdem.so] Error 1 > make[2]: Leaving directory '/path/to/WTM/build' > make[1]: *** [CMakeFiles/Makefile2:306: > common/richdem/CMakeFiles/richdem.dir/all] Error 2 > make[1]: Leaving directory '/path/to/WTM/build' > make: *** [Makefile:136: all] Error 2 > > > >> On Sun, Oct 9, 2022 at 9:28 PM Rob Kudyba wrote: >> >>> I did have -DMPI_CXX_COMPILER set, so I added -DCMAKE_C_COMPILER and >>> now get these errors: >>> >>> [ 25%] Linking CXX shared library librichdem.so >>> /lib/../lib64/crt1.o: In function `_start': >>> (.text+0x24): undefined reference to `main' >>> CMakeFiles/richdem.dir/src/random.cpp.o: In function >>> `richdem::rand_engine()': >>> random.cpp:(.text+0x45): undefined reference to `omp_get_thread_num' >>> CMakeFiles/richdem.dir/src/random.cpp.o: In function >>> `richdem::seed_rand(unsigned long)': >>> random.cpp:(.text+0xb6): undefined reference to `GOMP_parallel' >>> 
CMakeFiles/richdem.dir/src/random.cpp.o: In function >>> `richdem::uniform_rand_int(int, int)': >>> random.cpp:(.text+0x10c): undefined reference to `omp_get_thread_num' >>> CMakeFiles/richdem.dir/src/random.cpp.o: In function >>> `richdem::uniform_rand_real(double, double)': >>> random.cpp:(.text+0x1cb): undefined reference to `omp_get_thread_num' >>> CMakeFiles/richdem.dir/src/random.cpp.o: In function >>> `richdem::normal_rand(double, double)': >>> random.cpp:(.text+0x29e): undefined reference to `omp_get_thread_num' >>> CMakeFiles/richdem.dir/src/random.cpp.o: In function >>> `richdem::seed_rand(unsigned long) [clone ._omp_fn.0]': >>> random.cpp:(.text+0x4a3): undefined reference to `GOMP_critical_start' >>> random.cpp:(.text+0x4b1): undefined reference to `GOMP_critical_end' >>> random.cpp:(.text+0x4c3): undefined reference to `omp_get_thread_num' >>> /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: >>> undefined reference to `PMPI_Comm_rank' >>> /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: >>> undefined reference to `PMPI_Get_address' >>> /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: >>> undefined reference to `PMPI_Comm_get_name' >>> /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: >>> undefined reference to `PMPI_Add_error_string' >>> /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: >>> undefined reference to `PMPI_Type_get_name' >>> /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: >>> undefined reference to `PMPI_Abort' >>> /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: >>> undefined reference to `PMPI_Alloc_mem' >>> /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: >>> undefined reference to `PMPI_Isend' >>> /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: >>> undefined reference to `PMPI_Barrier' >>> /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: >>> undefined reference to `PMPI_Allgather' >>> /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: >>> undefined reference to `PMPI_Reduce' >>> /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: >>> undefined reference to `PMPI_Send' >>> /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: >>> undefined reference to `PMPI_Init' >>> /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: >>> undefined reference to `PMPI_Type_size' >>> /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: >>> undefined reference to `PMPI_Accumulate' >>> /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: >>> undefined reference to `PMPI_Add_error_class' >>> /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: >>> undefined reference to `PMPI_Finalize' >>> /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: >>> undefined reference to `PMPI_Allgatherv' >>> /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: >>> undefined reference to `PMPI_Bcast' >>> /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: >>> undefined reference to `PMPI_Recv' >>> /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: >>> undefined reference to `PMPI_Request_free' >>> /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: >>> 
undefined reference to `PMPI_Allreduce' >>> /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: >>> undefined reference to `ompi_mpi_comm_world' >>> /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: >>> undefined reference to `PMPI_Sendrecv' >>> /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: >>> undefined reference to `PMPI_Add_error_code' >>> /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0: >>> undefined reference to `PMPI_Win_get_name' >>> collect2: error: ld returned 1 exit status >>> make[2]: *** [common/richdem/CMakeFiles/richdem.dir/build.make:163: >>> common/richdem/librichdem.so] Error 1 >>> make[1]: *** [CMakeFiles/Makefile2:306: >>> common/richdem/CMakeFiles/richdem.dir/all] Error 2 >>> make: *** [Makefile:136: all] Error 2 >>> >>> I took a guess at using -DOpenMP_libomp_LIBRARY="/path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0" >>> as >>> as otherwise I'd get: >>> CMake Error at >>> /path/to/cmake/cmake-3.22.1-linux-x86_64/share/cmake-3.22/Modules/FindPackageHandleStandardArgs.cmake:230 >>> (message): >>> Could NOT find OpenMP_CXX (missing: OpenMP_libomp_LIBRARY >>> OpenMP_libomp_LIBRARY) (found version "4.5") >>> >>> So perhaps that's the real problem? >>> >>> On Sun, Oct 9, 2022 at 9:31 PM Junchao Zhang >>> wrote: >>> >>>> In the last link step to generate the executable >>>> /cm/local/apps/gcc/10.2.0/bin/c++ -isystem -O3 -g -Wall -Wextra >>>> -pedantic -Wshadow CMakeFiles/wtm.x.dir/src/WTM.cpp.o -o wtm.x >>>> -Wl,-rpath,/path/to/WTM/build/common/richdem:/path/to/ >>>> gdal-3.3.0/lib:/path/to/openmpi-4.1.1_ucx_cuda_11.0.3_ >>>> support/lib:/path/to/petsc/arch-linux-cxx-debug/lib libwtm.a >>>> common/richdem/librichdem.so /path/to/gdal-3.3.0/lib/libgdal.so >>>> /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0 >>>> common/fmt/libfmt.a /path/to/petsc/arch-linux-cxx-debug/lib/libpetsc.so >>>> >>>> I did not find -lmpi to link in the mpi library. You can try to use cmake >>>> -DCMAKE_C_COMPILER=/path/to/mpicc -DCMAKE_CXX_COMPILER=/path/to/mpicxx to >>>> build your code >>>> >>>> On Sat, Oct 8, 2022 at 9:32 PM Rob Kudyba wrote: >>>> >>>>> Perhaps we can back one step: >>>>>> Use your mpicc to build a "hello world" mpi test, then run it on a >>>>>> compute node (with GPU) to see if it works. >>>>>> If no, then your MPI environment has problems; >>>>>> If yes, then use it to build petsc (turn on petsc's gpu support, >>>>>> --with-cuda --with-cudac=nvcc), and then your code. >>>>>> --Junchao Zhang >>>>> >>>>> OK tried this just to eliminate that the CUDA-capable OpenMPI is a >>>>> factor: >>>>> ./configure --with-debugging=0 --with-cmake=true --with-mpi=true >>>>> --with-mpi-dir=/path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support --with-fc=0 >>>>> --with-cuda=1 >>>>> [..] >>>>> cuda: >>>>> Version: 11.7 >>>>> Includes: -I/path/to/cuda11.7/toolkit/11.7.1/include >>>>> Libraries: -Wl,-rpath,/path/to/cuda11.7/toolkit/11.7.1/lib64 >>>>> -L/cm/shared/apps/cuda11.7/toolkit/11.7.1/lib64 >>>>> -L/path/to/cuda11.7/toolkit/11.7.1/lib64/stubs -lcudart -lnvToolsExt >>>>> -lcufft -lcublas -lcusparse -lcusolver -lcurand -lcuda >>>>> CUDA SM 75 >>>>> CUDA underlying compiler: >>>>> CUDA_CXX="/path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/bin"/mpicxx >>>>> CUDA underlying compiler flags: CUDA_CXXFLAGS= >>>>> CUDA underlying linker libraries: CUDA_CXXLIBS= >>>>> [...] >>>>> Configure stage complete. 
Now build PETSc libraries with: >>>>> make PETSC_DIR=/path/to/petsc PETSC_ARCH=arch-linux-c-opt all >>>>> >>>>> C++ compiler version: g++ (GCC) 10.2.0 >>>>> Using C++ compiler to compile PETSc >>>>> ----------------------------------------- >>>>> Using C/C++ linker: >>>>> /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/bin/mpicxx >>>>> Using C/C++ flags: -Wall -Wwrite-strings -Wno-strict-aliasing >>>>> -Wno-unknown-pragmas -Wno-lto-type-mismatch -fstack-protector >>>>> -fvisibility=hidden -g -O0 >>>>> ----------------------------------------- >>>>> Using system modules: >>>>> shared:slurm/20.02.6:DefaultModules:openmpi/gcc/64/4.1.1_cuda_11.0.3_aware:gdal/3.3.0:cmake/3.22.1:cuda11.7/toolkit/11.7.1:openblas/dynamic/0.3.7:gcc/10.2.0 >>>>> Using mpi.h: # 1 >>>>> "/path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/include/mpi.h" 1 >>>>> ----------------------------------------- >>>>> Using libraries: -Wl,-rpath,/path/to/petsc/arch-linux-cxx-debug/lib >>>>> -L/path/to/petsc/arch-linux-cxx-debug/lib -lpetsc -lopenblas -lm -lX11 >>>>> -lquadmath -lstdc++ -ldl >>>>> ------------------------------------------ >>>>> Using mpiexec: mpiexec -mca orte_base_help_aggregate 0 -mca pml ucx >>>>> --mca btl '^openib' >>>>> ------------------------------------------ >>>>> Using MAKE: /path/to/petsc/arch-linux-cxx-debug/bin/make >>>>> Using MAKEFLAGS: -j24 -l48.0 --no-print-directory -- MPIEXEC=mpiexec\ >>>>> -mca\ orte_base_help_aggregate\ 0\ \ -mca\ pml\ ucx\ --mca\ btl\ '^openib' >>>>> PETSC_ARCH=arch-linux-cxx-debug PETSC_DIR=/path/to/petsc >>>>> ========================================== >>>>> make[3]: Nothing to be done for 'libs'. >>>>> ========================================= >>>>> Now to check if the libraries are working do: >>>>> make PETSC_DIR=/path/to/petsc PETSC_ARCH=arch-linux-cxx-debug check >>>>> ========================================= >>>>> [me at xxx petsc]$ make PETSC_DIR=/path/to/petsc >>>>> PETSC_ARCH=arch-linux-cxx-debug MPIEXEC="mpiexec -mca >>>>> orte_base_help_aggregate 0 -mca pml ucx --mca btl '^openib'" check >>>>> Running check examples to verify correct installation >>>>> Using PETSC_DIR=/path/to/petsc and PETSC_ARCH=arch-linux-cxx-debug >>>>> C/C++ example src/snes/tutorials/ex19 run successfully with 1 MPI >>>>> process >>>>> C/C++ example src/snes/tutorials/ex19 run successfully with 2 MPI >>>>> processes >>>>> >>>>> ./bandwidthTest >>>>> [CUDA Bandwidth Test] - Starting... >>>>> Running on... >>>>> >>>>> Device 0: Quadro RTX 8000 >>>>> Quick Mode >>>>> >>>>> Host to Device Bandwidth, 1 Device(s) >>>>> PINNED Memory Transfers >>>>> Transfer Size (Bytes) Bandwidth(GB/s) >>>>> 32000000 12.3 >>>>> >>>>> Device to Host Bandwidth, 1 Device(s) >>>>> PINNED Memory Transfers >>>>> Transfer Size (Bytes) Bandwidth(GB/s) >>>>> 32000000 13.2 >>>>> >>>>> Device to Device Bandwidth, 1 Device(s) >>>>> PINNED Memory Transfers >>>>> Transfer Size (Bytes) Bandwidth(GB/s) >>>>> 32000000 466.2 >>>>> >>>>> Result = PASS >>>>> >>>>> On Sat, Oct 8, 2022 at 7:56 PM Barry Smith wrote: >>>>> >>>>>> >>>>>> True, but when users send reports back to us they will never have >>>>>> used the VERBOSE=1 option, so it requires one more round trip of email to >>>>>> get this additional information. >>>>>> >>>>>> > On Oct 8, 2022, at 6:48 PM, Jed Brown wrote: >>>>>> > >>>>>> > Barry Smith writes: >>>>>> > >>>>>> >> I hate these kinds of make rules that hide what the compiler is >>>>>> doing (in the name of having less output, I guess) it makes it difficult to >>>>>> figure out what is going wrong. 
>>>>>> > >>>>>> > You can make VERBOSE=1 with CMake-generated makefiles. >>>>>> >>>>> >>>>> >>>>>> Anyways, either some of the MPI libraries are missing from the link >>>>>> line or they are in the wrong order and thus it is not able to search them >>>>>> properly. Here is a bunch of discussions on why that error message can >>>>>> appear >>>>>> https://stackoverflow.com/questions/19901934/libpthread-so-0-error-adding-symbols-dso-missing-from-command-line >>>>>> >>>>> >>>>> >>>>> Still same but more noise and I have been using the suggestion of >>>>> LDFLAGS="-Wl,--copy-dt-needed-entries" along with make: >>>>> make[2]: Entering directory '/path/to/WTM/build' >>>>> cd /path/to/WTM/build && >>>>> /path/to/cmake/cmake-3.22.1-linux-x86_64/bin/cmake -E cmake_depends "Unix >>>>> Makefiles" /path/to/WTM /path/to/WTM /path/to/WTM/build /path/to/WTM/build >>>>> /path/to/WTM/build/CMakeFiles/wtm.x.dir/DependInfo.cmake --color= >>>>> make[2]: Leaving directory '/path/to/WTM/build' >>>>> make -f CMakeFiles/wtm.x.dir/build.make CMakeFiles/wtm.x.dir/build >>>>> make[2]: Entering directory '/path/to/WTM/build' >>>>> [ 66%] Building CXX object CMakeFiles/wtm.x.dir/src/WTM.cpp.o >>>>> /cm/local/apps/gcc/10.2.0/bin/c++ >>>>> -I/path/to/WTM/common/richdem/include -I/path/to/gdal-3.3.0/include >>>>> -I/path/to/WTM/common/fmt/include -isystem >>>>> /path/to/petsc/arch-linux-cxx-debug/include -isystem /path/to/petsc/include >>>>> -isystem -O3 -g -Wall -Wextra -pedantic -Wshadow -Wfloat-conversion -Wall >>>>> -Wextra -pedantic -Wshadow -DRICHDEM_GIT_HASH=\"xxx\" >>>>> -DRICHDEM_COMPILE_TIME=\"2022-10-09T02:21:11Z\" -DUSEGDAL -Xpreprocessor >>>>> -fopenmp >>>>> /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libmpi.so.40.30.1 >>>>> -I/path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/include -std=gnu++2a -MD >>>>> -MT CMakeFiles/wtm.x.dir/src/WTM.cpp.o -MF >>>>> CMakeFiles/wtm.x.dir/src/WTM.cpp.o.d -o CMakeFiles/wtm.x.dir/src/WTM.cpp.o >>>>> -c /path/to/WTM/src/WTM.cpp >>>>> c++: warning: >>>>> /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libmpi.so.40.30.1: >>>>> linker input file unused because linking not done >>>>> [ 70%] Linking CXX executable wtm.x >>>>> /path/to/cmake/cmake-3.22.1-linux-x86_64/bin/cmake -E >>>>> cmake_link_script CMakeFiles/wtm.x.dir/link.txt --verbose=1 >>>>> /cm/local/apps/gcc/10.2.0/bin/c++ -isystem -O3 -g -Wall -Wextra >>>>> -pedantic -Wshadow CMakeFiles/wtm.x.dir/src/WTM.cpp.o -o wtm.x >>>>> -Wl,-rpath,/path/to/WTM/build/common/richdem:/path/to/gdal-3.3.0/lib:/path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib:/path/to/petsc/arch-linux-cxx-debug/lib >>>>> libwtm.a common/richdem/librichdem.so /path/to/gdal-3.3.0/lib/libgdal.so >>>>> /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libompitrace.so.40.30.0 >>>>> common/fmt/libfmt.a /path/to/petsc/arch-linux-cxx-debug/lib/libpetsc.so >>>>> /usr/bin/ld: CMakeFiles/wtm.x.dir/src/WTM.cpp.o: undefined reference >>>>> to symbol 'ompi_mpi_comm_self' >>>>> /path/to/openmpi-4.1.1_ucx_cuda_11.0.3_support/lib/libmpi.so.40: error >>>>> adding symbols: DSO missing from command line >>>>> collect2: error: ld returned 1 exit status >>>>> make[2]: *** [CMakeFiles/wtm.x.dir/build.make:103: wtm.x] Error 1 >>>>> make[2]: Leaving directory '/path/to/WTM/build' >>>>> make[1]: *** [CMakeFiles/Makefile2:225: CMakeFiles/wtm.x.dir/all] >>>>> Error 2 >>>>> make[1]: Leaving directory '/path/to/WTM/build' >>>>> make: *** [Makefile:136: all] Error 2 >>>>> >>>>> Anything stick out? 
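Two observations on the link lines quoted above, plus a sketch of a CMake-level fix. First, every compile and link command carries a bare -isystem with no directory behind it ("-isystem -O3", "-isystem -shared", ...); GCC then swallows the next flag as the -isystem argument, which is why the librichdem.so link no longer sees -shared and fails with "undefined reference to `main'" from crt1.o. That empty path most likely comes from an include-directory variable left empty by the failed OpenMP/MPI detection. Second, the undefined GOMP_*/omp_get_thread_num and PMPI_*/ompi_mpi_comm_* symbols mean the OpenMP runtime and the real MPI library are not on the link line; libompitrace only wraps MPI calls and provides neither. Instead of hard-coding libmpi.so.40.30.1 and libompitrace, letting CMake's own modules supply the flags is usually more robust. A rough sketch, assuming the WTM/richdem CMakeLists.txt files can be edited (the target names richdem and wtm.x are taken from the build output):

  find_package(MPI REQUIRED COMPONENTS CXX)
  find_package(OpenMP REQUIRED COMPONENTS CXX)
  target_link_libraries(richdem PUBLIC OpenMP::OpenMP_CXX MPI::MPI_CXX)
  target_link_libraries(wtm.x PRIVATE MPI::MPI_CXX OpenMP::OpenMP_CXX)

With the imported targets in place, the -Xpreprocessor -fopenmp compile flag and the LDFLAGS="-Wl,--copy-dt-needed-entries -lmpi" workaround should no longer be necessary.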
>>>>> >>>> -------------- next part -------------- An HTML attachment was scrubbed... URL: From snailsoar at hotmail.com Mon Oct 10 10:42:48 2022 From: snailsoar at hotmail.com (feng wang) Date: Mon, 10 Oct 2022 15:42:48 +0000 Subject: [petsc-users] Slepc, shell matrix, parallel, halo exchange In-Reply-To: References: <53363D7B-CCBD-4DAB-924E-1D5D56975828@dsic.upv.es> <76162134-CDE9-42B9-8310-D9DD33D2F12D@dsic.upv.es> Message-ID: Hi Mat, Thanks for your reply. It seems I have to use "VecSetValues" to assign the values of my ghost vector "petsc_dcsv". and then call VecAssemblyBegin/End. If I do it this way, the ghost cells are exchanged correctly. Besides, I notice that, when I run my code sequentially or with multiple processors, the produced eigenvalues are similar, but the number of iterations are different to reach the specified "-eps_tol" and the relative residuals are also slightly different. Is this normal? I am using the default Krylov-Schur solver and double precision. Thanks, Feng ________________________________ From: Matthew Knepley Sent: 09 October 2022 12:11 To: feng wang Cc: Jose E. Roman ; petsc-users at mcs.anl.gov Subject: Re: [petsc-users] Slepc, shell matrix, parallel, halo exchange On Fri, Oct 7, 2022 at 5:48 PM feng wang > wrote: Hi Mat, I've tried the suggested approach. The halo cells are not exchanged somehow. Below is how I do it, have I missed anything? I create a ghost vector petsc_dcsv and it is a data member of the class cFdDomain, which is a context of the shell matrix. PetscCall(VecCreateGhostBlock(*A_COMM_WORLD, blocksize, blocksize*nlocal, PETSC_DECIDE ,nghost, ighost, &petsc_dcsv)); blocksize and nv have the same value. nlocal is number of local cells and nghost is number of halo cells. ighost contains the ghost cell index. Below is how I compute a matrix-vector product with a shell matrix PetscErrorCode cFdDomain::mymult_slepc(Mat m ,Vec x, Vec y) { void *ctx; cFdDomain *myctx; PetscErrorCode ierr; MatShellGetContext(m, &ctx); myctx = (cFdDomain*)ctx; //matrix-vector product ierr = myctx->myfunc(x, y); CHKERRQ(ierr); ierr = 0; return ierr; } PetscErrorCode cFdDomain::myfunc(Vec in, Vec out) { //some declaration ierr = VecGetArray(petsc_dcsv,&array_g); CHKERRQ(ierr); ierr = VecGetArrayRead(in, &array); CHKERRQ(ierr); //assign in to petsc_dcsv, only local cells for(iv=0; iv> Sent: 21 September 2022 14:36 To: feng wang > Cc: Jose E. Roman >; petsc-users at mcs.anl.gov > Subject: Re: [petsc-users] Slepc, shell matrix, parallel, halo exchange On Wed, Sep 21, 2022 at 10:35 AM feng wang > wrote: Hi Jose, For your 2nd suggestion on halo exchange, I get the idea and roughly know how to do it, but there are some implementation details which I am not quite sure. If I understand it correctly, in MatMult(Mat m ,Vec x, Vec y), Vec x is a normal parallel vector and it does not contain halo values. Suppose I create an auxiliary ghost vector x_g, then I assign the values of x to x_g. The values of the halo for each partition will not be assigned at this stage. But If I call VecGhostUpdateBegin/End(x_g, INSERT_VALUES, SCATTER_FORWARD), this will fill the values of the halo cells of x_g for each partition. Then x_g has local and halo cells assigned correctly and I can use x_g to do my computation. Is this what you mean? Yes Matt Thanks, Feng ________________________________ From: Jose E. 
Roman > Sent: 21 September 2022 13:07 To: feng wang > Cc: Matthew Knepley >; petsc-users at mcs.anl.gov > Subject: Re: [petsc-users] Slepc, shell matrix, parallel, halo exchange > El 21 sept 2022, a las 14:47, feng wang > escribi?: > > Thanks Jose, I will try this and will come back to this thread if I have any issue. > > Besides, for EPSGetEigenpair, I guess each rank gets its portion of the eigenvector, and I need to put them together afterwards? Eigenvectors are stored in parallel vectors, which are used in subsequent parallel computation in most applications. If for some reason you need to gather them in a single MPI process you can use e.g. VecScatterCreateToZero() > > Thanks, > Feng > > From: Jose E. Roman > > Sent: 21 September 2022 12:34 > To: feng wang > > Cc: Matthew Knepley >; petsc-users at mcs.anl.gov > > Subject: Re: [petsc-users] Slepc, shell matrix, parallel, halo exchange > > If you define the MATOP_CREATE_VECS operation in your shell matrix so that it creates a ghost vector, then all vectors within EPS will be ghost vectors, including those that are received as arguments of MatMult(). Not sure if this will work. > > A simpler solution is that you store a ghost vector in the context of your shell matrix, and then in MatMult() you receive a regular parallel vector x, then update the ghost points using the auxiliary ghost vector, do the computation and store the result in the regular parallel vector y. > > Jose > > > > El 21 sept 2022, a las 14:09, feng wang > escribi?: > > > > Thanks for your reply. > > > > For GMRES, I create a ghost vector and give it to KSPSolve. For Slepc, it only takes the shell matrix for EPSSetOperators. Suppose the shell matrix of the eigensolver defines MatMult(Mat m ,Vec x, Vec y), how does it know Vec x is a ghost vector and how many ghost cells there are? > > > > Thanks, > > Feng > > From: Matthew Knepley > > > Sent: 21 September 2022 11:58 > > To: feng wang > > > Cc: petsc-users at mcs.anl.gov > > > Subject: Re: [petsc-users] Slepc, shell matrix, parallel, halo exchange > > > > On Wed, Sep 21, 2022 at 7:41 AM feng wang > wrote: > > Hello, > > > > I am using Slepc with a shell matrix. The sequential version seems working and now I am trying to make it run in parallel. > > > > The partition of the domain is done, I am not sure how to do the halo exchange in the shell matrix in Slepc. I have a parallel version of matrix-free GMRES in my code with Petsc. I was using VecCreateGhostBlock to create vector with ghost cells, and then used VecGhostUpdateBegin/End for the halo exchange in the shell matrix, would this be the same for Slepc? > > > > That will be enough for the MatMult(). You would also have to use a SLEPc EPS that only needed MatMult(). > > > > Thanks, > > > > Matt > > > > Thanks, > > Feng > > > > > > > > > > -- > > What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. > > -- Norbert Wiener > > > > https://www.cse.buffalo.edu/~knepley/ -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From jroman at dsic.upv.es Mon Oct 10 10:49:28 2022 From: jroman at dsic.upv.es (Jose E. Roman) Date: Mon, 10 Oct 2022 17:49:28 +0200 Subject: [petsc-users] Slepc, shell matrix, parallel, halo exchange In-Reply-To: References: <53363D7B-CCBD-4DAB-924E-1D5D56975828@dsic.upv.es> <76162134-CDE9-42B9-8310-D9DD33D2F12D@dsic.upv.es> Message-ID: <7D7B8664-623C-4704-8E30-8EEEDC0C7FA5@dsic.upv.es> > El 10 oct 2022, a las 17:42, feng wang escribi?: > > Hi Mat, > > Thanks for your reply. It seems I have to use "VecSetValues" to assign the values of my ghost vector "petsc_dcsv". and then call VecAssemblyBegin/End. If I do it this way, the ghost cells are exchanged correctly. > > Besides, I notice that, when I run my code sequentially or with multiple processors, the produced eigenvalues are similar, but the number of iterations are different to reach the specified "-eps_tol" and the relative residuals are also slightly different. Is this normal? I am using the default Krylov-Schur solver and double precision. The number of iterations depends on the initial vector which is random by default, and the random vector is not the same when you change the number of processes. So yes, it is normal. If you want to get the same convergence history, use a constant initial vector set via EPSSetInitialSpace(), or alternatively use the undocumented option -bv_reproducible_random Jose > > Thanks, > Feng > From: Matthew Knepley > Sent: 09 October 2022 12:11 > To: feng wang > Cc: Jose E. Roman ; petsc-users at mcs.anl.gov > Subject: Re: [petsc-users] Slepc, shell matrix, parallel, halo exchange > > On Fri, Oct 7, 2022 at 5:48 PM feng wang wrote: > Hi Mat, > > I've tried the suggested approach. The halo cells are not exchanged somehow. Below is how I do it, have I missed anything? > > I create a ghost vector petsc_dcsv and it is a data member of the class cFdDomain, which is a context of the shell matrix. > > PetscCall(VecCreateGhostBlock(*A_COMM_WORLD, blocksize, blocksize*nlocal, PETSC_DECIDE ,nghost, ighost, &petsc_dcsv)); > > blocksize and nv have the same value. nlocal is number of local cells and nghost is number of halo cells. ighost contains the ghost cell index. > > Below is how I compute a matrix-vector product with a shell matrix > > PetscErrorCode cFdDomain::mymult_slepc(Mat m ,Vec x, Vec y) > { > void *ctx; > cFdDomain *myctx; > PetscErrorCode ierr; > > MatShellGetContext(m, &ctx); > myctx = (cFdDomain*)ctx; > > //matrix-vector product > ierr = myctx->myfunc(x, y); CHKERRQ(ierr); > > ierr = 0; > return ierr; > } > > > PetscErrorCode cFdDomain::myfunc(Vec in, Vec out) > { > //some declaration > > ierr = VecGetArray(petsc_dcsv,&array_g); CHKERRQ(ierr); > ierr = VecGetArrayRead(in, &array); CHKERRQ(ierr); > > //assign in to petsc_dcsv, only local cells > for(iv=0; iv { > for(iq=0; iq { > array_g[iv+nv*iq] = array[iv + nv*iq]; > } > } > > ierr = VecRestoreArray(petsc_dcsv,&array_g); CHKERRQ(ierr); > ierr = VecRestoreArrayRead(in, &array); CHKERRQ(ierr); > > //update halo cells? 
> PetscCall(VecGhostUpdateBegin(petsc_dcsv, INSERT_VALUES, SCATTER_FORWARD)); > PetscCall(VecGhostUpdateEnd(petsc_dcsv, INSERT_VALUES, SCATTER_FORWARD)); > PetscCall(VecGhostGetLocalForm(petsc_dcsv,&veclocal)); > > //read in v > ierr = VecGetArray(veclocal,&array_ghost); CHKERRQ(ierr); > for(iv=0; iv { > for(iq=0; iq { > jq = ilocal[iq]; > dq[iv][jq] = array_ghost[iv + nv*iq]; > } > > for(iq=nlocal; iq { > jq = ighost_local[iq-nlocal]; > dq[iv][jq] = array_ghost[iv + nv*iq]; > } > } > ierr = VecRestoreArray(veclocal,&array_ghost); CHKERRQ(ierr); > > //some computations > > PetscCall(VecGhostRestoreLocalForm(petsc_dcsv,&veclocal)); > } > > > so I fill the local part of the ghost vector petsc_dcsv for each rank and then call ghost update, and think this will update the halo cells. it seems not doing that. > > I can only think you are misinterpreting the result. There are many examples, such > > src/vec/tutorials/ex9.c (and ex9f.F) > > I would start there and try to change that into the communication you want, since it definitely works. I cannot > see a problem with the code snippet above. > > Thanks, > > Matt > > Thanks, > Feng > > From: Matthew Knepley > Sent: 21 September 2022 14:36 > To: feng wang > Cc: Jose E. Roman ; petsc-users at mcs.anl.gov > Subject: Re: [petsc-users] Slepc, shell matrix, parallel, halo exchange > > On Wed, Sep 21, 2022 at 10:35 AM feng wang wrote: > Hi Jose, > > For your 2nd suggestion on halo exchange, I get the idea and roughly know how to do it, but there are some implementation details which I am not quite sure. > > If I understand it correctly, in MatMult(Mat m ,Vec x, Vec y), Vec x is a normal parallel vector and it does not contain halo values. Suppose I create an auxiliary ghost vector x_g, then I assign the values of x to x_g. The values of the halo for each partition will not be assigned at this stage. > > But If I call VecGhostUpdateBegin/End(x_g, INSERT_VALUES, SCATTER_FORWARD), this will fill the values of the halo cells of x_g for each partition. Then x_g has local and halo cells assigned correctly and I can use x_g to do my computation. Is this what you mean? > > Yes > > Matt > > Thanks, > Feng > > From: Jose E. Roman > Sent: 21 September 2022 13:07 > To: feng wang > Cc: Matthew Knepley ; petsc-users at mcs.anl.gov > Subject: Re: [petsc-users] Slepc, shell matrix, parallel, halo exchange > > > > > El 21 sept 2022, a las 14:47, feng wang escribi?: > > > > Thanks Jose, I will try this and will come back to this thread if I have any issue. > > > > Besides, for EPSGetEigenpair, I guess each rank gets its portion of the eigenvector, and I need to put them together afterwards? > > Eigenvectors are stored in parallel vectors, which are used in subsequent parallel computation in most applications. If for some reason you need to gather them in a single MPI process you can use e.g. VecScatterCreateToZero() > > > > > Thanks, > > Feng > > > > From: Jose E. Roman > > Sent: 21 September 2022 12:34 > > To: feng wang > > Cc: Matthew Knepley ; petsc-users at mcs.anl.gov > > Subject: Re: [petsc-users] Slepc, shell matrix, parallel, halo exchange > > > > If you define the MATOP_CREATE_VECS operation in your shell matrix so that it creates a ghost vector, then all vectors within EPS will be ghost vectors, including those that are received as arguments of MatMult(). Not sure if this will work. 
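(A sketch of the MATOP_CREATE_VECS route described just above, with Jose's own caveat that it is not guaranteed to work for every vector EPS creates internally. It reuses the names from this thread: cFdDomain, A_COMM_WORLD, blocksize, nlocal, nghost and ighost are the user's shell-matrix context and ghost data, so treat them as placeholders.)

PetscErrorCode cFdDomain::mycreatevecs_slepc(Mat m, Vec *right, Vec *left)
{
    void      *ctx;
    cFdDomain *myctx;

    PetscFunctionBeginUser;
    PetscCall(MatShellGetContext(m, &ctx));
    myctx = (cFdDomain*)ctx;

    /* hand out ghosted vectors, so the vectors EPS later passes to MatMult() carry halo storage */
    if (right) PetscCall(VecCreateGhostBlock(*A_COMM_WORLD, myctx->blocksize, myctx->blocksize*myctx->nlocal, PETSC_DECIDE, myctx->nghost, myctx->ighost, right));
    if (left)  PetscCall(VecCreateGhostBlock(*A_COMM_WORLD, myctx->blocksize, myctx->blocksize*myctx->nlocal, PETSC_DECIDE, myctx->nghost, myctx->ighost, left));
    PetscFunctionReturn(0);
}

/* registered once on the shell matrix, next to the MATOP_MULT registration: */
PetscCall(MatShellSetOperation(A, MATOP_CREATE_VECS, (void (*)(void))cFdDomain::mycreatevecs_slepc));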
> > > > A simpler solution is that you store a ghost vector in the context of your shell matrix, and then in MatMult() you receive a regular parallel vector x, then update the ghost points using the auxiliary ghost vector, do the computation and store the result in the regular parallel vector y. > > > > Jose > > > > > > > El 21 sept 2022, a las 14:09, feng wang escribi?: > > > > > > Thanks for your reply. > > > > > > For GMRES, I create a ghost vector and give it to KSPSolve. For Slepc, it only takes the shell matrix for EPSSetOperators. Suppose the shell matrix of the eigensolver defines MatMult(Mat m ,Vec x, Vec y), how does it know Vec x is a ghost vector and how many ghost cells there are? > > > > > > Thanks, > > > Feng > > > From: Matthew Knepley > > > Sent: 21 September 2022 11:58 > > > To: feng wang > > > Cc: petsc-users at mcs.anl.gov > > > Subject: Re: [petsc-users] Slepc, shell matrix, parallel, halo exchange > > > > > > On Wed, Sep 21, 2022 at 7:41 AM feng wang wrote: > > > Hello, > > > > > > I am using Slepc with a shell matrix. The sequential version seems working and now I am trying to make it run in parallel. > > > > > > The partition of the domain is done, I am not sure how to do the halo exchange in the shell matrix in Slepc. I have a parallel version of matrix-free GMRES in my code with Petsc. I was using VecCreateGhostBlock to create vector with ghost cells, and then used VecGhostUpdateBegin/End for the halo exchange in the shell matrix, would this be the same for Slepc? > > > > > > That will be enough for the MatMult(). You would also have to use a SLEPc EPS that only needed MatMult(). > > > > > > Thanks, > > > > > > Matt > > > > > > Thanks, > > > Feng > > > > > > > > > > > > > > > -- > > > What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. > > > -- Norbert Wiener > > > > > > https://www.cse.buffalo.edu/~knepley/ > > > > -- > What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. > -- Norbert Wiener > > https://www.cse.buffalo.edu/~knepley/ > > > -- > What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. > -- Norbert Wiener > > https://www.cse.buffalo.edu/~knepley/ From knepley at gmail.com Mon Oct 10 10:52:35 2022 From: knepley at gmail.com (Matthew Knepley) Date: Mon, 10 Oct 2022 11:52:35 -0400 Subject: [petsc-users] Slepc, shell matrix, parallel, halo exchange In-Reply-To: References: <53363D7B-CCBD-4DAB-924E-1D5D56975828@dsic.upv.es> <76162134-CDE9-42B9-8310-D9DD33D2F12D@dsic.upv.es> Message-ID: On Mon, Oct 10, 2022 at 11:42 AM feng wang wrote: > Hi Mat, > > Thanks for your reply. It seems I have to use "VecSetValues" to assign the > values of my ghost vector "petsc_dcsv". and then call VecAssemblyBegin/End. > If I do it this way, the ghost cells are exchanged correctly. > This should only be true if you are modifying off-process values. If not, that does not seem right. Thanks, Matt > Besides, I notice that, when I run my code sequentially or with multiple > processors, the produced eigenvalues are similar, but the number of > iterations are different to reach the specified "-eps_tol" and the relative > residuals are also slightly different. Is this normal? I am using the > default Krylov-Schur solver and double precision. 
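(On the iteration-count question just above: as Jose explains in his reply, the default starting vector is random and changes with the number of processes, so the convergence history differs while the eigenvalues agree. A minimal sketch of the deterministic-start fix he mentions, assuming eps is the EPS object and A the shell matrix from this thread; any fixed nonzero vector will do.)

Vec v0;

PetscCall(MatCreateVecs(A, &v0, NULL));
PetscCall(VecSet(v0, 1.0));                 /* fixed, rank-independent starting vector */
PetscCall(EPSSetInitialSpace(eps, 1, &v0));
PetscCall(VecDestroy(&v0));
/* then EPSSolve(eps) as before; alternatively keep the random start and run with
   -bv_reproducible_random to make it independent of the number of processes */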
> > Thanks, > Feng > ------------------------------ > *From:* Matthew Knepley > *Sent:* 09 October 2022 12:11 > *To:* feng wang > *Cc:* Jose E. Roman ; petsc-users at mcs.anl.gov < > petsc-users at mcs.anl.gov> > *Subject:* Re: [petsc-users] Slepc, shell matrix, parallel, halo exchange > > On Fri, Oct 7, 2022 at 5:48 PM feng wang wrote: > > Hi Mat, > > I've tried the suggested approach. The halo cells are not exchanged > somehow. Below is how I do it, have I missed anything? > > I create a ghost vector *petsc_dcsv* and it is a data member of the class > cFdDomain, which is a context of the shell matrix. > > * PetscCall(VecCreateGhostBlock(*A_COMM_WORLD, blocksize, > blocksize*nlocal, PETSC_DECIDE ,nghost, ighost, &petsc_dcsv));* > > blocksize and nv have the same value. nlocal is number of local cells and > nghost is number of halo cells. ighost contains the ghost cell index. > > Below is how I compute a matrix-vector product with a shell matrix > > * PetscErrorCode cFdDomain::mymult_slepc(Mat m ,Vec x, Vec y)* > * {* > * void *ctx;* > * cFdDomain *myctx;* > * PetscErrorCode ierr;* > > * MatShellGetContext(m, &ctx);* > * myctx = (cFdDomain*)ctx;* > > *//matrix-vector product* > * ierr = myctx->myfunc(x, y); CHKERRQ(ierr);* > > * ierr = 0;* > * return ierr;* > * }* > > > * PetscErrorCode cFdDomain::myfunc(Vec in, Vec out)* > * {* > > *//some declaration * > > * ierr = VecGetArray(petsc_dcsv,&array_g); CHKERRQ(ierr);* > * ierr = VecGetArrayRead(in, &array); CHKERRQ(ierr);* > > * //assign in to petsc_dcsv, only local cells* > * for(iv=0; iv * {* > * for(iq=0; iq * {* > * array_g[iv+nv*iq] = array[iv + nv*iq];* > * }* > * }* > > * ierr = VecRestoreArray(petsc_dcsv,&array_g); CHKERRQ(ierr);* > * ierr = VecRestoreArrayRead(in, &array); CHKERRQ(ierr);* > > * //update halo cells?* > * PetscCall(VecGhostUpdateBegin(petsc_dcsv, INSERT_VALUES, > SCATTER_FORWARD));* > * PetscCall(VecGhostUpdateEnd(petsc_dcsv, INSERT_VALUES, > SCATTER_FORWARD));* > * PetscCall(VecGhostGetLocalForm(petsc_dcsv,&veclocal));* > > *//read in v* > * ierr = VecGetArray(veclocal,&array_ghost); CHKERRQ(ierr);* > * for(iv=0; iv * {* > * for(iq=0; iq * {* > * jq = ilocal[iq];* > * dq[iv][jq] = array_ghost[iv + nv*iq];* > * }* > > * for(iq=nlocal; iq * {* > * jq = ighost_local[iq-nlocal];* > * dq[iv][jq] = array_ghost[iv + nv*iq];* > * }* > * }* > * ierr = VecRestoreArray(veclocal,&array_ghost); CHKERRQ(ierr);* > > > * //some computations * > > > * PetscCall(VecGhostRestoreLocalForm(petsc_dcsv,&veclocal)); * > * }* > > > so I fill the local part of the ghost vector *petsc_dcsv* for each rank > and then call ghost update, and think this will update the halo cells. it > seems not doing that. > > > I can only think you are misinterpreting the result. There are many > examples, such > > src/vec/tutorials/ex9.c (and ex9f.F) > > I would start there and try to change that into the communication you > want, since it definitely works. I cannot > see a problem with the code snippet above. > > Thanks, > > Matt > > > Thanks, > Feng > > ------------------------------ > *From:* Matthew Knepley > *Sent:* 21 September 2022 14:36 > *To:* feng wang > *Cc:* Jose E. Roman ; petsc-users at mcs.anl.gov < > petsc-users at mcs.anl.gov> > *Subject:* Re: [petsc-users] Slepc, shell matrix, parallel, halo exchange > > On Wed, Sep 21, 2022 at 10:35 AM feng wang wrote: > > Hi Jose, > > For your 2nd suggestion on halo exchange, I get the idea and roughly know > how to do it, but there are some implementation details which I am not > quite sure. 
> > If I understand it correctly, in MatMult(Mat m ,Vec x, Vec y), Vec *x* is > a normal parallel vector and it does not contain halo values. Suppose I > create an auxiliary ghost vector * x_g*, then I assign the values of *x* > to *x_g*. The values of the halo for each partition will not be assigned > at this stage. > > But If I call VecGhostUpdateBegin/End(*x_g*, INSERT_VALUES, > SCATTER_FORWARD), this will fill the values of the halo cells of *x_g *for > each partition. Then *x_g* has local and halo cells assigned correctly > and I can use *x_g* to do my computation. Is this what you mean? > > > Yes > > Matt > > > Thanks, > Feng > > ------------------------------ > *From:* Jose E. Roman > *Sent:* 21 September 2022 13:07 > *To:* feng wang > *Cc:* Matthew Knepley ; petsc-users at mcs.anl.gov < > petsc-users at mcs.anl.gov> > *Subject:* Re: [petsc-users] Slepc, shell matrix, parallel, halo exchange > > > > > El 21 sept 2022, a las 14:47, feng wang > escribi?: > > > > Thanks Jose, I will try this and will come back to this thread if I have > any issue. > > > > Besides, for EPSGetEigenpair, I guess each rank gets its portion of the > eigenvector, and I need to put them together afterwards? > > Eigenvectors are stored in parallel vectors, which are used in subsequent > parallel computation in most applications. If for some reason you need to > gather them in a single MPI process you can use e.g. > VecScatterCreateToZero() > > > > > Thanks, > > Feng > > > > From: Jose E. Roman > > Sent: 21 September 2022 12:34 > > To: feng wang > > Cc: Matthew Knepley ; petsc-users at mcs.anl.gov < > petsc-users at mcs.anl.gov> > > Subject: Re: [petsc-users] Slepc, shell matrix, parallel, halo exchange > > > > If you define the MATOP_CREATE_VECS operation in your shell matrix so > that it creates a ghost vector, then all vectors within EPS will be ghost > vectors, including those that are received as arguments of MatMult(). Not > sure if this will work. > > > > A simpler solution is that you store a ghost vector in the context of > your shell matrix, and then in MatMult() you receive a regular parallel > vector x, then update the ghost points using the auxiliary ghost vector, do > the computation and store the result in the regular parallel vector y. > > > > Jose > > > > > > > El 21 sept 2022, a las 14:09, feng wang > escribi?: > > > > > > Thanks for your reply. > > > > > > For GMRES, I create a ghost vector and give it to KSPSolve. For Slepc, > it only takes the shell matrix for EPSSetOperators. Suppose the shell > matrix of the eigensolver defines MatMult(Mat m ,Vec x, Vec y), how does it > know Vec x is a ghost vector and how many ghost cells there are? > > > > > > Thanks, > > > Feng > > > From: Matthew Knepley > > > Sent: 21 September 2022 11:58 > > > To: feng wang > > > Cc: petsc-users at mcs.anl.gov > > > Subject: Re: [petsc-users] Slepc, shell matrix, parallel, halo exchange > > > > > > On Wed, Sep 21, 2022 at 7:41 AM feng wang > wrote: > > > Hello, > > > > > > I am using Slepc with a shell matrix. The sequential version seems > working and now I am trying to make it run in parallel. > > > > > > The partition of the domain is done, I am not sure how to do the halo > exchange in the shell matrix in Slepc. I have a parallel version of > matrix-free GMRES in my code with Petsc. I was using VecCreateGhostBlock to > create vector with ghost cells, and then used VecGhostUpdateBegin/End for > the halo exchange in the shell matrix, would this be the same for Slepc? 
> > > > > > That will be enough for the MatMult(). You would also have to use a > SLEPc EPS that only needed MatMult(). > > > > > > Thanks, > > > > > > Matt > > > > > > Thanks, > > > Feng > > > > > > > > > > > > > > > -- > > > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > > > -- Norbert Wiener > > > > > > https://www.cse.buffalo.edu/~knepley/ > > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > > https://www.cse.buffalo.edu/~knepley/ > > > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > > https://www.cse.buffalo.edu/~knepley/ > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From alexlindsay239 at gmail.com Mon Oct 10 16:36:53 2022 From: alexlindsay239 at gmail.com (Alexander Lindsay) Date: Mon, 10 Oct 2022 14:36:53 -0700 Subject: [petsc-users] MSPIN Message-ID: I know that PETSc has native support for ASPIN. Has anyone tried MSPIN? I wouldn't be surprised if someone has implemented it in user code. Wondering what the barriers would be to creating an option like `-snes_type mspin` ? -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Mon Oct 10 17:47:23 2022 From: knepley at gmail.com (Matthew Knepley) Date: Mon, 10 Oct 2022 18:47:23 -0400 Subject: [petsc-users] MSPIN In-Reply-To: References: Message-ID: On Mon, Oct 10, 2022 at 5:37 PM Alexander Lindsay wrote: > I know that PETSc has native support for ASPIN. Has anyone tried MSPIN? I > wouldn't be surprised if someone has implemented it in user code. Wondering > what the barriers would be to creating an option like `-snes_type mspin` ? > David Keyes, LuLu Liu, and collaborators have several papers on MSPIN. It does work well in many circumstances. ASPIN is easy for PETSc because it only involves generating sub-Jacobians. MSPIN needs nonlinear subsystems. This is not possible with the traditional PETSc SNES callback interface. It is just barely possible using all the experimental stuff in Plex. You need the ability to subset the domain, setup a nonlinear problem with the same equations and boundary conditions (and normally homogeneous Dirichlet on the internal boundary), and usually linearizations of this subproblem. This means abstractions for the mesh, equations, boundary conditions, and linearizations that can be transferred onto a new subdomain. Thanks, Matt -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From mi.mike1021 at gmail.com Mon Oct 10 21:41:13 2022 From: mi.mike1021 at gmail.com (Mike Michell) Date: Tue, 11 Oct 2022 11:41:13 +0900 Subject: [petsc-users] DMLocalToLocal with DMPlex in Fortran In-Reply-To: References: Message-ID: Hi, I was wondering if there is any comment on the example file that I can refer to. Thanks, Mike > Thank you for the reply. > Sure, a short example code is attached here with a square box mesh and a > run script. > Inside the source, you may find two versions of halo exchange; one is for > local to global (Version-1) and another one is for local to local > (Version-2), which is not working in my case. > In the output.vtu, you will see the halo exchanged vector resolved to each > vertex with (myrank + 1), so if the code is running with 2procs, at the > parallel boundary, you will see 3. In this example, there is no ghost layer. > > Thanks, > Mike > > >> On Sat, Oct 1, 2022 at 8:51 PM Mike Michell >> wrote: >> >>> Thank you for the reply. There is that file in >>> src/dm/interface/ftn-auto/ for me, instead of the path you mentioned. >>> >>> After "make allfortranstubs" was done and, PETSc reconfigured and >>> reinstalled. >>> >>> However, I still have the same problem at the line in which >>> DMLocalToLocalBegin() is used. What I am doing to setup halo exchange is as >>> follows; >>> - declare DMPlex >>> - PetscSectionCreate() >>> - PetscSectionSetNumFields() >>> - PetscSectionSetFieldComponents() >>> - PetscSectionSetChart() >>> - do loop over dofs: PetscSectionSetDof() and PetscSectionSetFieldDof() >>> - PetscSectionSetUp() >>> - DMSetLocalSection() >>> - PetscSectionDestroy() >>> - DMGetSectionSF() >>> - PetscSFSetUp() >>> >>> Then, the halo exchange is performed as follows; >>> - DMGetLocalVector() >>> - Fill the local vector >>> - DMLocalToLocalBegin() --(*) >>> - DMLocalToLocalEnd() >>> - DMRestoreLocalVector() >>> >>> Then, the code crashes at (*). >>> >> >> Can you send something I can run? Then I will find the problem and fix it. >> >> Thanks, >> >> Matt >> >> >>> Previously(at the time PETSc did not support LocalToLocal for DMPlex in >>> Fortran), the part above, "DMLocalToLocalBegin() and DMLocalToLocalEnd()", >>> consisted of; >>> - DMLocalToGlobalBegin() >>> - DMLocalToGlobalEnd() >>> - DMGlobalToLocalBegin() >>> - DMGlobalToLocalEnd() >>> and it worked okay. >>> >>> I am unclear which part is causing the problem. Shall I define the >>> PetscSection and PetscSF in different ways in case of Local to Local Halo >>> exchange? >>> Any comment will be appreciated. >>> >>> Thanks, >>> Mike >>> >>> >>> >>>> On Fri, Sep 30, 2022 at 4:14 PM Mike Michell >>>> wrote: >>>> >>>>> Hi, >>>>> >>>>> As a follow-up to this email thread, >>>>> https://www.mail-archive.com/petsc-users at mcs.anl.gov/msg44070.html >>>>> >>>>> Are DMLocalToLocalBegin() and DMLocalToLocalEnd() really available for >>>>> DMPlex with Fortran on the latest version of PETSc (3.17.99 from GitLab)? >>>>> Matt commented that the Fortran bindings were updated so that those >>>>> functions must be available in the latest version of PETSc, however, it >>>>> seems still they are not working from my test with DMPlex in Fortran. Can >>>>> anyone provide some comments? Probably I am missing some mandatory header >>>>> file? 
Currently, I have headers; >>>>> >>>>> #include "petsc/finclude/petscvec.h" >>>>> #include "petsc/finclude/petscdmplex.h" >>>>> #include "petsc/finclude/petscdmlabel.h" >>>>> #include "petsc/finclude/petscdm.h" >>>>> >>>> >>>> The source for these functions is in >>>> >>>> src/dm/ftn-auto/dmf.c >>>> >>>> Is it there for you? If not, you can run >>>> >>>> make allfortranstubs >>>> >>>> Fortran functions are not declared, so the header should not matter for >>>> compilation, just the libraries for linking. >>>> >>>> Thanks, >>>> >>>> Matt >>>> >>>> >>>>> Thanks, >>>>> Mike >>>>> >>>> >>>> >>>> -- >>>> What most experimenters take for granted before they begin their >>>> experiments is infinitely more interesting than any results to which their >>>> experiments lead. >>>> -- Norbert Wiener >>>> >>>> https://www.cse.buffalo.edu/~knepley/ >>>> >>>> >>> >> >> -- >> What most experimenters take for granted before they begin their >> experiments is infinitely more interesting than any results to which their >> experiments lead. >> -- Norbert Wiener >> >> https://www.cse.buffalo.edu/~knepley/ >> >> > -------------- next part -------------- An HTML attachment was scrubbed... URL: From zhan2355 at purdue.edu Mon Oct 10 22:54:28 2022 From: zhan2355 at purdue.edu (Sijie Zhang) Date: Tue, 11 Oct 2022 03:54:28 +0000 Subject: [petsc-users] make all check error Message-ID: Hi, When I try to install petsc on my PC and run the make all check commend it has the following error. Can you help me to troubleshoot that? Thanks. Sijie ++++++++++++++++++++++++++++++++++++++++++++++++ Running check examples to verify correct installation Using PETSC_DIR=/home/zhangsijie1995/Documents/Packages/petsc-3.18.0 and PETSC_ARCH=arch-linux-c-debug Possible error running C/C++ src/snes/tutorials/ex19 with 1 MPI process See https://petsc.org/release/faq/ hwloc/linux: Ignoring PCI device with non-16bit domain. Pass --enable-32bits-pci-domain to configure to support such devices (warning: it would break the library ABI, don't enable unless really needed). lid velocity = 0.0016, prandtl # = 1., grashof # = 1. Number of SNES iterations = 2 Possible error running C/C++ src/snes/tutorials/ex19 with 2 MPI processes See https://petsc.org/release/faq/ hwloc/linux: Ignoring PCI device with non-16bit domain. Pass --enable-32bits-pci-domain to configure to support such devices (warning: it would break the library ABI, don't enable unless really needed). hwloc/linux: Ignoring PCI device with non-16bit domain. Pass --enable-32bits-pci-domain to configure to support such devices (warning: it would break the library ABI, don't enable unless really needed). lid velocity = 0.0016, prandtl # = 1., grashof # = 1. Number of SNES iterations = 2 Possible error running Fortran example src/snes/tutorials/ex5f with 1 MPI process See https://petsc.org/release/faq/ hwloc/linux: Ignoring PCI device with non-16bit domain. Pass --enable-32bits-pci-domain to configure to support such devices (warning: it would break the library ABI, don't enable unless really needed). Number of SNES iterations = 3 Completed test examples Error while running make check gmake[1]: *** [makefile:149: check] Error 1 make: *** [GNUmakefile:17: check] Error 2 ++++++++++++++++++++++++++++++++++++++++++++++++ -------------- next part -------------- A non-text attachment was scrubbed... Name: configure.log Type: text/x-log Size: 1269287 bytes Desc: configure.log URL: -------------- next part -------------- A non-text attachment was scrubbed... 
Name: make.log Type: text/x-log Size: 113844 bytes Desc: make.log URL: From knepley at gmail.com Tue Oct 11 04:10:50 2022 From: knepley at gmail.com (Matthew Knepley) Date: Tue, 11 Oct 2022 05:10:50 -0400 Subject: [petsc-users] make all check error In-Reply-To: References: Message-ID: hwloc is giving this warning on your machine: hwloc/linux: Ignoring PCI device with non-16bit domain. Pass --enable-32bits-pci-domain to configure to support such devices (warning: it would break the library ABI, don't enable unless really needed). The PETSc results are fine, so you can use your installation. I have not seen the warning before. Thanks, Matt On Tue, Oct 11, 2022 at 12:32 AM Sijie Zhang wrote: > Hi, > > When I try to install petsc on my PC and run the make all check commend it > has the following error. Can you help me to troubleshoot that? > > Thanks. > > Sijie > > ++++++++++++++++++++++++++++++++++++++++++++++++ > Running check examples to verify correct installation > Using PETSC_DIR=/home/zhangsijie1995/Documents/Packages/petsc-3.18.0 and > PETSC_ARCH=arch-linux-c-debug > Possible error running C/C++ src/snes/tutorials/ex19 with 1 MPI process > See https://petsc.org/release/faq/ > hwloc/linux: Ignoring PCI device with non-16bit domain. > Pass --enable-32bits-pci-domain to configure to support such devices > (warning: it would break the library ABI, don't enable unless really > needed). > lid velocity = 0.0016, prandtl # = 1., grashof # = 1. > Number of SNES iterations = 2 > Possible error running C/C++ src/snes/tutorials/ex19 with 2 MPI processes > See https://petsc.org/release/faq/ > hwloc/linux: Ignoring PCI device with non-16bit domain. > Pass --enable-32bits-pci-domain to configure to support such devices > (warning: it would break the library ABI, don't enable unless really > needed). > hwloc/linux: Ignoring PCI device with non-16bit domain. > Pass --enable-32bits-pci-domain to configure to support such devices > (warning: it would break the library ABI, don't enable unless really > needed). > lid velocity = 0.0016, prandtl # = 1., grashof # = 1. > Number of SNES iterations = 2 > Possible error running Fortran example src/snes/tutorials/ex5f with 1 MPI > process > See https://petsc.org/release/faq/ > hwloc/linux: Ignoring PCI device with non-16bit domain. > Pass --enable-32bits-pci-domain to configure to support such devices > (warning: it would break the library ABI, don't enable unless really > needed). > Number of SNES iterations = 3 > Completed test examples > Error while running make check > gmake[1]: *** [makefile:149: check] Error 1 > make: *** [GNUmakefile:17: check] Error 2 > ++++++++++++++++++++++++++++++++++++++++++++++++ > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From bsmith at petsc.dev Tue Oct 11 07:52:02 2022 From: bsmith at petsc.dev (Barry Smith) Date: Tue, 11 Oct 2022 08:52:02 -0400 Subject: [petsc-users] make all check error In-Reply-To: References: Message-ID: https://petsc.org/release/faq/#what-does-the-message-hwloc-linux-ignoring-pci-device-with-non-16bit-domain-mean > On Oct 10, 2022, at 11:54 PM, Sijie Zhang wrote: > > Hi, > > When I try to install petsc on my PC and run the make all check commend it has the following error. Can you help me to troubleshoot that? > > Thanks. 
> > Sijie > > ++++++++++++++++++++++++++++++++++++++++++++++++ > Running check examples to verify correct installation > Using PETSC_DIR=/home/zhangsijie1995/Documents/Packages/petsc-3.18.0 and PETSC_ARCH=arch-linux-c-debug > Possible error running C/C++ src/snes/tutorials/ex19 with 1 MPI process > See https://petsc.org/release/faq/ > hwloc/linux: Ignoring PCI device with non-16bit domain. > Pass --enable-32bits-pci-domain to configure to support such devices > (warning: it would break the library ABI, don't enable unless really needed). > lid velocity = 0.0016, prandtl # = 1., grashof # = 1. > Number of SNES iterations = 2 > Possible error running C/C++ src/snes/tutorials/ex19 with 2 MPI processes > See https://petsc.org/release/faq/ > hwloc/linux: Ignoring PCI device with non-16bit domain. > Pass --enable-32bits-pci-domain to configure to support such devices > (warning: it would break the library ABI, don't enable unless really needed). > hwloc/linux: Ignoring PCI device with non-16bit domain. > Pass --enable-32bits-pci-domain to configure to support such devices > (warning: it would break the library ABI, don't enable unless really needed). > lid velocity = 0.0016, prandtl # = 1., grashof # = 1. > Number of SNES iterations = 2 > Possible error running Fortran example src/snes/tutorials/ex5f with 1 MPI process > See https://petsc.org/release/faq/ > hwloc/linux: Ignoring PCI device with non-16bit domain. > Pass --enable-32bits-pci-domain to configure to support such devices > (warning: it would break the library ABI, don't enable unless really needed). > Number of SNES iterations = 3 > Completed test examples > Error while running make check > gmake[1]: *** [makefile:149: check] Error 1 > make: *** [GNUmakefile:17: check] Error 2 > ++++++++++++++++++++++++++++++++++++++++++++++++ > From knepley at gmail.com Tue Oct 11 08:09:52 2022 From: knepley at gmail.com (Matthew Knepley) Date: Tue, 11 Oct 2022 09:09:52 -0400 Subject: [petsc-users] make all check error In-Reply-To: References: Message-ID: On Tue, Oct 11, 2022 at 8:52 AM Barry Smith wrote: > > > https://petsc.org/release/faq/#what-does-the-message-hwloc-linux-ignoring-pci-device-with-non-16bit-domain-mean Ah. Obscure environment variables, not present in the error message, are always the best way to deal with this. Stay classy, hwloc! Matt > > > On Oct 10, 2022, at 11:54 PM, Sijie Zhang wrote: > > > > Hi, > > > > When I try to install petsc on my PC and run the make all check commend > it has the following error. Can you help me to troubleshoot that? > > > > Thanks. > > > > Sijie > > > > ++++++++++++++++++++++++++++++++++++++++++++++++ > > Running check examples to verify correct installation > > Using PETSC_DIR=/home/zhangsijie1995/Documents/Packages/petsc-3.18.0 and > PETSC_ARCH=arch-linux-c-debug > > Possible error running C/C++ src/snes/tutorials/ex19 with 1 MPI process > > See https://petsc.org/release/faq/ > > hwloc/linux: Ignoring PCI device with non-16bit domain. > > Pass --enable-32bits-pci-domain to configure to support such devices > > (warning: it would break the library ABI, don't enable unless really > needed). > > lid velocity = 0.0016, prandtl # = 1., grashof # = 1. > > Number of SNES iterations = 2 > > Possible error running C/C++ src/snes/tutorials/ex19 with 2 MPI processes > > See https://petsc.org/release/faq/ > > hwloc/linux: Ignoring PCI device with non-16bit domain. 
> > Pass --enable-32bits-pci-domain to configure to support such devices > > (warning: it would break the library ABI, don't enable unless really > needed). > > hwloc/linux: Ignoring PCI device with non-16bit domain. > > Pass --enable-32bits-pci-domain to configure to support such devices > > (warning: it would break the library ABI, don't enable unless really > needed). > > lid velocity = 0.0016, prandtl # = 1., grashof # = 1. > > Number of SNES iterations = 2 > > Possible error running Fortran example src/snes/tutorials/ex5f with 1 > MPI process > > See https://petsc.org/release/faq/ > > hwloc/linux: Ignoring PCI device with non-16bit domain. > > Pass --enable-32bits-pci-domain to configure to support such devices > > (warning: it would break the library ABI, don't enable unless really > needed). > > Number of SNES iterations = 3 > > Completed test examples > > Error while running make check > > gmake[1]: *** [makefile:149: check] Error 1 > > make: *** [GNUmakefile:17: check] Error 2 > > ++++++++++++++++++++++++++++++++++++++++++++++++ > > > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From balay at mcs.anl.gov Tue Oct 11 08:17:47 2022 From: balay at mcs.anl.gov (Satish Balay) Date: Tue, 11 Oct 2022 08:17:47 -0500 (CDT) Subject: [petsc-users] make all check error In-Reply-To: References: Message-ID: <4c1189b9-1f3a-90ab-6aa6-097fb7f8dd1c@mcs.anl.gov> Alternative: build/use mpich without hwloc. i.e: --download-mpich=1 --with-hwloc=0 Satish On Tue, 11 Oct 2022, Matthew Knepley wrote: > On Tue, Oct 11, 2022 at 8:52 AM Barry Smith wrote: > > > > > > > https://petsc.org/release/faq/#what-does-the-message-hwloc-linux-ignoring-pci-device-with-non-16bit-domain-mean > > > Ah. Obscure environment variables, not present in the error message, are > always the best way to deal with this. Stay classy, hwloc! > > Matt > > > > > > > On Oct 10, 2022, at 11:54 PM, Sijie Zhang wrote: > > > > > > Hi, > > > > > > When I try to install petsc on my PC and run the make all check commend > > it has the following error. Can you help me to troubleshoot that? > > > > > > Thanks. > > > > > > Sijie > > > > > > ++++++++++++++++++++++++++++++++++++++++++++++++ > > > Running check examples to verify correct installation > > > Using PETSC_DIR=/home/zhangsijie1995/Documents/Packages/petsc-3.18.0 and > > PETSC_ARCH=arch-linux-c-debug > > > Possible error running C/C++ src/snes/tutorials/ex19 with 1 MPI process > > > See https://petsc.org/release/faq/ > > > hwloc/linux: Ignoring PCI device with non-16bit domain. > > > Pass --enable-32bits-pci-domain to configure to support such devices > > > (warning: it would break the library ABI, don't enable unless really > > needed). > > > lid velocity = 0.0016, prandtl # = 1., grashof # = 1. > > > Number of SNES iterations = 2 > > > Possible error running C/C++ src/snes/tutorials/ex19 with 2 MPI processes > > > See https://petsc.org/release/faq/ > > > hwloc/linux: Ignoring PCI device with non-16bit domain. > > > Pass --enable-32bits-pci-domain to configure to support such devices > > > (warning: it would break the library ABI, don't enable unless really > > needed). > > > hwloc/linux: Ignoring PCI device with non-16bit domain. 
> > > Pass --enable-32bits-pci-domain to configure to support such devices > > > (warning: it would break the library ABI, don't enable unless really > > needed). > > > lid velocity = 0.0016, prandtl # = 1., grashof # = 1. > > > Number of SNES iterations = 2 > > > Possible error running Fortran example src/snes/tutorials/ex5f with 1 > > MPI process > > > See https://petsc.org/release/faq/ > > > hwloc/linux: Ignoring PCI device with non-16bit domain. > > > Pass --enable-32bits-pci-domain to configure to support such devices > > > (warning: it would break the library ABI, don't enable unless really > > needed). > > > Number of SNES iterations = 3 > > > Completed test examples > > > Error while running make check > > > gmake[1]: *** [makefile:149: check] Error 1 > > > make: *** [GNUmakefile:17: check] Error 2 > > > ++++++++++++++++++++++++++++++++++++++++++++++++ > > > > > > > > > From knepley at gmail.com Tue Oct 11 09:54:58 2022 From: knepley at gmail.com (Matthew Knepley) Date: Tue, 11 Oct 2022 10:54:58 -0400 Subject: [petsc-users] DMLocalToLocal with DMPlex in Fortran In-Reply-To: References: Message-ID: On Mon, Oct 10, 2022 at 10:41 PM Mike Michell wrote: > Hi, I was wondering if there is any comment on the example file that I can > refer to. > I see the problem. Local2Local is not implemented for Plex. I thought we had this automated, but it was only coded for DMDA. It is a fairly mechanical transformation of the Global2Local, just remapping indices, but it will take some time since there is a lot of backlog this semester. I have fixed the error message so now it is obvious what the problem is. Thanks, Matt > Thanks, > Mike > > >> Thank you for the reply. >> Sure, a short example code is attached here with a square box mesh and a >> run script. >> Inside the source, you may find two versions of halo exchange; one is for >> local to global (Version-1) and another one is for local to local >> (Version-2), which is not working in my case. >> In the output.vtu, you will see the halo exchanged vector resolved to >> each vertex with (myrank + 1), so if the code is running with 2procs, at >> the parallel boundary, you will see 3. In this example, there is no ghost >> layer. >> >> Thanks, >> Mike >> >> >>> On Sat, Oct 1, 2022 at 8:51 PM Mike Michell >>> wrote: >>> >>>> Thank you for the reply. There is that file in >>>> src/dm/interface/ftn-auto/ for me, instead of the path you mentioned. >>>> >>>> After "make allfortranstubs" was done and, PETSc reconfigured and >>>> reinstalled. >>>> >>>> However, I still have the same problem at the line in which >>>> DMLocalToLocalBegin() is used. What I am doing to setup halo exchange is as >>>> follows; >>>> - declare DMPlex >>>> - PetscSectionCreate() >>>> - PetscSectionSetNumFields() >>>> - PetscSectionSetFieldComponents() >>>> - PetscSectionSetChart() >>>> - do loop over dofs: PetscSectionSetDof() and PetscSectionSetFieldDof() >>>> - PetscSectionSetUp() >>>> - DMSetLocalSection() >>>> - PetscSectionDestroy() >>>> - DMGetSectionSF() >>>> - PetscSFSetUp() >>>> >>>> Then, the halo exchange is performed as follows; >>>> - DMGetLocalVector() >>>> - Fill the local vector >>>> - DMLocalToLocalBegin() --(*) >>>> - DMLocalToLocalEnd() >>>> - DMRestoreLocalVector() >>>> >>>> Then, the code crashes at (*). >>>> >>> >>> Can you send something I can run? Then I will find the problem and fix >>> it. 
>>> >>> Thanks, >>> >>> Matt >>> >>> >>>> Previously(at the time PETSc did not support LocalToLocal for DMPlex in >>>> Fortran), the part above, "DMLocalToLocalBegin() and DMLocalToLocalEnd()", >>>> consisted of; >>>> - DMLocalToGlobalBegin() >>>> - DMLocalToGlobalEnd() >>>> - DMGlobalToLocalBegin() >>>> - DMGlobalToLocalEnd() >>>> and it worked okay. >>>> >>>> I am unclear which part is causing the problem. Shall I define the >>>> PetscSection and PetscSF in different ways in case of Local to Local Halo >>>> exchange? >>>> Any comment will be appreciated. >>>> >>>> Thanks, >>>> Mike >>>> >>>> >>>> >>>>> On Fri, Sep 30, 2022 at 4:14 PM Mike Michell >>>>> wrote: >>>>> >>>>>> Hi, >>>>>> >>>>>> As a follow-up to this email thread, >>>>>> https://www.mail-archive.com/petsc-users at mcs.anl.gov/msg44070.html >>>>>> >>>>>> Are DMLocalToLocalBegin() and DMLocalToLocalEnd() really available >>>>>> for DMPlex with Fortran on the latest version of PETSc (3.17.99 from >>>>>> GitLab)? Matt commented that the Fortran bindings were updated so that >>>>>> those functions must be available in the latest version of PETSc, however, >>>>>> it seems still they are not working from my test with DMPlex in Fortran. >>>>>> Can anyone provide some comments? Probably I am missing some mandatory >>>>>> header file? Currently, I have headers; >>>>>> >>>>>> #include "petsc/finclude/petscvec.h" >>>>>> #include "petsc/finclude/petscdmplex.h" >>>>>> #include "petsc/finclude/petscdmlabel.h" >>>>>> #include "petsc/finclude/petscdm.h" >>>>>> >>>>> >>>>> The source for these functions is in >>>>> >>>>> src/dm/ftn-auto/dmf.c >>>>> >>>>> Is it there for you? If not, you can run >>>>> >>>>> make allfortranstubs >>>>> >>>>> Fortran functions are not declared, so the header should not matter >>>>> for compilation, just the libraries for linking. >>>>> >>>>> Thanks, >>>>> >>>>> Matt >>>>> >>>>> >>>>>> Thanks, >>>>>> Mike >>>>>> >>>>> >>>>> >>>>> -- >>>>> What most experimenters take for granted before they begin their >>>>> experiments is infinitely more interesting than any results to which their >>>>> experiments lead. >>>>> -- Norbert Wiener >>>>> >>>>> https://www.cse.buffalo.edu/~knepley/ >>>>> >>>>> >>>> >>> >>> -- >>> What most experimenters take for granted before they begin their >>> experiments is infinitely more interesting than any results to which their >>> experiments lead. >>> -- Norbert Wiener >>> >>> https://www.cse.buffalo.edu/~knepley/ >>> >>> >> -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From mi.mike1021 at gmail.com Tue Oct 11 17:04:47 2022 From: mi.mike1021 at gmail.com (Mike Michell) Date: Wed, 12 Oct 2022 07:04:47 +0900 Subject: [petsc-users] DMLocalToLocal with DMPlex in Fortran In-Reply-To: References: Message-ID: Thank you for the reply and checking. Indeed, it seems that local-to-local halo is still not implemented for DMPlex. But I believe this is a very required feature for large 3D simulations with DMPlex. Would you mind if I ask for any estimated timeline to fix this issue and put it on the official version of PETSc? If I remember correctly, we had a similar discussion a few months ago. Thanks, Mike On Mon, Oct 10, 2022 at 10:41 PM Mike Michell wrote: > >> Hi, I was wondering if there is any comment on the example file that I >> can refer to. 
>> > > I see the problem. Local2Local is not implemented for Plex. I thought we > had this automated, but it was only > coded for DMDA. It is a fairly mechanical transformation of the > Global2Local, just remapping indices, but it > will take some time since there is a lot of backlog this semester. > > I have fixed the error message so now it is obvious what the problem is. > > Thanks, > > Matt > > >> Thanks, >> Mike >> >> >>> Thank you for the reply. >>> Sure, a short example code is attached here with a square box mesh and a >>> run script. >>> Inside the source, you may find two versions of halo exchange; one is >>> for local to global (Version-1) and another one is for local to local >>> (Version-2), which is not working in my case. >>> In the output.vtu, you will see the halo exchanged vector resolved to >>> each vertex with (myrank + 1), so if the code is running with 2procs, at >>> the parallel boundary, you will see 3. In this example, there is no ghost >>> layer. >>> >>> Thanks, >>> Mike >>> >>> >>>> On Sat, Oct 1, 2022 at 8:51 PM Mike Michell >>>> wrote: >>>> >>>>> Thank you for the reply. There is that file in >>>>> src/dm/interface/ftn-auto/ for me, instead of the path you mentioned. >>>>> >>>>> After "make allfortranstubs" was done and, PETSc reconfigured and >>>>> reinstalled. >>>>> >>>>> However, I still have the same problem at the line in which >>>>> DMLocalToLocalBegin() is used. What I am doing to setup halo exchange is as >>>>> follows; >>>>> - declare DMPlex >>>>> - PetscSectionCreate() >>>>> - PetscSectionSetNumFields() >>>>> - PetscSectionSetFieldComponents() >>>>> - PetscSectionSetChart() >>>>> - do loop over dofs: PetscSectionSetDof() and PetscSectionSetFieldDof() >>>>> - PetscSectionSetUp() >>>>> - DMSetLocalSection() >>>>> - PetscSectionDestroy() >>>>> - DMGetSectionSF() >>>>> - PetscSFSetUp() >>>>> >>>>> Then, the halo exchange is performed as follows; >>>>> - DMGetLocalVector() >>>>> - Fill the local vector >>>>> - DMLocalToLocalBegin() --(*) >>>>> - DMLocalToLocalEnd() >>>>> - DMRestoreLocalVector() >>>>> >>>>> Then, the code crashes at (*). >>>>> >>>> >>>> Can you send something I can run? Then I will find the problem and fix >>>> it. >>>> >>>> Thanks, >>>> >>>> Matt >>>> >>>> >>>>> Previously(at the time PETSc did not support LocalToLocal for DMPlex >>>>> in Fortran), the part above, "DMLocalToLocalBegin() and >>>>> DMLocalToLocalEnd()", consisted of; >>>>> - DMLocalToGlobalBegin() >>>>> - DMLocalToGlobalEnd() >>>>> - DMGlobalToLocalBegin() >>>>> - DMGlobalToLocalEnd() >>>>> and it worked okay. >>>>> >>>>> I am unclear which part is causing the problem. Shall I define the >>>>> PetscSection and PetscSF in different ways in case of Local to Local Halo >>>>> exchange? >>>>> Any comment will be appreciated. >>>>> >>>>> Thanks, >>>>> Mike >>>>> >>>>> >>>>> >>>>>> On Fri, Sep 30, 2022 at 4:14 PM Mike Michell >>>>>> wrote: >>>>>> >>>>>>> Hi, >>>>>>> >>>>>>> As a follow-up to this email thread, >>>>>>> https://www.mail-archive.com/petsc-users at mcs.anl.gov/msg44070.html >>>>>>> >>>>>>> Are DMLocalToLocalBegin() and DMLocalToLocalEnd() really available >>>>>>> for DMPlex with Fortran on the latest version of PETSc (3.17.99 from >>>>>>> GitLab)? Matt commented that the Fortran bindings were updated so that >>>>>>> those functions must be available in the latest version of PETSc, however, >>>>>>> it seems still they are not working from my test with DMPlex in Fortran. >>>>>>> Can anyone provide some comments? Probably I am missing some mandatory >>>>>>> header file? 
Currently, I have headers; >>>>>>> >>>>>>> #include "petsc/finclude/petscvec.h" >>>>>>> #include "petsc/finclude/petscdmplex.h" >>>>>>> #include "petsc/finclude/petscdmlabel.h" >>>>>>> #include "petsc/finclude/petscdm.h" >>>>>>> >>>>>> >>>>>> The source for these functions is in >>>>>> >>>>>> src/dm/ftn-auto/dmf.c >>>>>> >>>>>> Is it there for you? If not, you can run >>>>>> >>>>>> make allfortranstubs >>>>>> >>>>>> Fortran functions are not declared, so the header should not matter >>>>>> for compilation, just the libraries for linking. >>>>>> >>>>>> Thanks, >>>>>> >>>>>> Matt >>>>>> >>>>>> >>>>>>> Thanks, >>>>>>> Mike >>>>>>> >>>>>> >>>>>> >>>>>> -- >>>>>> What most experimenters take for granted before they begin their >>>>>> experiments is infinitely more interesting than any results to which their >>>>>> experiments lead. >>>>>> -- Norbert Wiener >>>>>> >>>>>> https://www.cse.buffalo.edu/~knepley/ >>>>>> >>>>>> >>>>> >>>> >>>> -- >>>> What most experimenters take for granted before they begin their >>>> experiments is infinitely more interesting than any results to which their >>>> experiments lead. >>>> -- Norbert Wiener >>>> >>>> https://www.cse.buffalo.edu/~knepley/ >>>> >>>> >>> > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > > https://www.cse.buffalo.edu/~knepley/ > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From zjorti at lanl.gov Tue Oct 11 18:45:20 2022 From: zjorti at lanl.gov (Jorti, Zakariae) Date: Tue, 11 Oct 2022 23:45:20 +0000 Subject: [petsc-users] VecScatter Message-ID: <8e43be06265f4feda18dc31de49471f1@lanl.gov> Hello, I have a code that handles a PETSc Vec on many procs and I would like to use VecScatterCreateToZero to have all elements of this vector on a single proc, call VecGetArrayRead on this proc to get the corresponding array to carry out some diagnostics. Unfortunately, what I noticed is that the ordering of the initial Vec is not preserved after the VecScatterCreateToZero call. Is there a way to have the same ordering for both Vecs? You will find below a minimal example that demonstrates the issue. 
Many thanks, Zakariae ------------------------------------------------------------- static char help[] = "Demonstrates ordering change after VecScatterCreateToZero call.\n\n"; #include #include #include int main(int argc,char **argv) { Vec xy; DM da; PetscErrorCode ierr; PetscInt m = 11, n = 11, dof = 2; PetscMPIInt rank; DMDACoor2d **coors; ierr = PetscInitialize(&argc,&argv,(char*)0,help);if (ierr) return ierr; ierr = DMDACreate2d(PETSC_COMM_WORLD, DM_BOUNDARY_NONE, DM_BOUNDARY_NONE,DMDA_STENCIL_BOX,m,n,PETSC_DECIDE,PETSC_DECIDE,dof,1,0,0,&da);CHKERRQ(ierr); ierr = DMSetFromOptions(da);CHKERRQ(ierr); ierr = DMSetUp(da);CHKERRQ(ierr); ierr = DMDASetUniformCoordinates(da,0.0,1.0,0.0,1.0,0.0,1.0);CHKERRQ(ierr); ierr = DMGetCoordinates(da,&xy);CHKERRQ(ierr); PetscInt i, j, ixs, ixm, iys, iym; MPI_Comm_rank(PETSC_COMM_WORLD, &rank); DMDAGetCorners(da, &ixs, &iys, 0, &ixm, &iym, 0); DMDAVecGetArray(da, xy, &coors); for (j = iys; j < iys + iym; j++) { for (i = ixs; i < ixs + ixm; i++) { PetscPrintf(PETSC_COMM_SELF, "rank=%d, %d, %d, (%g, %g)\n",rank, i, j,coors[j][i].x,coors[j][i].y); } } DMDAVecRestoreArray(da, xy, &coors); VecScatter scat; Vec Xseq; const PetscScalar *array; /* create scater to zero */ VecScatterCreateToZero(xy, &scat, &Xseq); VecScatterBegin(scat, xy, Xseq, INSERT_VALUES, SCATTER_FORWARD); VecScatterEnd(scat, xy, Xseq, INSERT_VALUES, SCATTER_FORWARD); if (rank == 0) { PetscInt sizeX; VecGetSize(Xseq, &sizeX); PetscPrintf(PETSC_COMM_SELF,"The size of Xseq is %d, and the grid size is %d\n",sizeX,11*11); VecGetArrayRead(Xseq, &array); for (j = 0; j < 11; j++) { for (i = 0; i < 11; i++) { PetscPrintf(PETSC_COMM_SELF, "%d,%d, (%g,%g)\n", i, j, (double)array[2*(j*11+i)], (double)array[1+2*(j*11+i)]); } } VecRestoreArrayRead(Xseq, &array); } /* Free work space. All PETSc objects should be destroyed when they are no longer needed. */ ierr = DMDestroy(&da);CHKERRQ(ierr); ierr = PetscFinalize(); return ierr; } -------------- next part -------------- An HTML attachment was scrubbed... URL: From bsmith at petsc.dev Tue Oct 11 19:05:55 2022 From: bsmith at petsc.dev (Barry Smith) Date: Tue, 11 Oct 2022 20:05:55 -0400 Subject: [petsc-users] VecScatter In-Reply-To: <8e43be06265f4feda18dc31de49471f1@lanl.gov> References: <8e43be06265f4feda18dc31de49471f1@lanl.gov> Message-ID: <45C63138-FD71-4C0C-9FBB-9DE62B0D1B6B@petsc.dev> You need to first convert the Vec to the "natural" ordering and then bring that vector down to one rank. Something like DMDACreateNaturalVector(dm,&n); DMDAGlobalToNaturalBegin/End(dm, n); > VecScatterCreateToZero(n, &scat, &Xseq); > VecScatterBegin(scat, n Xseq, INSERT_VALUES, SCATTER_FORWARD); > VecScatterEnd(scat, n, Xseq, INSERT_VALUES, SCATTER_FORWARD); > On Oct 11, 2022, at 7:45 PM, Jorti, Zakariae via petsc-users wrote: > > Hello, > > I have a code that handles a PETSc Vec on many procs and I would like to use VecScatterCreateToZero to have all elements of this vector on a single proc, call VecGetArrayRead on this proc to get the corresponding array to carry out some diagnostics. > Unfortunately, what I noticed is that the ordering of the initial Vec is not preserved after the VecScatterCreateToZero call. Is there a way to have the same ordering for both Vecs? > You will find below a minimal example that demonstrates the issue. 
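(Putting Barry's suggestion together as one fragment, a sketch only: da and xy are the DMDA and its coordinate vector from the example quoted here. The coordinate DM is queried so the layout matches the coordinate vector even when the DMDA's dof differs from the spatial dimension.)

DM         cda;
Vec        natural, Xseq;
VecScatter scat;

PetscCall(DMGetCoordinateDM(da, &cda));
/* convert to the natural ordering first, so the coordinates of node (i,j)
   land at entries 2*(j*m+i) and 2*(j*m+i)+1, matching the indexing in the
   original rank-0 loop */
PetscCall(DMDACreateNaturalVector(cda, &natural));
PetscCall(DMDAGlobalToNaturalBegin(cda, xy, INSERT_VALUES, natural));
PetscCall(DMDAGlobalToNaturalEnd(cda, xy, INSERT_VALUES, natural));
/* then gather the natural-ordered vector onto rank 0 */
PetscCall(VecScatterCreateToZero(natural, &scat, &Xseq));
PetscCall(VecScatterBegin(scat, natural, Xseq, INSERT_VALUES, SCATTER_FORWARD));
PetscCall(VecScatterEnd(scat, natural, Xseq, INSERT_VALUES, SCATTER_FORWARD));
/* on rank 0, VecGetArrayRead(Xseq, ...) now sees the expected ordering */
PetscCall(VecScatterDestroy(&scat));
PetscCall(VecDestroy(&natural));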
> Many thanks, > > Zakariae > > ------------------------------------------------------------- > > static char help[] = "Demonstrates ordering change after VecScatterCreateToZero call.\n\n"; > > > #include > #include > #include > > int main(int argc,char **argv) > { > Vec xy; > DM da; > PetscErrorCode ierr; > PetscInt m = 11, n = 11, dof = 2; > PetscMPIInt rank; > DMDACoor2d **coors; > > ierr = PetscInitialize(&argc,&argv,(char*)0,help);if (ierr) return ierr; > ierr = DMDACreate2d(PETSC_COMM_WORLD, DM_BOUNDARY_NONE, DM_BOUNDARY_NONE,DMDA_STENCIL_BOX,m,n,PETSC_DECIDE,PETSC_DECIDE,dof,1,0,0,&da);CHKERRQ(ierr); > ierr = DMSetFromOptions(da);CHKERRQ(ierr); > ierr = DMSetUp(da);CHKERRQ(ierr); > ierr = DMDASetUniformCoordinates(da,0.0,1.0,0.0,1.0,0.0,1.0);CHKERRQ(ierr); > ierr = DMGetCoordinates(da,&xy);CHKERRQ(ierr); > > PetscInt i, j, ixs, ixm, iys, iym; > MPI_Comm_rank(PETSC_COMM_WORLD, &rank); > > DMDAGetCorners(da, &ixs, &iys, 0, &ixm, &iym, 0); > DMDAVecGetArray(da, xy, &coors); > for (j = iys; j < iys + iym; j++) { > for (i = ixs; i < ixs + ixm; i++) { > PetscPrintf(PETSC_COMM_SELF, "rank=%d, %d, %d, (%g, %g)\n",rank, > i, j,coors[j][i].x,coors[j][i].y); > } > } > DMDAVecRestoreArray(da, xy, &coors); > > VecScatter scat; > Vec Xseq; > const PetscScalar *array; > > /* create scater to zero */ > VecScatterCreateToZero(xy, &scat, &Xseq); > VecScatterBegin(scat, xy, Xseq, INSERT_VALUES, SCATTER_FORWARD); > VecScatterEnd(scat, xy, Xseq, INSERT_VALUES, SCATTER_FORWARD); > > if (rank == 0) { > PetscInt sizeX; > VecGetSize(Xseq, &sizeX); > PetscPrintf(PETSC_COMM_SELF,"The size of Xseq is %d, and the grid size is %d\n",sizeX,11*11); > VecGetArrayRead(Xseq, &array); > > for (j = 0; j < 11; j++) { > for (i = 0; i < 11; i++) { > PetscPrintf(PETSC_COMM_SELF, "%d,%d, (%g,%g)\n", i, j, (double)array[2*(j*11+i)], (double)array[1+2*(j*11+i)]); > } > } > VecRestoreArrayRead(Xseq, &array); > } > > /* > Free work space. All PETSc objects should be destroyed when they > are no longer needed. > */ > ierr = DMDestroy(&da);CHKERRQ(ierr); > ierr = PetscFinalize(); > return ierr; > } -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Wed Oct 12 09:44:26 2022 From: knepley at gmail.com (Matthew Knepley) Date: Wed, 12 Oct 2022 10:44:26 -0400 Subject: [petsc-users] DMLocalToLocal with DMPlex in Fortran In-Reply-To: References: Message-ID: On Tue, Oct 11, 2022 at 6:04 PM Mike Michell wrote: > Thank you for the reply and checking. > Indeed, it seems that local-to-local halo is still not implemented for > DMPlex. But I believe this is a very required feature for large 3D > simulations with DMPlex. > It is possible that it will make a difference. An explicit code, running completely from local vectors, which assembles its own residuals without mapping to global vectors, could probably realize some gain. It would be interesting to see the numbers. > Would you mind if I ask for any estimated timeline to fix this issue and > put it on the official version of PETSc? If I remember correctly, we had a > similar discussion a few months ago. > I cannot do it until November since we have a big review coming up. Thanks, Matt > Thanks, > Mike > > On Mon, Oct 10, 2022 at 10:41 PM Mike Michell >> wrote: >> >>> Hi, I was wondering if there is any comment on the example file that I >>> can refer to. >>> >> >> I see the problem. Local2Local is not implemented for Plex. I thought we >> had this automated, but it was only >> coded for DMDA. 
It is a fairly mechanical transformation of the >> Global2Local, just remapping indices, but it >> will take some time since there is a lot of backlog this semester. >> >> I have fixed the error message so now it is obvious what the problem is. >> >> Thanks, >> >> Matt >> >> >>> Thanks, >>> Mike >>> >>> >>>> Thank you for the reply. >>>> Sure, a short example code is attached here with a square box mesh and >>>> a run script. >>>> Inside the source, you may find two versions of halo exchange; one is >>>> for local to global (Version-1) and another one is for local to local >>>> (Version-2), which is not working in my case. >>>> In the output.vtu, you will see the halo exchanged vector resolved to >>>> each vertex with (myrank + 1), so if the code is running with 2procs, at >>>> the parallel boundary, you will see 3. In this example, there is no ghost >>>> layer. >>>> >>>> Thanks, >>>> Mike >>>> >>>> >>>>> On Sat, Oct 1, 2022 at 8:51 PM Mike Michell >>>>> wrote: >>>>> >>>>>> Thank you for the reply. There is that file in >>>>>> src/dm/interface/ftn-auto/ for me, instead of the path you mentioned. >>>>>> >>>>>> After "make allfortranstubs" was done and, PETSc reconfigured and >>>>>> reinstalled. >>>>>> >>>>>> However, I still have the same problem at the line in which >>>>>> DMLocalToLocalBegin() is used. What I am doing to setup halo exchange is as >>>>>> follows; >>>>>> - declare DMPlex >>>>>> - PetscSectionCreate() >>>>>> - PetscSectionSetNumFields() >>>>>> - PetscSectionSetFieldComponents() >>>>>> - PetscSectionSetChart() >>>>>> - do loop over dofs: PetscSectionSetDof() and >>>>>> PetscSectionSetFieldDof() >>>>>> - PetscSectionSetUp() >>>>>> - DMSetLocalSection() >>>>>> - PetscSectionDestroy() >>>>>> - DMGetSectionSF() >>>>>> - PetscSFSetUp() >>>>>> >>>>>> Then, the halo exchange is performed as follows; >>>>>> - DMGetLocalVector() >>>>>> - Fill the local vector >>>>>> - DMLocalToLocalBegin() --(*) >>>>>> - DMLocalToLocalEnd() >>>>>> - DMRestoreLocalVector() >>>>>> >>>>>> Then, the code crashes at (*). >>>>>> >>>>> >>>>> Can you send something I can run? Then I will find the problem and fix >>>>> it. >>>>> >>>>> Thanks, >>>>> >>>>> Matt >>>>> >>>>> >>>>>> Previously(at the time PETSc did not support LocalToLocal for DMPlex >>>>>> in Fortran), the part above, "DMLocalToLocalBegin() and >>>>>> DMLocalToLocalEnd()", consisted of; >>>>>> - DMLocalToGlobalBegin() >>>>>> - DMLocalToGlobalEnd() >>>>>> - DMGlobalToLocalBegin() >>>>>> - DMGlobalToLocalEnd() >>>>>> and it worked okay. >>>>>> >>>>>> I am unclear which part is causing the problem. Shall I define the >>>>>> PetscSection and PetscSF in different ways in case of Local to Local Halo >>>>>> exchange? >>>>>> Any comment will be appreciated. >>>>>> >>>>>> Thanks, >>>>>> Mike >>>>>> >>>>>> >>>>>> >>>>>>> On Fri, Sep 30, 2022 at 4:14 PM Mike Michell >>>>>>> wrote: >>>>>>> >>>>>>>> Hi, >>>>>>>> >>>>>>>> As a follow-up to this email thread, >>>>>>>> https://www.mail-archive.com/petsc-users at mcs.anl.gov/msg44070.html >>>>>>>> >>>>>>>> Are DMLocalToLocalBegin() and DMLocalToLocalEnd() really available >>>>>>>> for DMPlex with Fortran on the latest version of PETSc (3.17.99 from >>>>>>>> GitLab)? Matt commented that the Fortran bindings were updated so that >>>>>>>> those functions must be available in the latest version of PETSc, however, >>>>>>>> it seems still they are not working from my test with DMPlex in Fortran. >>>>>>>> Can anyone provide some comments? Probably I am missing some mandatory >>>>>>>> header file? 
Currently, I have headers; >>>>>>>> >>>>>>>> #include "petsc/finclude/petscvec.h" >>>>>>>> #include "petsc/finclude/petscdmplex.h" >>>>>>>> #include "petsc/finclude/petscdmlabel.h" >>>>>>>> #include "petsc/finclude/petscdm.h" >>>>>>>> >>>>>>> >>>>>>> The source for these functions is in >>>>>>> >>>>>>> src/dm/ftn-auto/dmf.c >>>>>>> >>>>>>> Is it there for you? If not, you can run >>>>>>> >>>>>>> make allfortranstubs >>>>>>> >>>>>>> Fortran functions are not declared, so the header should not matter >>>>>>> for compilation, just the libraries for linking. >>>>>>> >>>>>>> Thanks, >>>>>>> >>>>>>> Matt >>>>>>> >>>>>>> >>>>>>>> Thanks, >>>>>>>> Mike >>>>>>>> >>>>>>> >>>>>>> >>>>>>> -- >>>>>>> What most experimenters take for granted before they begin their >>>>>>> experiments is infinitely more interesting than any results to which their >>>>>>> experiments lead. >>>>>>> -- Norbert Wiener >>>>>>> >>>>>>> https://www.cse.buffalo.edu/~knepley/ >>>>>>> >>>>>>> >>>>>> >>>>> >>>>> -- >>>>> What most experimenters take for granted before they begin their >>>>> experiments is infinitely more interesting than any results to which their >>>>> experiments lead. >>>>> -- Norbert Wiener >>>>> >>>>> https://www.cse.buffalo.edu/~knepley/ >>>>> >>>>> >>>> >> >> -- >> What most experimenters take for granted before they begin their >> experiments is infinitely more interesting than any results to which their >> experiments lead. >> -- Norbert Wiener >> >> https://www.cse.buffalo.edu/~knepley/ >> >> > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From tt73 at njit.edu Wed Oct 12 10:03:27 2022 From: tt73 at njit.edu (Takahashi, Tadanaga) Date: Wed, 12 Oct 2022 11:03:27 -0400 Subject: [petsc-users] How to get total subsnes iterations Message-ID: Hi. I am using the snes nasm for the global solver and snes newtonls for the local subdomain solver. I am trying to get the total number of Newton iterations for just one subdomain. I've tried: SNESNASMGetSNES(snes,0,&subsnes); SNESSolve(snes,NULL,u_initial); SNESGetNumberFunctionEvals(subsnes,&Newt_its); but this just gets me the number of Newton iterations just on the final nasm iteration. If I understand correctly, the information in the subsnes is repeatedly destroyed while the SNESSolve is running. Is there any way to extract the total subsnes iterations after the SNESSolve? If not, how would I extract the information? -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Wed Oct 12 10:14:59 2022 From: knepley at gmail.com (Matthew Knepley) Date: Wed, 12 Oct 2022 11:14:59 -0400 Subject: [petsc-users] How to get total subsnes iterations In-Reply-To: References: Message-ID: On Wed, Oct 12, 2022 at 11:04 AM Takahashi, Tadanaga wrote: > Hi. I am using the snes nasm for the global solver and snes newtonls for > the local subdomain solver. I am trying to get the total number of Newton > iterations for just one subdomain. I've tried: > > SNESNASMGetSNES(snes,0,&subsnes); > SNESSolve(snes,NULL,u_initial); > SNESGetNumberFunctionEvals(subsnes,&Newt_its); > Here I think you want https://petsc.org/main/docs/manualpages/SNES/SNESGetIterationNumber/ > but this just gets me the number of Newton iterations just on the final > nasm iteration. 
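Until such a counter exists, one possible workaround is to read SNESGetIterationNumber() from the subdomain SNES inside a monitor set on the outer NASM solve and accumulate it by hand. A rough sketch (SubCountCtx and CountSubIts are made-up names; the monitor call at outer iteration 0 happens before any subdomain solve, so it is skipped):

  typedef struct {
    SNES     sub;    /* subdomain SNES from SNESNASMGetSNES() */
    PetscInt total;  /* accumulated inner Newton iterations   */
  } SubCountCtx;

  PetscErrorCode CountSubIts(SNES snes, PetscInt it, PetscReal fnorm, void *ctx)
  {
    SubCountCtx *c = (SubCountCtx *)ctx;
    PetscInt     its;

    PetscFunctionBeginUser;
    /* iterations of the most recent subdomain solve */
    PetscCall(SNESGetIterationNumber(c->sub, &its));
    if (it > 0) c->total += its;
    PetscFunctionReturn(0);
  }

  /* usage */
  SubCountCtx ctx = {NULL, 0};
  SNESNASMGetSNES(snes, 0, &ctx.sub);
  SNESMonitorSet(snes, CountSubIts, &ctx, NULL);
  SNESSolve(snes, NULL, u_initial);
  /* ctx.total now holds the summed Newton iterations of subdomain 0 */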
If I understand correctly, the information in the subsnes > is repeatedly destroyed while the SNESSolve is running. Is there any way to > extract the total subsnes iterations after the SNESSolve? If not, how would > I extract the information? > You are correct. This is an oversight by us. We need to keep track of this in the same way we track the number of linear iterations in SNES. Can you submit an issue for this? I can do it in November. Thanks, Matt -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From mi.mike1021 at gmail.com Wed Oct 12 11:02:18 2022 From: mi.mike1021 at gmail.com (Mike Michell) Date: Thu, 13 Oct 2022 01:02:18 +0900 Subject: [petsc-users] DMLocalToLocal with DMPlex in Fortran In-Reply-To: References: Message-ID: Thanks a lot for the reply. It is possible that it will make a difference. An explicit code, running completely from local vectors, which assembles its own residuals without mapping to global vectors, could probably realize some gain. It would be interesting to see the numbers. => Agree with this. Even with an implicit scheme, I need the halo exchange of local-to-local while constructing the local matrix object (for example, summation over control volume components). Thus I expect some computational gain with this. In December, will try to remind you via this email chain. Thanks, Mike > On Tue, Oct 11, 2022 at 6:04 PM Mike Michell > wrote: > >> Thank you for the reply and checking. >> Indeed, it seems that local-to-local halo is still not implemented for >> DMPlex. But I believe this is a very required feature for large 3D >> simulations with DMPlex. >> > > It is possible that it will make a difference. An explicit code, running > completely from local vectors, which assembles its own residuals without > mapping to global vectors, > could probably realize some gain. It would be interesting to see the > numbers. > > >> Would you mind if I ask for any estimated timeline to fix this issue and >> put it on the official version of PETSc? If I remember correctly, we had a >> similar discussion a few months ago. >> > > I cannot do it until November since we have a big review coming up. > > Thanks, > > Matt > > >> Thanks, >> Mike >> >> On Mon, Oct 10, 2022 at 10:41 PM Mike Michell >>> wrote: >>> >>>> Hi, I was wondering if there is any comment on the example file that I >>>> can refer to. >>>> >>> >>> I see the problem. Local2Local is not implemented for Plex. I thought we >>> had this automated, but it was only >>> coded for DMDA. It is a fairly mechanical transformation of the >>> Global2Local, just remapping indices, but it >>> will take some time since there is a lot of backlog this semester. >>> >>> I have fixed the error message so now it is obvious what the problem is. >>> >>> Thanks, >>> >>> Matt >>> >>> >>>> Thanks, >>>> Mike >>>> >>>> >>>>> Thank you for the reply. >>>>> Sure, a short example code is attached here with a square box mesh and >>>>> a run script. >>>>> Inside the source, you may find two versions of halo exchange; one is >>>>> for local to global (Version-1) and another one is for local to local >>>>> (Version-2), which is not working in my case. 
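For reference, the Version-1 round trip mentioned just above (local -> global -> local) looks like this in C, assuming a DMPlex dm and a local vector lvec whose section is already set; the Fortran calls are the same apart from the trailing error argument:

  Vec gvec;

  /* push owned values into a global vector, then pull them back with the ghost entries filled */
  DMGetGlobalVector(dm, &gvec);
  DMLocalToGlobalBegin(dm, lvec, INSERT_VALUES, gvec);
  DMLocalToGlobalEnd(dm, lvec, INSERT_VALUES, gvec);
  DMGlobalToLocalBegin(dm, gvec, INSERT_VALUES, lvec);
  DMGlobalToLocalEnd(dm, gvec, INSERT_VALUES, lvec);
  DMRestoreGlobalVector(dm, &gvec);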
>>>>> In the output.vtu, you will see the halo exchanged vector resolved to >>>>> each vertex with (myrank + 1), so if the code is running with 2procs, at >>>>> the parallel boundary, you will see 3. In this example, there is no ghost >>>>> layer. >>>>> >>>>> Thanks, >>>>> Mike >>>>> >>>>> >>>>>> On Sat, Oct 1, 2022 at 8:51 PM Mike Michell >>>>>> wrote: >>>>>> >>>>>>> Thank you for the reply. There is that file in >>>>>>> src/dm/interface/ftn-auto/ for me, instead of the path you mentioned. >>>>>>> >>>>>>> After "make allfortranstubs" was done and, PETSc reconfigured and >>>>>>> reinstalled. >>>>>>> >>>>>>> However, I still have the same problem at the line in which >>>>>>> DMLocalToLocalBegin() is used. What I am doing to setup halo exchange is as >>>>>>> follows; >>>>>>> - declare DMPlex >>>>>>> - PetscSectionCreate() >>>>>>> - PetscSectionSetNumFields() >>>>>>> - PetscSectionSetFieldComponents() >>>>>>> - PetscSectionSetChart() >>>>>>> - do loop over dofs: PetscSectionSetDof() and >>>>>>> PetscSectionSetFieldDof() >>>>>>> - PetscSectionSetUp() >>>>>>> - DMSetLocalSection() >>>>>>> - PetscSectionDestroy() >>>>>>> - DMGetSectionSF() >>>>>>> - PetscSFSetUp() >>>>>>> >>>>>>> Then, the halo exchange is performed as follows; >>>>>>> - DMGetLocalVector() >>>>>>> - Fill the local vector >>>>>>> - DMLocalToLocalBegin() --(*) >>>>>>> - DMLocalToLocalEnd() >>>>>>> - DMRestoreLocalVector() >>>>>>> >>>>>>> Then, the code crashes at (*). >>>>>>> >>>>>> >>>>>> Can you send something I can run? Then I will find the problem and >>>>>> fix it. >>>>>> >>>>>> Thanks, >>>>>> >>>>>> Matt >>>>>> >>>>>> >>>>>>> Previously(at the time PETSc did not support LocalToLocal for DMPlex >>>>>>> in Fortran), the part above, "DMLocalToLocalBegin() and >>>>>>> DMLocalToLocalEnd()", consisted of; >>>>>>> - DMLocalToGlobalBegin() >>>>>>> - DMLocalToGlobalEnd() >>>>>>> - DMGlobalToLocalBegin() >>>>>>> - DMGlobalToLocalEnd() >>>>>>> and it worked okay. >>>>>>> >>>>>>> I am unclear which part is causing the problem. Shall I define the >>>>>>> PetscSection and PetscSF in different ways in case of Local to Local Halo >>>>>>> exchange? >>>>>>> Any comment will be appreciated. >>>>>>> >>>>>>> Thanks, >>>>>>> Mike >>>>>>> >>>>>>> >>>>>>> >>>>>>>> On Fri, Sep 30, 2022 at 4:14 PM Mike Michell >>>>>>>> wrote: >>>>>>>> >>>>>>>>> Hi, >>>>>>>>> >>>>>>>>> As a follow-up to this email thread, >>>>>>>>> https://www.mail-archive.com/petsc-users at mcs.anl.gov/msg44070.html >>>>>>>>> >>>>>>>>> Are DMLocalToLocalBegin() and DMLocalToLocalEnd() really available >>>>>>>>> for DMPlex with Fortran on the latest version of PETSc (3.17.99 from >>>>>>>>> GitLab)? Matt commented that the Fortran bindings were updated so that >>>>>>>>> those functions must be available in the latest version of PETSc, however, >>>>>>>>> it seems still they are not working from my test with DMPlex in Fortran. >>>>>>>>> Can anyone provide some comments? Probably I am missing some mandatory >>>>>>>>> header file? Currently, I have headers; >>>>>>>>> >>>>>>>>> #include "petsc/finclude/petscvec.h" >>>>>>>>> #include "petsc/finclude/petscdmplex.h" >>>>>>>>> #include "petsc/finclude/petscdmlabel.h" >>>>>>>>> #include "petsc/finclude/petscdm.h" >>>>>>>>> >>>>>>>> >>>>>>>> The source for these functions is in >>>>>>>> >>>>>>>> src/dm/ftn-auto/dmf.c >>>>>>>> >>>>>>>> Is it there for you? 
If not, you can run >>>>>>>> >>>>>>>> make allfortranstubs >>>>>>>> >>>>>>>> Fortran functions are not declared, so the header should not matter >>>>>>>> for compilation, just the libraries for linking. >>>>>>>> >>>>>>>> Thanks, >>>>>>>> >>>>>>>> Matt >>>>>>>> >>>>>>>> >>>>>>>>> Thanks, >>>>>>>>> Mike >>>>>>>>> >>>>>>>> >>>>>>>> >>>>>>>> -- >>>>>>>> What most experimenters take for granted before they begin their >>>>>>>> experiments is infinitely more interesting than any results to which their >>>>>>>> experiments lead. >>>>>>>> -- Norbert Wiener >>>>>>>> >>>>>>>> https://www.cse.buffalo.edu/~knepley/ >>>>>>>> >>>>>>>> >>>>>>> >>>>>> >>>>>> -- >>>>>> What most experimenters take for granted before they begin their >>>>>> experiments is infinitely more interesting than any results to which their >>>>>> experiments lead. >>>>>> -- Norbert Wiener >>>>>> >>>>>> https://www.cse.buffalo.edu/~knepley/ >>>>>> >>>>>> >>>>> >>> >>> -- >>> What most experimenters take for granted before they begin their >>> experiments is infinitely more interesting than any results to which their >>> experiments lead. >>> -- Norbert Wiener >>> >>> https://www.cse.buffalo.edu/~knepley/ >>> >>> >> > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > > https://www.cse.buffalo.edu/~knepley/ > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From aduarteg at utexas.edu Wed Oct 12 12:37:58 2022 From: aduarteg at utexas.edu (Alfredo J Duarte Gomez) Date: Wed, 12 Oct 2022 12:37:58 -0500 Subject: [petsc-users] Laplace Equation preconditioner Message-ID: Good morning PETSC users, I have a current solver that requires the solution of a Laplace equation to be reused for all future time steps. The configuration is axisymmetric with Dirichlet BCs at the top and bottom boundaries, and Zero Neumman conditions at the axis and far field. The grid is curvilinear and structured. So far I have been using PCHYPRE boomeramg as the preconditioner, which often works well, but I have also experienced DIVERGED_BREAKDOWN on many occasions. When I use direct solver PCMUMPS it always produces a satisfactory answer, which gives me confidence that the solution exists for the given grid in all these configurations where boomeramg fails. I am looking for recommendations on other preconditioners to try in this problem that can produce a solution faster than PCMUMPS, or recommendations on which parameters to adjust on boomeramg. Thank you, -- Alfredo Duarte Graduate Research Assistant The University of Texas at Austin -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Wed Oct 12 13:00:56 2022 From: knepley at gmail.com (Matthew Knepley) Date: Wed, 12 Oct 2022 14:00:56 -0400 Subject: [petsc-users] Laplace Equation preconditioner In-Reply-To: References: Message-ID: On Wed, Oct 12, 2022 at 1:38 PM Alfredo J Duarte Gomez wrote: > Good morning PETSC users, > > I have a current solver that requires the solution of a Laplace equation > to be reused for all future time steps. > > The configuration is axisymmetric with Dirichlet BCs at the top and bottom > boundaries, and Zero Neumman conditions at the axis and far field. The grid > is curvilinear and structured. > > So far I have been using PCHYPRE boomeramg as the preconditioner, which > often works well, but I have also experienced DIVERGED_BREAKDOWN on many > occasions. 
When I use direct solver PCMUMPS it always produces a > satisfactory answer, which gives me confidence that the solution exists for > the given grid in all these configurations where boomeramg fails. > I do not know why Hypre is breaking down. Did you try ML or GAMG? They are easier to diagnose I think. Thanks, Matt > I am looking for recommendations on other preconditioners to try in this > problem that can produce a solution faster than PCMUMPS, or recommendations > on which parameters to adjust on boomeramg. > > Thank you, > > > -- > Alfredo Duarte > Graduate Research Assistant > The University of Texas at Austin > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From bsmith at petsc.dev Wed Oct 12 13:32:43 2022 From: bsmith at petsc.dev (Barry Smith) Date: Wed, 12 Oct 2022 14:32:43 -0400 Subject: [petsc-users] Laplace Equation preconditioner In-Reply-To: References: Message-ID: What KSP are you using? DIVERGED_BREAKDOWN is very rare for KSPGMRES. If you are using one of its lesser cousins like bcgs you might consider switching to bcgsl or gmres. I assume because of boundary conditions or the discretization you do not have symmetric positive definite and thus cannot use CG. Barry > On Oct 12, 2022, at 2:00 PM, Matthew Knepley wrote: > > On Wed, Oct 12, 2022 at 1:38 PM Alfredo J Duarte Gomez > wrote: > Good morning PETSC users, > > I have a current solver that requires the solution of a Laplace equation to be reused for all future time steps. > > The configuration is axisymmetric with Dirichlet BCs at the top and bottom boundaries, and Zero Neumman conditions at the axis and far field. The grid is curvilinear and structured. > > So far I have been using PCHYPRE boomeramg as the preconditioner, which often works well, but I have also experienced DIVERGED_BREAKDOWN on many occasions. When I use direct solver PCMUMPS it always produces a satisfactory answer, which gives me confidence that the solution exists for the given grid in all these configurations where boomeramg fails. > > I do not know why Hypre is breaking down. Did you try ML or GAMG? They are easier to diagnose I think. > > Thanks, > > Matt > > I am looking for recommendations on other preconditioners to try in this problem that can produce a solution faster than PCMUMPS, or recommendations on which parameters to adjust on boomeramg. > > Thank you, > > > -- > Alfredo Duarte > Graduate Research Assistant > The University of Texas at Austin > > > -- > What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. > -- Norbert Wiener > > https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From psun at outlook.com Wed Oct 12 18:13:42 2022 From: psun at outlook.com (Peng Sun) Date: Wed, 12 Oct 2022 23:13:42 +0000 Subject: [petsc-users] Issue with single precision complex numbers in petsc4py Message-ID: Dear PETSc community, I have a question regarding the single precision complex numbers of petsc4py. I configured PETSc with the ?--with-scalar-type=complex --with-precision=single" option before compiling, but all the DA structures I created with petsc4py had double precision. 
Here is a minimum test code on Python 3.8/PETSc 3.12/petsc4py 3.12: both print commands show data type of complex128. Could anybody please help me? Thanks! import petsc4py import sys petsc4py.init(sys.argv) from petsc4py import PETSc da=PETSc.DA().create(sizes=[2,2,2],dof=1,stencil_type=0,stencil_width=1,boundary_type=1) da_1 = da.createGlobalVec() print(petsc4py.PETSc.ComplexType) print(da_1.getArray().dtype) Best regards, Peng Sun -------------- next part -------------- An HTML attachment was scrubbed... URL: From snailsoar at hotmail.com Thu Oct 13 04:52:59 2022 From: snailsoar at hotmail.com (feng wang) Date: Thu, 13 Oct 2022 09:52:59 +0000 Subject: [petsc-users] Slepc, shell matrix, parallel, halo exchange In-Reply-To: References: <53363D7B-CCBD-4DAB-924E-1D5D56975828@dsic.upv.es> <76162134-CDE9-42B9-8310-D9DD33D2F12D@dsic.upv.es> Message-ID: Hi Mat, Yes, you are right. I have tried both ways and they all work fine. the code snippet in previous post is fine. I had some issue with other parts of the code, that led to the unexpected results. Thanks, Feng ________________________________ From: Matthew Knepley Sent: 10 October 2022 15:52 To: feng wang Cc: Jose E. Roman ; petsc-users at mcs.anl.gov Subject: Re: [petsc-users] Slepc, shell matrix, parallel, halo exchange On Mon, Oct 10, 2022 at 11:42 AM feng wang > wrote: Hi Mat, Thanks for your reply. It seems I have to use "VecSetValues" to assign the values of my ghost vector "petsc_dcsv". and then call VecAssemblyBegin/End. If I do it this way, the ghost cells are exchanged correctly. This should only be true if you are modifying off-process values. If not, that does not seem right. Thanks, Matt Besides, I notice that, when I run my code sequentially or with multiple processors, the produced eigenvalues are similar, but the number of iterations are different to reach the specified "-eps_tol" and the relative residuals are also slightly different. Is this normal? I am using the default Krylov-Schur solver and double precision. Thanks, Feng ________________________________ From: Matthew Knepley > Sent: 09 October 2022 12:11 To: feng wang > Cc: Jose E. Roman >; petsc-users at mcs.anl.gov > Subject: Re: [petsc-users] Slepc, shell matrix, parallel, halo exchange On Fri, Oct 7, 2022 at 5:48 PM feng wang > wrote: Hi Mat, I've tried the suggested approach. The halo cells are not exchanged somehow. Below is how I do it, have I missed anything? I create a ghost vector petsc_dcsv and it is a data member of the class cFdDomain, which is a context of the shell matrix. PetscCall(VecCreateGhostBlock(*A_COMM_WORLD, blocksize, blocksize*nlocal, PETSC_DECIDE ,nghost, ighost, &petsc_dcsv)); blocksize and nv have the same value. nlocal is number of local cells and nghost is number of halo cells. ighost contains the ghost cell index. Below is how I compute a matrix-vector product with a shell matrix PetscErrorCode cFdDomain::mymult_slepc(Mat m ,Vec x, Vec y) { void *ctx; cFdDomain *myctx; PetscErrorCode ierr; MatShellGetContext(m, &ctx); myctx = (cFdDomain*)ctx; //matrix-vector product ierr = myctx->myfunc(x, y); CHKERRQ(ierr); ierr = 0; return ierr; } PetscErrorCode cFdDomain::myfunc(Vec in, Vec out) { //some declaration ierr = VecGetArray(petsc_dcsv,&array_g); CHKERRQ(ierr); ierr = VecGetArrayRead(in, &array); CHKERRQ(ierr); //assign in to petsc_dcsv, only local cells for(iv=0; iv> Sent: 21 September 2022 14:36 To: feng wang > Cc: Jose E. 
Roman >; petsc-users at mcs.anl.gov > Subject: Re: [petsc-users] Slepc, shell matrix, parallel, halo exchange On Wed, Sep 21, 2022 at 10:35 AM feng wang > wrote: Hi Jose, For your 2nd suggestion on halo exchange, I get the idea and roughly know how to do it, but there are some implementation details which I am not quite sure. If I understand it correctly, in MatMult(Mat m ,Vec x, Vec y), Vec x is a normal parallel vector and it does not contain halo values. Suppose I create an auxiliary ghost vector x_g, then I assign the values of x to x_g. The values of the halo for each partition will not be assigned at this stage. But If I call VecGhostUpdateBegin/End(x_g, INSERT_VALUES, SCATTER_FORWARD), this will fill the values of the halo cells of x_g for each partition. Then x_g has local and halo cells assigned correctly and I can use x_g to do my computation. Is this what you mean? Yes Matt Thanks, Feng ________________________________ From: Jose E. Roman > Sent: 21 September 2022 13:07 To: feng wang > Cc: Matthew Knepley >; petsc-users at mcs.anl.gov > Subject: Re: [petsc-users] Slepc, shell matrix, parallel, halo exchange > El 21 sept 2022, a las 14:47, feng wang > escribi?: > > Thanks Jose, I will try this and will come back to this thread if I have any issue. > > Besides, for EPSGetEigenpair, I guess each rank gets its portion of the eigenvector, and I need to put them together afterwards? Eigenvectors are stored in parallel vectors, which are used in subsequent parallel computation in most applications. If for some reason you need to gather them in a single MPI process you can use e.g. VecScatterCreateToZero() > > Thanks, > Feng > > From: Jose E. Roman > > Sent: 21 September 2022 12:34 > To: feng wang > > Cc: Matthew Knepley >; petsc-users at mcs.anl.gov > > Subject: Re: [petsc-users] Slepc, shell matrix, parallel, halo exchange > > If you define the MATOP_CREATE_VECS operation in your shell matrix so that it creates a ghost vector, then all vectors within EPS will be ghost vectors, including those that are received as arguments of MatMult(). Not sure if this will work. > > A simpler solution is that you store a ghost vector in the context of your shell matrix, and then in MatMult() you receive a regular parallel vector x, then update the ghost points using the auxiliary ghost vector, do the computation and store the result in the regular parallel vector y. > > Jose > > > > El 21 sept 2022, a las 14:09, feng wang > escribi?: > > > > Thanks for your reply. > > > > For GMRES, I create a ghost vector and give it to KSPSolve. For Slepc, it only takes the shell matrix for EPSSetOperators. Suppose the shell matrix of the eigensolver defines MatMult(Mat m ,Vec x, Vec y), how does it know Vec x is a ghost vector and how many ghost cells there are? > > > > Thanks, > > Feng > > From: Matthew Knepley > > > Sent: 21 September 2022 11:58 > > To: feng wang > > > Cc: petsc-users at mcs.anl.gov > > > Subject: Re: [petsc-users] Slepc, shell matrix, parallel, halo exchange > > > > On Wed, Sep 21, 2022 at 7:41 AM feng wang > wrote: > > Hello, > > > > I am using Slepc with a shell matrix. The sequential version seems working and now I am trying to make it run in parallel. > > > > The partition of the domain is done, I am not sure how to do the halo exchange in the shell matrix in Slepc. I have a parallel version of matrix-free GMRES in my code with Petsc. 
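The pattern Jose describes above (an auxiliary ghost vector kept in the shell-matrix context) can be written schematically as follows; here x and y are the arguments of MatMult(), xg stands for the ghosted work vector created earlier with VecCreateGhostBlock(), xloc is an illustrative name for its local form, and error checking is omitted:

  Vec xloc;

  /* copy the owned entries of the incoming parallel vector into the ghosted
     work vector, then fill its ghost entries from the owning ranks */
  VecCopy(x, xg);
  VecGhostUpdateBegin(xg, INSERT_VALUES, SCATTER_FORWARD);
  VecGhostUpdateEnd(xg, INSERT_VALUES, SCATTER_FORWARD);

  /* the local form exposes owned entries followed by ghost entries in one array */
  VecGhostGetLocalForm(xg, &xloc);
  /* ... compute the result from xloc and store it in the parallel vector y ... */
  VecGhostRestoreLocalForm(xg, &xloc);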
I was using VecCreateGhostBlock to create vector with ghost cells, and then used VecGhostUpdateBegin/End for the halo exchange in the shell matrix, would this be the same for Slepc? > > > > That will be enough for the MatMult(). You would also have to use a SLEPc EPS that only needed MatMult(). > > > > Thanks, > > > > Matt > > > > Thanks, > > Feng > > > > > > > > > > -- > > What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. > > -- Norbert Wiener > > > > https://www.cse.buffalo.edu/~knepley/ -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Thu Oct 13 08:34:20 2022 From: knepley at gmail.com (Matthew Knepley) Date: Thu, 13 Oct 2022 09:34:20 -0400 Subject: [petsc-users] Issue with single precision complex numbers in petsc4py In-Reply-To: References: Message-ID: First send configure.log so we can see the setup. Thanks, Matt On Thu, Oct 13, 2022 at 12:53 AM Peng Sun wrote: > Dear PETSc community, > > > I have a question regarding the single precision complex numbers of > petsc4py. I configured PETSc with the ?--with-scalar-type=complex > --with-precision=single" option before compiling, but all the DA structures > I created with petsc4py had double precision. > > > Here is a minimum test code on Python 3.8/PETSc 3.12/petsc4py 3.12: both > print commands show data type of complex128. Could anybody please help > me? Thanks! > > > import petsc4pyimport sys > petsc4py.init(sys.argv)from petsc4py import PETSc > > da=PETSc.DA().create(sizes=[2,2,2],dof=1,stencil_type=0,stencil_width=1,boundary_type=1) > da_1 = da.createGlobalVec()print(petsc4py.PETSc.ComplexType)print(da_1.getArray().dtype) > > > > > Best regards, > > Peng Sun > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From psun at outlook.com Thu Oct 13 10:42:56 2022 From: psun at outlook.com (Peng Sun) Date: Thu, 13 Oct 2022 15:42:56 +0000 Subject: [petsc-users] Issue with single precision complex numbers in petsc4py In-Reply-To: References: Message-ID: Hi Matt, Sure, please see the attached configure.log file. Thanks! Best regards, Peng Sun ________________________________ From: Matthew Knepley Sent: Thursday, October 13, 2022 6:34 AM To: Peng Sun Cc: petsc-users at mcs.anl.gov Subject: Re: [petsc-users] Issue with single precision complex numbers in petsc4py First send configure.log so we can see the setup. Thanks, Matt On Thu, Oct 13, 2022 at 12:53 AM Peng Sun > wrote: Dear PETSc community, I have a question regarding the single precision complex numbers of petsc4py. 
I configured PETSc with the ?--with-scalar-type=complex --with-precision=single" option before compiling, but all the DA structures I created with petsc4py had double precision. Here is a minimum test code on Python 3.8/PETSc 3.12/petsc4py 3.12: both print commands show data type of complex128. Could anybody please help me? Thanks! import petsc4py import sys petsc4py.init(sys.argv) from petsc4py import PETSc da=PETSc.DA().create(sizes=[2,2,2],dof=1,stencil_type=0,stencil_width=1,boundary_type=1) da_1 = da.createGlobalVec() print(petsc4py.PETSc.ComplexType) print(da_1.getArray().dtype) Best regards, Peng Sun -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: configure.log Type: text/x-log Size: 766230 bytes Desc: configure.log URL: From aduarteg at utexas.edu Thu Oct 13 11:39:51 2022 From: aduarteg at utexas.edu (Alfredo J Duarte Gomez) Date: Thu, 13 Oct 2022 11:39:51 -0500 Subject: [petsc-users] Laplace Equation preconditioner In-Reply-To: References: Message-ID: Hello, I am using KSPGMRES. I was surprised to see DIVERGED_BREAKDOWN as well, which is why I thought there could be some grid issues. However, other preconditioners were able to retrieve a satisfactory solution (PCGAMG, PCGASM, PCLU MUMPS). Is there maybe a problem size for which boomerAMG has decreased performance? I am dealing with problem sizes of 2 million to 20 million points for reference. I tried out PCGAMG and it seems to be working much better than boomerAMG, so thank you so much for your suggestion. Sincerely, -Alfredo On Wed, Oct 12, 2022 at 1:32 PM Barry Smith wrote: > > What KSP are you using? DIVERGED_BREAKDOWN is very rare for KSPGMRES. > If you are using one of its lesser cousins like bcgs you might consider > switching to bcgsl or gmres. > > I assume because of boundary conditions or the discretization you do > not have symmetric positive definite and thus cannot use CG. > > Barry > > > On Oct 12, 2022, at 2:00 PM, Matthew Knepley wrote: > > On Wed, Oct 12, 2022 at 1:38 PM Alfredo J Duarte Gomez < > aduarteg at utexas.edu> wrote: > >> Good morning PETSC users, >> >> I have a current solver that requires the solution of a Laplace equation >> to be reused for all future time steps. >> >> The configuration is axisymmetric with Dirichlet BCs at the top and >> bottom boundaries, and Zero Neumman conditions at the axis and far field. >> The grid is curvilinear and structured. >> >> So far I have been using PCHYPRE boomeramg as the preconditioner, which >> often works well, but I have also experienced DIVERGED_BREAKDOWN on many >> occasions. When I use direct solver PCMUMPS it always produces a >> satisfactory answer, which gives me confidence that the solution exists for >> the given grid in all these configurations where boomeramg fails. >> > > I do not know why Hypre is breaking down. Did you try ML or GAMG? They are > easier to diagnose I think. > > Thanks, > > Matt > > >> I am looking for recommendations on other preconditioners to try in this >> problem that can produce a solution faster than PCMUMPS, or recommendations >> on which parameters to adjust on boomeramg. 
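For the archives, the switch that worked here corresponds to runtime options along the lines of (illustrative; the two monitoring flags just make any breakdown easier to diagnose):

  -ksp_type gmres -pc_type gamg -ksp_monitor_true_residual -ksp_converged_reason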
>> >> Thank you, >> >> >> -- >> Alfredo Duarte >> Graduate Research Assistant >> The University of Texas at Austin >> > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > > https://www.cse.buffalo.edu/~knepley/ > > > > -- Alfredo Duarte Graduate Research Assistant The University of Texas at Austin -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Thu Oct 13 15:23:01 2022 From: knepley at gmail.com (Matthew Knepley) Date: Thu, 13 Oct 2022 16:23:01 -0400 Subject: [petsc-users] Issue with single precision complex numbers in petsc4py In-Reply-To: References: Message-ID: Lisandro, PETSc is compiled for single. Does petsc4py respect this, or does it always use double for getArray() and friends? Thanks, Matt On Thu, Oct 13, 2022 at 11:42 AM Peng Sun wrote: > Hi Matt, > > Sure, please see the attached configure.log file. Thanks! > > Best regards, > Peng Sun > > > ------------------------------ > *From:* Matthew Knepley > *Sent:* Thursday, October 13, 2022 6:34 AM > *To:* Peng Sun > *Cc:* petsc-users at mcs.anl.gov > *Subject:* Re: [petsc-users] Issue with single precision complex numbers > in petsc4py > > First send configure.log so we can see the setup. > > Thanks, > > Matt > > On Thu, Oct 13, 2022 at 12:53 AM Peng Sun wrote: > > Dear PETSc community, > > > I have a question regarding the single precision complex numbers of > petsc4py. I configured PETSc with the ?--with-scalar-type=complex > --with-precision=single" option before compiling, but all the DA structures > I created with petsc4py had double precision. > > > Here is a minimum test code on Python 3.8/PETSc 3.12/petsc4py 3.12: both > print commands show data type of complex128. Could anybody please help > me? Thanks! > > > import petsc4pyimport sys > petsc4py.init(sys.argv)from petsc4py import PETSc > > da=PETSc.DA().create(sizes=[2,2,2],dof=1,stencil_type=0,stencil_width=1,boundary_type=1) > da_1 = da.createGlobalVec()print(petsc4py.PETSc.ComplexType)print(da_1.getArray().dtype) > > > > > Best regards, > > Peng Sun > > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > > https://www.cse.buffalo.edu/~knepley/ > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From stefano.zampini at gmail.com Thu Oct 13 15:57:24 2022 From: stefano.zampini at gmail.com (Stefano Zampini) Date: Thu, 13 Oct 2022 23:57:24 +0300 Subject: [petsc-users] Issue with single precision complex numbers in petsc4py In-Reply-To: References: Message-ID: <4DEAFEB7-0ABF-48C5-A893-3B75C2BC6589@gmail.com> Matt Yes, petsc4py does the right thing. This is probably. Picking up the wrong PETSc arch. Peng, can you please run this? import petsc4py petsc4py.init() print(petsc4py.get_config()) > On Oct 13, 2022, at 11:23 PM, Matthew Knepley wrote: > > Lisandro, > > PETSc is compiled for single. Does petsc4py respect this, or does it always use double for getArray() and friends? 
> > Thanks, > > Matt > > On Thu, Oct 13, 2022 at 11:42 AM Peng Sun > wrote: > Hi Matt, > > Sure, please see the attached configure.log file. Thanks! > > Best regards, > Peng Sun > > > From: Matthew Knepley > > Sent: Thursday, October 13, 2022 6:34 AM > To: Peng Sun > > Cc: petsc-users at mcs.anl.gov > > Subject: Re: [petsc-users] Issue with single precision complex numbers in petsc4py > > First send configure.log so we can see the setup. > > Thanks, > > Matt > > On Thu, Oct 13, 2022 at 12:53 AM Peng Sun > wrote: > Dear PETSc community, > > > > I have a question regarding the single precision complex numbers of petsc4py. I configured PETSc with the ?--with-scalar-type=complex --with-precision=single" option before compiling, but all the DA structures I created with petsc4py had double precision. > > > > Here is a minimum test code on Python 3.8/PETSc 3.12/petsc4py 3.12: both print commands show data type of complex128. Could anybody please help me? Thanks! > > > > import petsc4py > import sys > petsc4py.init(sys.argv) > from petsc4py import PETSc > > da=PETSc.DA().create(sizes=[2,2,2],dof=1,stencil_type=0,stencil_width=1,boundary_type=1) > da_1 = da.createGlobalVec() > print(petsc4py.PETSc.ComplexType) > print(da_1.getArray().dtype) > > > > > > Best regards, > > Peng Sun > > > > -- > What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. > -- Norbert Wiener > > https://www.cse.buffalo.edu/~knepley/ > > > -- > What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. > -- Norbert Wiener > > https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From psun at outlook.com Thu Oct 13 16:27:34 2022 From: psun at outlook.com (Peng Sun) Date: Thu, 13 Oct 2022 21:27:34 +0000 Subject: [petsc-users] Issue with single precision complex numbers in petsc4py In-Reply-To: <4DEAFEB7-0ABF-48C5-A893-3B75C2BC6589@gmail.com> References: <4DEAFEB7-0ABF-48C5-A893-3B75C2BC6589@gmail.com> Message-ID: Hi Stefano, Sure, please see the following. The PETSC_ARCH field is empty in the printout despite the fact that it was set to 'arch-linux-c-opt' in the shell. {'PETSC_DIR': '/home/pesun/.emopt', 'PETSC_ARCH': ''} Best regards, Peng Sun ________________________________ From: Stefano Zampini Sent: Thursday, October 13, 2022 1:57 PM To: Matthew Knepley Cc: Peng Sun ; petsc-users at mcs.anl.gov Subject: Re: [petsc-users] Issue with single precision complex numbers in petsc4py Matt Yes, petsc4py does the right thing. This is probably. Picking up the wrong PETSc arch. Peng, can you please run this? import petsc4py petsc4py.init() print(petsc4py.get_config()) On Oct 13, 2022, at 11:23 PM, Matthew Knepley > wrote: Lisandro, PETSc is compiled for single. Does petsc4py respect this, or does it always use double for getArray() and friends? Thanks, Matt On Thu, Oct 13, 2022 at 11:42 AM Peng Sun > wrote: Hi Matt, Sure, please see the attached configure.log file. Thanks! Best regards, Peng Sun ________________________________ From: Matthew Knepley > Sent: Thursday, October 13, 2022 6:34 AM To: Peng Sun > Cc: petsc-users at mcs.anl.gov > Subject: Re: [petsc-users] Issue with single precision complex numbers in petsc4py First send configure.log so we can see the setup. 
Thanks, Matt On Thu, Oct 13, 2022 at 12:53 AM Peng Sun > wrote: Dear PETSc community, I have a question regarding the single precision complex numbers of petsc4py. I configured PETSc with the ?--with-scalar-type=complex --with-precision=single" option before compiling, but all the DA structures I created with petsc4py had double precision. Here is a minimum test code on Python 3.8/PETSc 3.12/petsc4py 3.12: both print commands show data type of complex128. Could anybody please help me? Thanks! import petsc4py import sys petsc4py.init(sys.argv) from petsc4py import PETSc da=PETSc.DA().create(sizes=[2,2,2],dof=1,stencil_type=0,stencil_width=1,boundary_type=1) da_1 = da.createGlobalVec() print(petsc4py.PETSc.ComplexType) print(da_1.getArray().dtype) Best regards, Peng Sun -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Thu Oct 13 17:57:16 2022 From: knepley at gmail.com (Matthew Knepley) Date: Thu, 13 Oct 2022 18:57:16 -0400 Subject: [petsc-users] Issue with single precision complex numbers in petsc4py In-Reply-To: References: <4DEAFEB7-0ABF-48C5-A893-3B75C2BC6589@gmail.com> Message-ID: On Thu, Oct 13, 2022 at 5:27 PM Peng Sun wrote: > Hi Stefano, > > Sure, please see the following. The PETSC_ARCH field is empty in the > printout despite the fact that it was set to 'arch-linux-c-opt' in the > shell. > > {'PETSC_DIR': '/home/pesun/.emopt', 'PETSC_ARCH': ''} > Can you show the whole output? Also, did you remember to 'export' it so that it goes to subshells? Matt > Best regards, > Peng Sun > ------------------------------ > *From:* Stefano Zampini > *Sent:* Thursday, October 13, 2022 1:57 PM > *To:* Matthew Knepley > *Cc:* Peng Sun ; petsc-users at mcs.anl.gov < > petsc-users at mcs.anl.gov> > *Subject:* Re: [petsc-users] Issue with single precision complex numbers > in petsc4py > > Matt > > Yes, petsc4py does the right thing. This is probably. Picking up the wrong > PETSc arch. > > Peng, can you please run this? > > import petsc4py > petsc4py.init() > print(petsc4py.get_config()) > > On Oct 13, 2022, at 11:23 PM, Matthew Knepley wrote: > > Lisandro, > > PETSc is compiled for single. Does petsc4py respect this, or does it > always use double for getArray() and friends? > > Thanks, > > Matt > > On Thu, Oct 13, 2022 at 11:42 AM Peng Sun wrote: > > Hi Matt, > > Sure, please see the attached configure.log file. Thanks! > > Best regards, > Peng Sun > > > ------------------------------ > *From:* Matthew Knepley > *Sent:* Thursday, October 13, 2022 6:34 AM > *To:* Peng Sun > *Cc:* petsc-users at mcs.anl.gov > *Subject:* Re: [petsc-users] Issue with single precision complex numbers > in petsc4py > > First send configure.log so we can see the setup. > > Thanks, > > Matt > > On Thu, Oct 13, 2022 at 12:53 AM Peng Sun wrote: > > Dear PETSc community, > > > I have a question regarding the single precision complex numbers of > petsc4py. I configured PETSc with the ?--with-scalar-type=complex > --with-precision=single" option before compiling, but all the DA structures > I created with petsc4py had double precision. 
> > > Here is a minimum test code on Python 3.8/PETSc 3.12/petsc4py 3.12: both > print commands show data type of complex128. Could anybody please help > me? Thanks! > > > import petsc4pyimport sys > petsc4py.init(sys.argv)from petsc4py import PETSc > > da=PETSc.DA().create(sizes=[2,2,2],dof=1,stencil_type=0,stencil_width=1,boundary_type=1) > da_1 = da.createGlobalVec()print(petsc4py.PETSc.ComplexType)print(da_1.getArray().dtype) > > > > > Best regards, > > Peng Sun > > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > > https://www.cse.buffalo.edu/~knepley/ > > > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > > https://www.cse.buffalo.edu/~knepley/ > > > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From psun at outlook.com Thu Oct 13 18:16:43 2022 From: psun at outlook.com (Peng Sun) Date: Thu, 13 Oct 2022 23:16:43 +0000 Subject: [petsc-users] Issue with single precision complex numbers in petsc4py In-Reply-To: References: <4DEAFEB7-0ABF-48C5-A893-3B75C2BC6589@gmail.com> Message-ID: Hi Matt, Please see the following screenshot. Yes, I exported the PETSC_ARCH variable before running the script. Note the "Invalid MIT-MAGIC-COOKIE-1 key" string is related to the X server and in all Python printout, not on PETSc. [cid:95e2940b-842a-4f1b-a542-bad8737d3b38] Best regards, Peng Sun ________________________________ From: Matthew Knepley Sent: Thursday, October 13, 2022 3:57 PM To: Peng Sun Cc: Stefano Zampini ; petsc-users at mcs.anl.gov Subject: Re: [petsc-users] Issue with single precision complex numbers in petsc4py On Thu, Oct 13, 2022 at 5:27 PM Peng Sun > wrote: Hi Stefano, Sure, please see the following. The PETSC_ARCH field is empty in the printout despite the fact that it was set to 'arch-linux-c-opt' in the shell. {'PETSC_DIR': '/home/pesun/.emopt', 'PETSC_ARCH': ''} Can you show the whole output? Also, did you remember to 'export' it so that it goes to subshells? Matt Best regards, Peng Sun ________________________________ From: Stefano Zampini > Sent: Thursday, October 13, 2022 1:57 PM To: Matthew Knepley > Cc: Peng Sun >; petsc-users at mcs.anl.gov > Subject: Re: [petsc-users] Issue with single precision complex numbers in petsc4py Matt Yes, petsc4py does the right thing. This is probably. Picking up the wrong PETSc arch. Peng, can you please run this? import petsc4py petsc4py.init() print(petsc4py.get_config()) On Oct 13, 2022, at 11:23 PM, Matthew Knepley > wrote: Lisandro, PETSc is compiled for single. Does petsc4py respect this, or does it always use double for getArray() and friends? Thanks, Matt On Thu, Oct 13, 2022 at 11:42 AM Peng Sun > wrote: Hi Matt, Sure, please see the attached configure.log file. Thanks! Best regards, Peng Sun ________________________________ From: Matthew Knepley > Sent: Thursday, October 13, 2022 6:34 AM To: Peng Sun > Cc: petsc-users at mcs.anl.gov > Subject: Re: [petsc-users] Issue with single precision complex numbers in petsc4py First send configure.log so we can see the setup. 
Thanks, Matt On Thu, Oct 13, 2022 at 12:53 AM Peng Sun > wrote: Dear PETSc community, I have a question regarding the single precision complex numbers of petsc4py. I configured PETSc with the ?--with-scalar-type=complex --with-precision=single" option before compiling, but all the DA structures I created with petsc4py had double precision. Here is a minimum test code on Python 3.8/PETSc 3.12/petsc4py 3.12: both print commands show data type of complex128. Could anybody please help me? Thanks! import petsc4py import sys petsc4py.init(sys.argv) from petsc4py import PETSc da=PETSc.DA().create(sizes=[2,2,2],dof=1,stencil_type=0,stencil_width=1,boundary_type=1) da_1 = da.createGlobalVec() print(petsc4py.PETSc.ComplexType) print(da_1.getArray().dtype) Best regards, Peng Sun -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: image.png Type: image/png Size: 30445 bytes Desc: image.png URL: From hongzhang at anl.gov Thu Oct 13 18:30:35 2022 From: hongzhang at anl.gov (Zhang, Hong) Date: Thu, 13 Oct 2022 23:30:35 +0000 Subject: [petsc-users] Issue with single precision complex numbers in petsc4py In-Reply-To: References: Message-ID: It seems that you installed petsc4py separately. I would suggest to add the configure option --with-petsc4py=1 and follow the instructions to set PYTHONPATH before using petsc4py. Hong (Mr.) > On Oct 13, 2022, at 10:42 AM, Peng Sun wrote: > > Hi Matt, > > Sure, please see the attached configure.log file. Thanks! > > Best regards, > Peng Sun > > > From: Matthew Knepley > Sent: Thursday, October 13, 2022 6:34 AM > To: Peng Sun > Cc: petsc-users at mcs.anl.gov > Subject: Re: [petsc-users] Issue with single precision complex numbers in petsc4py > > First send configure.log so we can see the setup. > > Thanks, > > Matt > > On Thu, Oct 13, 2022 at 12:53 AM Peng Sun wrote: > Dear PETSc community, > > I have a question regarding the single precision complex numbers of petsc4py. I configured PETSc with the ?--with-scalar-type=complex --with-precision=single" option before compiling, but all the DA structures I created with petsc4py had double precision. > > Here is a minimum test code on Python 3.8/PETSc 3.12/petsc4py 3.12: both print commands show data type of complex128. Could anybody please help me? Thanks! > > import > petsc4py > > import > sys > petsc4py.init(sys.argv) > > from petsc4py import > PETSc > > da=PETSc.DA().create(sizes=[ > 2,2,2],dof=1,stencil_type=0,stencil_width=1,boundary_type=1 > ) > da_1 = da.createGlobalVec() > > print > (petsc4py.PETSc.ComplexType) > > print(da_1.getArray().dtype) > > > > Best regards, > Peng Sun > > > -- > What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. 
> -- Norbert Wiener > > https://www.cse.buffalo.edu/~knepley/ > From psun at outlook.com Thu Oct 13 19:53:51 2022 From: psun at outlook.com (Peng Sun) Date: Fri, 14 Oct 2022 00:53:51 +0000 Subject: [petsc-users] Issue with single precision complex numbers in petsc4py In-Reply-To: References: Message-ID: Hi Hong, Thanks for the advice. I could not install petsc4py with the --with-petsc4py=1 option, which gave me an "No rule to make target 'petsc4py-install'" error when I ran "make install". That was why I needed to install petsc4py separately after the PETSc was installed. Best regards, Peng Sun ________________________________ From: Zhang, Hong Sent: Thursday, October 13, 2022 4:30 PM To: Peng Sun Cc: petsc-users at mcs.anl.gov Subject: Re: [petsc-users] Issue with single precision complex numbers in petsc4py It seems that you installed petsc4py separately. I would suggest to add the configure option --with-petsc4py=1 and follow the instructions to set PYTHONPATH before using petsc4py. Hong (Mr.) > On Oct 13, 2022, at 10:42 AM, Peng Sun wrote: > > Hi Matt, > > Sure, please see the attached configure.log file. Thanks! > > Best regards, > Peng Sun > > > From: Matthew Knepley > Sent: Thursday, October 13, 2022 6:34 AM > To: Peng Sun > Cc: petsc-users at mcs.anl.gov > Subject: Re: [petsc-users] Issue with single precision complex numbers in petsc4py > > First send configure.log so we can see the setup. > > Thanks, > > Matt > > On Thu, Oct 13, 2022 at 12:53 AM Peng Sun wrote: > Dear PETSc community, > > I have a question regarding the single precision complex numbers of petsc4py. I configured PETSc with the ?--with-scalar-type=complex --with-precision=single" option before compiling, but all the DA structures I created with petsc4py had double precision. > > Here is a minimum test code on Python 3.8/PETSc 3.12/petsc4py 3.12: both print commands show data type of complex128. Could anybody please help me? Thanks! > > import > petsc4py > > import > sys > petsc4py.init(sys.argv) > > from petsc4py import > PETSc > > da=PETSc.DA().create(sizes=[ > 2,2,2],dof=1,stencil_type=0,stencil_width=1,boundary_type=1 > ) > da_1 = da.createGlobalVec() > > print > (petsc4py.PETSc.ComplexType) > > print(da_1.getArray().dtype) > > > > Best regards, > Peng Sun > > > -- > What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. > -- Norbert Wiener > > https://www.cse.buffalo.edu/~knepley/ > -------------- next part -------------- An HTML attachment was scrubbed... URL: From bsmith at petsc.dev Thu Oct 13 21:03:04 2022 From: bsmith at petsc.dev (Barry Smith) Date: Thu, 13 Oct 2022 22:03:04 -0400 Subject: [petsc-users] Issue with single precision complex numbers in petsc4py In-Reply-To: References: Message-ID: <2CFD3281-2BB5-4320-A29B-069A4CF7A590@petsc.dev> Is there any reason you can't use the most recent version of PETSc4py? The one you are working with is several years old > On Oct 13, 2022, at 8:53 PM, Peng Sun wrote: > > Hi Hong, > > Thanks for the advice. I could not install petsc4py with the --with-petsc4py=1 option, which gave me an "No rule to make target 'petsc4py-install'" error when I ran "make install". That was why I needed to install petsc4py separately after the PETSc was installed. 
> > Best regards, > Peng Sun > From: Zhang, Hong > > Sent: Thursday, October 13, 2022 4:30 PM > To: Peng Sun > > Cc: petsc-users at mcs.anl.gov > > Subject: Re: [petsc-users] Issue with single precision complex numbers in petsc4py > > It seems that you installed petsc4py separately. I would suggest to add the configure option --with-petsc4py=1 and follow the instructions to set PYTHONPATH before using petsc4py. > > Hong (Mr.) > > > On Oct 13, 2022, at 10:42 AM, Peng Sun > wrote: > > > > Hi Matt, > > > > Sure, please see the attached configure.log file. Thanks! > > > > Best regards, > > Peng Sun > > > > > > From: Matthew Knepley > > > Sent: Thursday, October 13, 2022 6:34 AM > > To: Peng Sun > > > Cc: petsc-users at mcs.anl.gov > > > Subject: Re: [petsc-users] Issue with single precision complex numbers in petsc4py > > > > First send configure.log so we can see the setup. > > > > Thanks, > > > > Matt > > > > On Thu, Oct 13, 2022 at 12:53 AM Peng Sun > wrote: > > Dear PETSc community, > > > > I have a question regarding the single precision complex numbers of petsc4py. I configured PETSc with the ?--with-scalar-type=complex --with-precision=single" option before compiling, but all the DA structures I created with petsc4py had double precision. > > > > Here is a minimum test code on Python 3.8/PETSc 3.12/petsc4py 3.12: both print commands show data type of complex128. Could anybody please help me? Thanks! > > > > import > > petsc4py > > > > import > > sys > > petsc4py.init(sys.argv) > > > > from petsc4py import > > PETSc > > > > da=PETSc.DA().create(sizes=[ > > 2,2,2],dof=1,stencil_type=0,stencil_width=1,boundary_type=1 > > ) > > da_1 = da.createGlobalVec() > > > > print > > (petsc4py.PETSc.ComplexType) > > > > print(da_1.getArray().dtype) > > > > > > > > Best regards, > > Peng Sun > > > > > > -- > > What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. > > -- Norbert Wiener > > > > https://www.cse.buffalo.edu/~knepley/ > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From martin.diehl at kuleuven.be Fri Oct 14 02:37:10 2022 From: martin.diehl at kuleuven.be (Martin Diehl) Date: Fri, 14 Oct 2022 07:37:10 +0000 Subject: [petsc-users] compiling with Intel 2022.3 Message-ID: <3d45261d09a1a5f505092ea185520f51fb72baec.camel@kuleuven.be> FYI: Classic Intel compilers/MPI wrappers give a warning that confuses configure.py. The solution is to add "-diag-disable=10441" when invoking the compiler. source: https://community.intel.com/t5/Intel-C-Compiler/undefined-reference-for-2022-3-mpiicc/td-p/1420526 -- KU Leuven Department of Computer Science Department of Materials Engineering Celestijnenlaan 200a 3001 Leuven, Belgium From pierre at joliv.et Fri Oct 14 02:57:39 2022 From: pierre at joliv.et (Pierre Jolivet) Date: Fri, 14 Oct 2022 09:57:39 +0200 Subject: [petsc-users] compiling with Intel 2022.3 In-Reply-To: <3d45261d09a1a5f505092ea185520f51fb72baec.camel@kuleuven.be> References: <3d45261d09a1a5f505092ea185520f51fb72baec.camel@kuleuven.be> Message-ID: <43472A6F-1BD5-4E2F-9D9D-39E96C7B47F6@joliv.et> Or you can use the up-to-date release branch, cf. https://gitlab.com/petsc/petsc/-/merge_requests/5727 Thanks, Pierre > On 14 Oct 2022, at 9:52 AM, Martin Diehl wrote: > > ?FYI: Classic Intel compilers/MPI wrappers give a warning that confuses > configure.py. The solution is to add "-diag-disable=10441" when > invoking the compiler. 
> > source: > https://community.intel.com/t5/Intel-C-Compiler/undefined-reference-for-2022-3-mpiicc/td-p/1420526 > > -- > KU Leuven > Department of Computer Science > Department of Materials Engineering > Celestijnenlaan 200a > 3001 Leuven, Belgium > -------------- next part -------------- An HTML attachment was scrubbed... URL: From stefano.zampini at gmail.com Fri Oct 14 06:36:52 2022 From: stefano.zampini at gmail.com (Stefano Zampini) Date: Fri, 14 Oct 2022 14:36:52 +0300 Subject: [petsc-users] Issue with single precision complex numbers in petsc4py In-Reply-To: References: Message-ID: <51D26C9C-3CCF-41D5-9337-42E17E008429@gmail.com> > On Oct 14, 2022, at 3:53 AM, Peng Sun wrote: > > Hi Hong, > > Thanks for the advice. I could not install petsc4py with the --with-petsc4py=1 option, which gave me an "No rule to make target 'petsc4py-install'" error when I ran "make install". That was why I needed to install petsc4py separately after the PETSc was installed. After you installed PETSc, go to src/binding/petsc4py and run make install there. It will install in .local and it will be visible to python. Is this how you installed it? > > Best regards, > Peng Sun > From: Zhang, Hong > Sent: Thursday, October 13, 2022 4:30 PM > To: Peng Sun > Cc: petsc-users at mcs.anl.gov > Subject: Re: [petsc-users] Issue with single precision complex numbers in petsc4py > > It seems that you installed petsc4py separately. I would suggest to add the configure option --with-petsc4py=1 and follow the instructions to set PYTHONPATH before using petsc4py. > > Hong (Mr.) > > > On Oct 13, 2022, at 10:42 AM, Peng Sun wrote: > > > > Hi Matt, > > > > Sure, please see the attached configure.log file. Thanks! > > > > Best regards, > > Peng Sun > > > > > > From: Matthew Knepley > > Sent: Thursday, October 13, 2022 6:34 AM > > To: Peng Sun > > Cc: petsc-users at mcs.anl.gov > > Subject: Re: [petsc-users] Issue with single precision complex numbers in petsc4py > > > > First send configure.log so we can see the setup. > > > > Thanks, > > > > Matt > > > > On Thu, Oct 13, 2022 at 12:53 AM Peng Sun wrote: > > Dear PETSc community, > > > > I have a question regarding the single precision complex numbers of petsc4py. I configured PETSc with the ?--with-scalar-type=complex --with-precision=single" option before compiling, but all the DA structures I created with petsc4py had double precision. > > > > Here is a minimum test code on Python 3.8/PETSc 3.12/petsc4py 3.12: both print commands show data type of complex128. Could anybody please help me? Thanks! > > > > import > > petsc4py > > > > import > > sys > > petsc4py.init(sys.argv) > > > > from petsc4py import > > PETSc > > > > da=PETSc.DA().create(sizes=[ > > 2,2,2],dof=1,stencil_type=0,stencil_width=1,boundary_type=1 > > ) > > da_1 = da.createGlobalVec() > > > > print > > (petsc4py.PETSc.ComplexType) > > > > print(da_1.getArray().dtype) > > > > > > > > Best regards, > > Peng Sun > > > > > > -- > > What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. > > -- Norbert Wiener > > > > https://www.cse.buffalo.edu/~knepley/ > > -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From psun at outlook.com Fri Oct 14 11:42:09 2022 From: psun at outlook.com (Peng Sun) Date: Fri, 14 Oct 2022 16:42:09 +0000 Subject: [petsc-users] Issue with single precision complex numbers in petsc4py In-Reply-To: <2CFD3281-2BB5-4320-A29B-069A4CF7A590@petsc.dev> References: <2CFD3281-2BB5-4320-A29B-069A4CF7A590@petsc.dev> Message-ID: Hi Barry, I overwrote the old v3.12.0 with the latest version of 3.18.0, and I could use single-precision complex successfully. Then I switched back to v3.12.0 and I can reproduce the same issue. So it seems like that the single-precision complex issue is specific to v3.12.0. I do not necessarily need to use the old version. Thanks for your help! Best regards, Peng Sun ________________________________ From: Barry Smith Sent: Thursday, October 13, 2022 7:03 PM To: Peng Sun Cc: Zhang, Hong ; petsc-users at mcs.anl.gov Subject: Re: [petsc-users] Issue with single precision complex numbers in petsc4py Is there any reason you can't use the most recent version of PETSc4py? The one you are working with is several years old On Oct 13, 2022, at 8:53 PM, Peng Sun > wrote: Hi Hong, Thanks for the advice. I could not install petsc4py with the --with-petsc4py=1 option, which gave me an "No rule to make target 'petsc4py-install'" error when I ran "make install". That was why I needed to install petsc4py separately after the PETSc was installed. Best regards, Peng Sun ________________________________ From: Zhang, Hong > Sent: Thursday, October 13, 2022 4:30 PM To: Peng Sun > Cc: petsc-users at mcs.anl.gov > Subject: Re: [petsc-users] Issue with single precision complex numbers in petsc4py It seems that you installed petsc4py separately. I would suggest to add the configure option --with-petsc4py=1 and follow the instructions to set PYTHONPATH before using petsc4py. Hong (Mr.) > On Oct 13, 2022, at 10:42 AM, Peng Sun > wrote: > > Hi Matt, > > Sure, please see the attached configure.log file. Thanks! > > Best regards, > Peng Sun > > > From: Matthew Knepley > > Sent: Thursday, October 13, 2022 6:34 AM > To: Peng Sun > > Cc: petsc-users at mcs.anl.gov > > Subject: Re: [petsc-users] Issue with single precision complex numbers in petsc4py > > First send configure.log so we can see the setup. > > Thanks, > > Matt > > On Thu, Oct 13, 2022 at 12:53 AM Peng Sun > wrote: > Dear PETSc community, > > I have a question regarding the single precision complex numbers of petsc4py. I configured PETSc with the ?--with-scalar-type=complex --with-precision=single" option before compiling, but all the DA structures I created with petsc4py had double precision. > > Here is a minimum test code on Python 3.8/PETSc 3.12/petsc4py 3.12: both print commands show data type of complex128. Could anybody please help me? Thanks! > > import > petsc4py > > import > sys > petsc4py.init(sys.argv) > > from petsc4py import > PETSc > > da=PETSc.DA().create(sizes=[ > 2,2,2],dof=1,stencil_type=0,stencil_width=1,boundary_type=1 > ) > da_1 = da.createGlobalVec() > > print > (petsc4py.PETSc.ComplexType) > > print(da_1.getArray().dtype) > > > > Best regards, > Peng Sun > > > -- > What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. > -- Norbert Wiener > > https://www.cse.buffalo.edu/~knepley/ > -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From psun at outlook.com Fri Oct 14 11:46:18 2022 From: psun at outlook.com (Peng Sun) Date: Fri, 14 Oct 2022 16:46:18 +0000 Subject: [petsc-users] Issue with single precision complex numbers in petsc4py In-Reply-To: <51D26C9C-3CCF-41D5-9337-42E17E008429@gmail.com> References: <51D26C9C-3CCF-41D5-9337-42E17E008429@gmail.com> Message-ID: Hi Stefano, No I used pip to install petsc4py after I installed PETSc. I did not see the binding folder under /src. Best regards, Peng Sun ________________________________ From: Stefano Zampini Sent: Friday, October 14, 2022 4:36 AM To: Peng Sun Cc: Zhang, Hong ; petsc-users at mcs.anl.gov Subject: Re: [petsc-users] Issue with single precision complex numbers in petsc4py On Oct 14, 2022, at 3:53 AM, Peng Sun > wrote: Hi Hong, Thanks for the advice. I could not install petsc4py with the --with-petsc4py=1 option, which gave me an "No rule to make target 'petsc4py-install'" error when I ran "make install". That was why I needed to install petsc4py separately after the PETSc was installed. After you installed PETSc, go to src/binding/petsc4py and run make install there. It will install in .local and it will be visible to python. Is this how you installed it? Best regards, Peng Sun ________________________________ From: Zhang, Hong > Sent: Thursday, October 13, 2022 4:30 PM To: Peng Sun > Cc: petsc-users at mcs.anl.gov > Subject: Re: [petsc-users] Issue with single precision complex numbers in petsc4py It seems that you installed petsc4py separately. I would suggest to add the configure option --with-petsc4py=1 and follow the instructions to set PYTHONPATH before using petsc4py. Hong (Mr.) > On Oct 13, 2022, at 10:42 AM, Peng Sun > wrote: > > Hi Matt, > > Sure, please see the attached configure.log file. Thanks! > > Best regards, > Peng Sun > > > From: Matthew Knepley > > Sent: Thursday, October 13, 2022 6:34 AM > To: Peng Sun > > Cc: petsc-users at mcs.anl.gov > > Subject: Re: [petsc-users] Issue with single precision complex numbers in petsc4py > > First send configure.log so we can see the setup. > > Thanks, > > Matt > > On Thu, Oct 13, 2022 at 12:53 AM Peng Sun > wrote: > Dear PETSc community, > > I have a question regarding the single precision complex numbers of petsc4py. I configured PETSc with the ?--with-scalar-type=complex --with-precision=single" option before compiling, but all the DA structures I created with petsc4py had double precision. > > Here is a minimum test code on Python 3.8/PETSc 3.12/petsc4py 3.12: both print commands show data type of complex128. Could anybody please help me? Thanks! > > import > petsc4py > > import > sys > petsc4py.init(sys.argv) > > from petsc4py import > PETSc > > da=PETSc.DA().create(sizes=[ > 2,2,2],dof=1,stencil_type=0,stencil_width=1,boundary_type=1 > ) > da_1 = da.createGlobalVec() > > print > (petsc4py.PETSc.ComplexType) > > print(da_1.getArray().dtype) > > > > Best regards, > Peng Sun > > > -- > What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. > -- Norbert Wiener > > https://www.cse.buffalo.edu/~knepley/ > -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From stefano.zampini at gmail.com Fri Oct 14 12:41:54 2022 From: stefano.zampini at gmail.com (Stefano Zampini) Date: Fri, 14 Oct 2022 20:41:54 +0300 Subject: [petsc-users] Issue with single precision complex numbers in petsc4py In-Reply-To: References: <51D26C9C-3CCF-41D5-9337-42E17E008429@gmail.com> Message-ID: On Fri, Oct 14, 2022, 19:46 Peng Sun wrote: > Hi Stefano, > > No I used pip to install petsc4py after I installed PETSc. I did not see > the binding folder under /src > Not sure which was the first version petsc4py was shipped with PETSc , for sure 3.18 has it. Best regards, > Peng Sun > ------------------------------ > *From:* Stefano Zampini > *Sent:* Friday, October 14, 2022 4:36 AM > *To:* Peng Sun > *Cc:* Zhang, Hong ; petsc-users at mcs.anl.gov < > petsc-users at mcs.anl.gov> > *Subject:* Re: [petsc-users] Issue with single precision complex numbers > in petsc4py > > > > On Oct 14, 2022, at 3:53 AM, Peng Sun wrote: > > Hi Hong, > > Thanks for the advice. I could not install petsc4py with the > --with-petsc4py=1 option, which gave me an "No rule to make target > 'petsc4py-install'" error when I ran "make install". That was why I > needed to install petsc4py separately after the PETSc was installed. > > > > After you installed PETSc, go to src/binding/petsc4py and run make install > there. It will install in .local and it will be visible to python. > Is this how you installed it? > > > Best regards, > Peng Sun > ------------------------------ > *From:* Zhang, Hong > *Sent:* Thursday, October 13, 2022 4:30 PM > *To:* Peng Sun > *Cc:* petsc-users at mcs.anl.gov > *Subject:* Re: [petsc-users] Issue with single precision complex numbers > in petsc4py > > It seems that you installed petsc4py separately. I would suggest to add > the configure option --with-petsc4py=1 and follow the instructions to set > PYTHONPATH before using petsc4py. > > Hong (Mr.) > > > On Oct 13, 2022, at 10:42 AM, Peng Sun wrote: > > > > Hi Matt, > > > > Sure, please see the attached configure.log file. Thanks! > > > > Best regards, > > Peng Sun > > > > > > From: Matthew Knepley > > Sent: Thursday, October 13, 2022 6:34 AM > > To: Peng Sun > > Cc: petsc-users at mcs.anl.gov > > Subject: Re: [petsc-users] Issue with single precision complex numbers > in petsc4py > > > > First send configure.log so we can see the setup. > > > > Thanks, > > > > Matt > > > > On Thu, Oct 13, 2022 at 12:53 AM Peng Sun wrote: > > Dear PETSc community, > > > > I have a question regarding the single precision complex numbers of > petsc4py. I configured PETSc with the ?--with-scalar-type=complex > --with-precision=single" option before compiling, but all the DA structures > I created with petsc4py had double precision. > > > > Here is a minimum test code on Python 3.8/PETSc 3.12/petsc4py 3.12: both > print commands show data type of complex128. Could anybody please help > me? Thanks! > > > > import > > petsc4py > > > > import > > sys > > petsc4py.init(sys.argv) > > > > from petsc4py import > > PETSc > > > > da=PETSc.DA().create(sizes=[ > > 2,2,2],dof=1,stencil_type=0,stencil_width=1,boundary_type=1 > > ) > > da_1 = da.createGlobalVec() > > > > print > > (petsc4py.PETSc.ComplexType) > > > > print(da_1.getArray().dtype) > > > > > > > > Best regards, > > Peng Sun > > > > > > -- > > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. 
> > -- Norbert Wiener > > > > https://www.cse.buffalo.edu/~knepley/ > > > > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From fujisan43 at gmail.com Mon Oct 17 01:37:42 2022 From: fujisan43 at gmail.com (fujisan) Date: Mon, 17 Oct 2022 08:37:42 +0200 Subject: [petsc-users] Initializing a large sparse matrix Message-ID: Hi everyone, I initialize a large sparse matrix (50000x20000) using MatCreate() and then filling it with MatSetValues() line by line but it takes a bit more than an hour on 80 cores to fill in the matrix. Is there a way to optimize this initialization? Fuji -------------- next part -------------- An HTML attachment was scrubbed... URL: From jroman at dsic.upv.es Mon Oct 17 02:02:37 2022 From: jroman at dsic.upv.es (Jose E. Roman) Date: Mon, 17 Oct 2022 09:02:37 +0200 Subject: [petsc-users] Initializing a large sparse matrix In-Reply-To: References: Message-ID: You have to preallocate, see https://petsc.org/release/docs/manual/mat/#sec-matsparse > El 17 oct 2022, a las 8:37, fujisan escribi?: > > Hi everyone, > > I initialize a large sparse matrix (50000x20000) using MatCreate() and then filling it with MatSetValues() line by line > but it takes a bit more than an hour on 80 cores to fill in the matrix. > > Is there a way to optimize this initialization? > > Fuji From fujisan43 at gmail.com Mon Oct 17 03:12:55 2022 From: fujisan43 at gmail.com (fujisan) Date: Mon, 17 Oct 2022 10:12:55 +0200 Subject: [petsc-users] Initializing a large sparse matrix In-Reply-To: References: Message-ID: Can I preallocate space for a rectangular matrix using MatCreateAIJ() in parallel? It is not clear to me how I have to define d_nnz when the matrix is rectangular? X * * * * * X * * * * * X * * * * * X * * * * * X * * * * * * * * * * The example shown is for an 8x8 matrix on 3 cpu X * * * X * * * X On Mon, Oct 17, 2022 at 9:02 AM Jose E. Roman wrote: > You have to preallocate, see > https://petsc.org/release/docs/manual/mat/#sec-matsparse > > > El 17 oct 2022, a las 8:37, fujisan escribi?: > > > > Hi everyone, > > > > I initialize a large sparse matrix (50000x20000) using MatCreate() and > then filling it with MatSetValues() line by line > > but it takes a bit more than an hour on 80 cores to fill in the matrix. > > > > Is there a way to optimize this initialization? > > > > Fuji > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From jroman at dsic.upv.es Mon Oct 17 03:17:07 2022 From: jroman at dsic.upv.es (Jose E. Roman) Date: Mon, 17 Oct 2022 10:17:07 +0200 Subject: [petsc-users] Initializing a large sparse matrix In-Reply-To: References: Message-ID: There are examples in the manpages, see for instance https://petsc.org/release/docs/manualpages/Mat/MatCreateAIJ/ https://petsc.org/release/docs/manualpages/Mat/MatMPIAIJSetPreallocation/ Jose > El 17 oct 2022, a las 10:12, fujisan escribi?: > > Can I preallocate space for a rectangular matrix using MatCreateAIJ() in parallel? > It is not clear to me how I have to define d_nnz when the matrix is rectangular? > > X * * * * > * X * * * > * * X * * > * * * X * > * * * * X > * * * * * > * * * * * > > The example shown is for an 8x8 matrix on 3 cpu > X * * > * X * > * * X > > On Mon, Oct 17, 2022 at 9:02 AM Jose E. 
Roman wrote: > You have to preallocate, see https://petsc.org/release/docs/manual/mat/#sec-matsparse > > > El 17 oct 2022, a las 8:37, fujisan escribi?: > > > > Hi everyone, > > > > I initialize a large sparse matrix (50000x20000) using MatCreate() and then filling it with MatSetValues() line by line > > but it takes a bit more than an hour on 80 cores to fill in the matrix. > > > > Is there a way to optimize this initialization? > > > > Fuji > From zhaog6 at lsec.cc.ac.cn Wed Oct 19 03:01:21 2022 From: zhaog6 at lsec.cc.ac.cn (=?UTF-8?B?6LW15Yia?=) Date: Wed, 19 Oct 2022 16:01:21 +0800 (GMT+08:00) Subject: [petsc-users] An issue of Interior-Point Methods in TAO Message-ID: <52f60c40.cfe0.183ef42e3f1.Coremail.zhaog6@lsec.cc.ac.cn> Dear PETSc/TAO team, I am using an interior-point method in TAO to solve a quadratic programming problem with bound constraints. I noticed that TAO includes three interior-point methods, they are Mehrotra Predictor-Corrector Method (bqpip), Primal-Dual Interior-Point Method (pdipm) and "ipm", respectively. I'd like to ask what is the interior-point method implemented by "-tao_type ipm", thank you. Best Regards, Gang From nicolas.tardieu at edf.fr Wed Oct 19 05:00:37 2022 From: nicolas.tardieu at edf.fr (TARDIEU Nicolas) Date: Wed, 19 Oct 2022 10:00:37 +0000 Subject: [petsc-users] Trouble with ISEmbed Message-ID: Dear PETSc Team, I am trying to use IS embeding in parallel. In order to (try to) understand how it works, I have built a simple example, attached to this email. I consider a 20X20 matrix. The dof (u, p, t) in global numbering are the following : u: 0..9 p: 10..14 t: 15..19 I have defined 4 IS to describe the dof u, p, t and the agglomeration of u and p, called up. I first extract the submatrix matrix(up,up), then I would like to extract from it the (u,u) block. The example runs OK in sequential but I do not obtain the (u,u) block on 2 processes. I have a mistake in the build of the sub-IS but I cannot find it for days. Best regards, Nicolas -- Nicolas Tardieu Ing PhD Computational Mechanics EDF - R&D Dpt ERMES PARIS-SACLAY, FRANCE Ce message et toutes les pi?ces jointes (ci-apr?s le 'Message') sont ?tablis ? l'intention exclusive des destinataires et les informations qui y figurent sont strictement confidentielles. Toute utilisation de ce Message non conforme ? sa destination, toute diffusion ou toute publication totale ou partielle, est interdite sauf autorisation expresse. Si vous n'?tes pas le destinataire de ce Message, il vous est interdit de le copier, de le faire suivre, de le divulguer ou d'en utiliser tout ou partie. Si vous avez re?u ce Message par erreur, merci de le supprimer de votre syst?me, ainsi que toutes ses copies, et de n'en garder aucune trace sur quelque support que ce soit. Nous vous remercions ?galement d'en avertir imm?diatement l'exp?diteur par retour du message. Il est impossible de garantir que les communications par messagerie ?lectronique arrivent en temps utile, sont s?curis?es ou d?nu?es de toute erreur ou virus. ____________________________________________________ This message and any attachments (the 'Message') are intended solely for the addressees. The information contained in this Message is confidential. Any use of information contained in this Message not in accord with its purpose, any dissemination or disclosure, either whole or partial, is prohibited except formal approval. If you are not the addressee, you may not copy, forward, disclose or use any part of it. 
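Back to fujisan's question about preallocating a rectangular matrix: the rule is the same as for a square one. On each rank, d_nnz[k] counts the nonzeros of local row k whose global column falls in that rank's block of locally owned columns (the n local columns), and o_nnz[k] counts the remaining entries of that row. A minimal petsc4py sketch follows; the sizes, the nonzero estimates and the one-entry-per-row fill are made up for illustration:

from petsc4py import PETSc
import numpy as np

comm = PETSc.COMM_WORLD
rank, nproc = comm.getRank(), comm.getSize()

M, N = 8, 5                                       # rectangular; 50000x20000 works the same way
m = M // nproc + (1 if rank < M % nproc else 0)   # local rows
n = N // nproc + (1 if rank < N % nproc else 0)   # local columns = this rank's diagonal block

# Per-local-row estimates (illustrative values; over-estimating is safe,
# under-estimating is what makes the assembly slow)
d_nnz = np.full(m, min(1, n), dtype=PETSc.IntType)
o_nnz = np.full(m, min(1, N - n), dtype=PETSc.IntType)

A = PETSc.Mat().createAIJ([[m, M], [n, N]], nnz=(d_nnz, o_nnz), comm=comm)

rstart, rend = A.getOwnershipRange()
for i in range(rstart, rend):
    A.setValue(i, i % N, 1.0)                     # one illustrative entry per row
A.assemble()

With reasonable per-row estimates the fill-in phase does no reallocation, which is typically what turns an hour-long assembly into seconds.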
If you have received this message in error, please delete it and all copies from your system and notify the sender immediately by return message. E-mail communication cannot be guaranteed to be timely secure, error or virus-free. -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: is_pb.tgz Type: application/x-compressed-tar Size: 1222 bytes Desc: is_pb.tgz URL: From pierre at joliv.et Wed Oct 19 07:51:48 2022 From: pierre at joliv.et (Pierre Jolivet) Date: Wed, 19 Oct 2022 14:51:48 +0200 Subject: [petsc-users] Trouble with ISEmbed In-Reply-To: References: Message-ID: On two processes, you have a different distribution for u and u+p. IS Object: 2 MPI processes type: general [0] Number of indices in set 5 [0] 0 0 [0] 1 1 [0] 2 2 [0] 3 3 [0] 4 4 [1] Number of indices in set 5 [1] 0 5 [1] 1 6 [1] 2 7 [1] 3 8 [1] 4 9 IS Object: 2 MPI processes type: general [0] Number of indices in set 8 [0] 0 0 [0] 1 1 [0] 2 2 [0] 3 3 [0] 4 4 [0] 5 5 [0] 6 6 [0] 7 7 [1] Number of indices in set 7 [1] 0 8 [1] 1 9 [1] 2 10 [1] 3 11 [1] 4 12 [1] 5 13 [1] 6 14 ISEmbed() only works on local indices, so when you embed u into u+p, on the second process, you miss the row/column indices 5, 6, 7 of B = A(u+p, u+p). Thus, you end up with a matrix of dimension size(u) - 3 = 10 - 3 = 7, with just the row/column indices 8 and 9 being selected by the second process. What is it that you want to do exactly? Play with ISEmbed(), or get A(u, u) without using A but B instead? Thanks, Pierre > On 19 Oct 2022, at 12:00 PM, TARDIEU Nicolas via petsc-users wrote: > > Dear PETSc Team, > > I am trying to use IS embeding in parallel. > In order to (try to) understand how it works, I have built a simple example, attached to this email. > > I consider a 20X20 matrix. The dof (u, p, t) in global numbering are the following : > u: 0..9 p: 10..14 t: 15..19 > > I have defined 4 IS to describe the dof u, p, t and the agglomeration of u and p, called up. > I first extract the submatrix matrix(up,up), then I would like to extract from it the (u,u) block. > > The example runs OK in sequential but I do not obtain the (u,u) block on 2 processes. > > I have a mistake in the build of the sub-IS but I cannot find it for days. > > Best regards, > Nicolas > -- > Nicolas Tardieu > Ing PhD Computational Mechanics > EDF - R&D Dpt ERMES > PARIS-SACLAY, FRANCE > > Ce message et toutes les pi?ces jointes (ci-apr?s le 'Message') sont ?tablis ? l'intention exclusive des destinataires et les informations qui y figurent sont strictement confidentielles. Toute utilisation de ce Message non conforme ? sa destination, toute diffusion ou toute publication totale ou partielle, est interdite sauf autorisation expresse. > Si vous n'?tes pas le destinataire de ce Message, il vous est interdit de le copier, de le faire suivre, de le divulguer ou d'en utiliser tout ou partie. Si vous avez re?u ce Message par erreur, merci de le supprimer de votre syst?me, ainsi que toutes ses copies, et de n'en garder aucune trace sur quelque support que ce soit. Nous vous remercions ?galement d'en avertir imm?diatement l'exp?diteur par retour du message. > Il est impossible de garantir que les communications par messagerie ?lectronique arrivent en temps utile, sont s?curis?es ou d?nu?es de toute erreur ou virus. > ____________________________________________________ > This message and any attachments (the 'Message') are intended solely for the addressees. 
The information contained in this Message is confidential. Any use of information contained in this Message not in accord with its purpose, any dissemination or disclosure, either whole or partial, is prohibited except formal approval. > If you are not the addressee, you may not copy, forward, disclose or use any part of it. If you have received this message in error, please delete it and all copies from your system and notify the sender immediately by return message. > E-mail communication cannot be guaranteed to be timely secure, error or virus-free. > From yuanxi at advancesoft.jp Wed Oct 19 07:59:55 2022 From: yuanxi at advancesoft.jp (=?UTF-8?B?6KKB54WV?=) Date: Wed, 19 Oct 2022 21:59:55 +0900 Subject: [petsc-users] how to reuse Mumps factorization Message-ID: Hello, I am using Mumps to solve a problem with multiple time steps. The matrix structure does not change but its value may or may not change during those steps. That means I should reuse the symbolic factorization but recall numeric factorization when needed. I have found the following anwser of a similar question https://lists.mcs.anl.gov/pipermail/petsc-users/2020-August/041846.html which says "it automatically uses the same factorization", but I don't know if it includes numerical factorization also. My question is : 1. Does numeric factorization do automatically? If not 2. Could I control when numeric factorization should be done and how to do it? Much thanks YUAN -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Wed Oct 19 08:15:22 2022 From: knepley at gmail.com (Matthew Knepley) Date: Wed, 19 Oct 2022 09:15:22 -0400 Subject: [petsc-users] how to reuse Mumps factorization In-Reply-To: References: Message-ID: On Wed, Oct 19, 2022 at 9:13 AM ?? wrote: > Hello, > > I am using Mumps to solve a problem with multiple time steps. The matrix > structure does not change but its value may or may not change during > those steps. That means I should reuse the symbolic factorization but > recall numeric factorization when needed. > > I have found the following anwser of a similar question > https://lists.mcs.anl.gov/pipermail/petsc-users/2020-August/041846.html > > which says "it automatically uses the same factorization", but I don't > know if it includes numerical factorization also. > > My question is : > 1. Does numeric factorization do automatically? If not > Yes. Thanks, Matt > 2. Could I control when numeric factorization should be done and how to > do it? > > Much thanks > > YUAN > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From ksi2443 at gmail.com Wed Oct 19 08:25:42 2022 From: ksi2443 at gmail.com (=?UTF-8?B?6rmA7ISx7J21?=) Date: Wed, 19 Oct 2022 22:25:42 +0900 Subject: [petsc-users] Question about Sequential & Parallel part Message-ID: Dear PETSc users, I have a question about structure of programming. My blueprint of Finite Element programming with PETSc solver is below. [image: image.png] The blue box is whole loop for my FE program. There is a loop B that performs a convergence iteration (KU=F) inside the loop A in which the load increment increases. I want to proceed in rank=0 sequentially except for the kspsolve part that solves KU=F. How can I do this?? 
Best regards, Hyung Kim -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: image.png Type: image/png Size: 64551 bytes Desc: not available URL: From knepley at gmail.com Wed Oct 19 08:46:11 2022 From: knepley at gmail.com (Matthew Knepley) Date: Wed, 19 Oct 2022 09:46:11 -0400 Subject: [petsc-users] Question about Sequential & Parallel part In-Reply-To: References: Message-ID: On Wed, Oct 19, 2022 at 9:26 AM ??? wrote: > Dear PETSc users, > > > I have a question about structure of programming. > > My blueprint of Finite Element programming with PETSc solver is below. > [image: image.png] > The blue box is whole loop for my FE program. > There is a loop B that performs a convergence iteration (KU=F) inside the > loop A in which the load increment increases. > I want to proceed in rank=0 sequentially except for the kspsolve part that > solves KU=F. > > How can I do this?? > I think you want this: https://petsc.org/main/docs/manual/ksp/#using-a-mpi-parallel-linear-solver-from-a-non-mpi-program Thanks, Matt > Best regards, > Hyung Kim > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: image.png Type: image/png Size: 64551 bytes Desc: not available URL: From nicolas.tardieu at edf.fr Wed Oct 19 09:32:51 2022 From: nicolas.tardieu at edf.fr (TARDIEU Nicolas) Date: Wed, 19 Oct 2022 14:32:51 +0000 Subject: [petsc-users] Trouble with ISEmbed In-Reply-To: References: Message-ID: Dear Pierre, Thank you very much for your answer. I have the same explanation as you for the code I sent. But what I would like to do is the following : I have the full matrix A with fields u, p and t (which are interlaced in the real application). I want to extract B=A(u+p, u+p). *Then* I would like to extract the (u, u) block from B - let us call it B_uu. In fact, B_uu=A_uu but I really need to do the extraction from B. And I am missing something since I have to play with different numberings when switching the IS from A to B. Is it clear enough ???? Regards, Nicolas -- Nicolas Tardieu Ing PhD Computational Mechanics EDF - R&D Dpt ERMES PARIS-SACLAY, FRANCE ________________________________ De : pierre at joliv.et Envoy? : mercredi 19 octobre 2022 14:51 ? : TARDIEU Nicolas Cc : petsc-users at mcs.anl.gov Objet : Re: [petsc-users] Trouble with ISEmbed On two processes, you have a different distribution for u and u+p. IS Object: 2 MPI processes type: general [0] Number of indices in set 5 [0] 0 0 [0] 1 1 [0] 2 2 [0] 3 3 [0] 4 4 [1] Number of indices in set 5 [1] 0 5 [1] 1 6 [1] 2 7 [1] 3 8 [1] 4 9 IS Object: 2 MPI processes type: general [0] Number of indices in set 8 [0] 0 0 [0] 1 1 [0] 2 2 [0] 3 3 [0] 4 4 [0] 5 5 [0] 6 6 [0] 7 7 [1] Number of indices in set 7 [1] 0 8 [1] 1 9 [1] 2 10 [1] 3 11 [1] 4 12 [1] 5 13 [1] 6 14 ISEmbed() only works on local indices, so when you embed u into u+p, on the second process, you miss the row/column indices 5, 6, 7 of B = A(u+p, u+p). Thus, you end up with a matrix of dimension size(u) - 3 = 10 - 3 = 7, with just the row/column indices 8 and 9 being selected by the second process. What is it that you want to do exactly? 
Play with ISEmbed(), or get A(u, u) without using A but B instead? Thanks, Pierre > On 19 Oct 2022, at 12:00 PM, TARDIEU Nicolas via petsc-users wrote: > > Dear PETSc Team, > > I am trying to use IS embeding in parallel. > In order to (try to) understand how it works, I have built a simple example, attached to this email. > > I consider a 20X20 matrix. The dof (u, p, t) in global numbering are the following : > u: 0..9 p: 10..14 t: 15..19 > > I have defined 4 IS to describe the dof u, p, t and the agglomeration of u and p, called up. > I first extract the submatrix matrix(up,up), then I would like to extract from it the (u,u) block. > > The example runs OK in sequential but I do not obtain the (u,u) block on 2 processes. > > I have a mistake in the build of the sub-IS but I cannot find it for days. > > Best regards, > Nicolas > -- > Nicolas Tardieu > Ing PhD Computational Mechanics > EDF - R&D Dpt ERMES > PARIS-SACLAY, FRANCE > > Ce message et toutes les pi?ces jointes (ci-apr?s le 'Message') sont ?tablis ? l'intention exclusive des destinataires et les informations qui y figurent sont strictement confidentielles. Toute utilisation de ce Message non conforme ? sa destination, toute diffusion ou toute publication totale ou partielle, est interdite sauf autorisation expresse. > Si vous n'?tes pas le destinataire de ce Message, il vous est interdit de le copier, de le faire suivre, de le divulguer ou d'en utiliser tout ou partie. Si vous avez re?u ce Message par erreur, merci de le supprimer de votre syst?me, ainsi que toutes ses copies, et de n'en garder aucune trace sur quelque support que ce soit. Nous vous remercions ?galement d'en avertir imm?diatement l'exp?diteur par retour du message. > Il est impossible de garantir que les communications par messagerie ?lectronique arrivent en temps utile, sont s?curis?es ou d?nu?es de toute erreur ou virus. > ____________________________________________________ > This message and any attachments (the 'Message') are intended solely for the addressees. The information contained in this Message is confidential. Any use of information contained in this Message not in accord with its purpose, any dissemination or disclosure, either whole or partial, is prohibited except formal approval. > If you are not the addressee, you may not copy, forward, disclose or use any part of it. If you have received this message in error, please delete it and all copies from your system and notify the sender immediately by return message. > E-mail communication cannot be guaranteed to be timely secure, error or virus-free. > Ce message et toutes les pi?ces jointes (ci-apr?s le 'Message') sont ?tablis ? l'intention exclusive des destinataires et les informations qui y figurent sont strictement confidentielles. Toute utilisation de ce Message non conforme ? sa destination, toute diffusion ou toute publication totale ou partielle, est interdite sauf autorisation expresse. Si vous n'?tes pas le destinataire de ce Message, il vous est interdit de le copier, de le faire suivre, de le divulguer ou d'en utiliser tout ou partie. Si vous avez re?u ce Message par erreur, merci de le supprimer de votre syst?me, ainsi que toutes ses copies, et de n'en garder aucune trace sur quelque support que ce soit. Nous vous remercions ?galement d'en avertir imm?diatement l'exp?diteur par retour du message. Il est impossible de garantir que les communications par messagerie ?lectronique arrivent en temps utile, sont s?curis?es ou d?nu?es de toute erreur ou virus. 
____________________________________________________ This message and any attachments (the 'Message') are intended solely for the addressees. The information contained in this Message is confidential. Any use of information contained in this Message not in accord with its purpose, any dissemination or disclosure, either whole or partial, is prohibited except formal approval. If you are not the addressee, you may not copy, forward, disclose or use any part of it. If you have received this message in error, please delete it and all copies from your system and notify the sender immediately by return message. E-mail communication cannot be guaranteed to be timely secure, error or virus-free. -------------- next part -------------- An HTML attachment was scrubbed... URL: From bsmith at petsc.dev Wed Oct 19 09:50:16 2022 From: bsmith at petsc.dev (Barry Smith) Date: Wed, 19 Oct 2022 10:50:16 -0400 Subject: [petsc-users] An issue of Interior-Point Methods in TAO In-Reply-To: <52f60c40.cfe0.183ef42e3f1.Coremail.zhaog6@lsec.cc.ac.cn> References: <52f60c40.cfe0.183ef42e3f1.Coremail.zhaog6@lsec.cc.ac.cn> Message-ID: <1DFAEEDE-F238-49EB-B3A9-F8ACED478DB0@petsc.dev> It looks like it was started as a general framework for interior point methods but never finished? There is a note on its manual page "This algorithm is more of a place-holder for future constrained optimization algorithms and should not yet be used for large problems or production code." You are welcome to look at the source code and play with it but you should consider it unfinished and unsupported. Barry > On Oct 19, 2022, at 4:01 AM, ?? wrote: > > Dear PETSc/TAO team, > > I am using an interior-point method in TAO to solve a quadratic programming problem with bound constraints. I noticed that TAO includes three interior-point methods, they are Mehrotra Predictor-Corrector Method (bqpip), Primal-Dual Interior-Point Method (pdipm) and "ipm", respectively. I'd like to ask what is the interior-point method implemented by "-tao_type ipm", thank you. > > > Best Regards, > Gang From bsmith at petsc.dev Wed Oct 19 09:58:05 2022 From: bsmith at petsc.dev (Barry Smith) Date: Wed, 19 Oct 2022 10:58:05 -0400 Subject: [petsc-users] how to reuse Mumps factorization In-Reply-To: References: Message-ID: <92F8CDFD-572A-4889-9F06-07EBE429765A@petsc.dev> Every time a matrix entry gets changes PETSc tracks these changes so for the next KSP by default solve it repeats the numerical factorization if the matrix has changed. Otherwise it reuses the still current factorization. If you are calling KSP directly, you can call KSPSetReusePreconditioner() to prevent KSP from automatically performing a new factorization, so it will use the out-of-date preconditioner but if you use a KSPType of, for example, KSPGMRES, it will still solve the linear system correctly just taking some iterations. Reusing the preconditioner can be faster if the matrix does not change too much since a numerical factorization takes a lot of time If you use SNES you can control "lagging" the preconditioner with SNESSetLagPreconditioner() Barry > On Oct 19, 2022, at 9:15 AM, Matthew Knepley wrote: > > On Wed, Oct 19, 2022 at 9:13 AM ?? > wrote: > Hello, > > I am using Mumps to solve a problem with multiple time steps. The matrix structure does not change but its value may or may not change during those steps. That means I should reuse the symbolic factorization but recall numeric factorization when needed. 
> > I have found the following anwser of a similar question > https://lists.mcs.anl.gov/pipermail/petsc-users/2020-August/041846.html > > which says "it automatically uses the same factorization", but I don't know if it includes numerical factorization also. > > My question is : > 1. Does numeric factorization do automatically? If not > > Yes. > > Thanks, > > Matt > > 2. Could I control when numeric factorization should be done and how to do it? > > Much thanks > > YUAN > > > -- > What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. > -- Norbert Wiener > > https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From pierre at joliv.et Wed Oct 19 10:01:43 2022 From: pierre at joliv.et (Pierre Jolivet) Date: Wed, 19 Oct 2022 17:01:43 +0200 Subject: [petsc-users] Trouble with ISEmbed In-Reply-To: References: Message-ID: > On 19 Oct 2022, at 4:32 PM, TARDIEU Nicolas wrote: > > Dear Pierre, > > Thank you very much for your answer. I have the same explanation as you for the code I sent. > But what I would like to do is the following : I have the full matrix A with fields u, p and t (which are interlaced in the real application). I want to extract B=A(u+p, u+p). *Then* I would like to extract the (u, u) block from B - let us call it B_uu. > In fact, B_uu=A_uu but I really need to do the extraction from B. > And I am missing something since I have to play with different numberings when switching the IS from A to B. > > Is it clear enough ???? That?s cristal clear. If the fields are interlaced, that?s actually easier to do, because you preserve the distribution, and there is less data movement. I?ll try to fix your code in the case where the fields are interlaced if now one gives you another answer in the meantime. Thanks, Pierre > Regards, > Nicolas > -- > Nicolas Tardieu > Ing PhD Computational Mechanics > EDF - R&D Dpt ERMES > PARIS-SACLAY, FRANCE > De : pierre at joliv.et > Envoy? : mercredi 19 octobre 2022 14:51 > ? : TARDIEU Nicolas > Cc : petsc-users at mcs.anl.gov > Objet : Re: [petsc-users] Trouble with ISEmbed > > On two processes, you have a different distribution for u and u+p. > IS Object: 2 MPI processes > type: general > [0] Number of indices in set 5 > [0] 0 0 > [0] 1 1 > [0] 2 2 > [0] 3 3 > [0] 4 4 > [1] Number of indices in set 5 > [1] 0 5 > [1] 1 6 > [1] 2 7 > [1] 3 8 > [1] 4 9 > IS Object: 2 MPI processes > type: general > [0] Number of indices in set 8 > [0] 0 0 > [0] 1 1 > [0] 2 2 > [0] 3 3 > [0] 4 4 > [0] 5 5 > [0] 6 6 > [0] 7 7 > [1] Number of indices in set 7 > [1] 0 8 > [1] 1 9 > [1] 2 10 > [1] 3 11 > [1] 4 12 > [1] 5 13 > [1] 6 14 > ISEmbed() only works on local indices, so when you embed u into u+p, on the second process, you miss the row/column indices 5, 6, 7 of B = A(u+p, u+p). > Thus, you end up with a matrix of dimension size(u) - 3 = 10 - 3 = 7, with just the row/column indices 8 and 9 being selected by the second process. > What is it that you want to do exactly? Play with ISEmbed(), or get A(u, u) without using A but B instead? > > Thanks, > Pierre > > > On 19 Oct 2022, at 12:00 PM, TARDIEU Nicolas via petsc-users wrote: > > > > Dear PETSc Team, > > > > I am trying to use IS embeding in parallel. > > In order to (try to) understand how it works, I have built a simple example, attached to this email. > > > > I consider a 20X20 matrix. 
The dof (u, p, t) in global numbering are the following : > > u: 0..9 p: 10..14 t: 15..19 > > > > I have defined 4 IS to describe the dof u, p, t and the agglomeration of u and p, called up. > > I first extract the submatrix matrix(up,up), then I would like to extract from it the (u,u) block. > > > > The example runs OK in sequential but I do not obtain the (u,u) block on 2 processes. > > > > I have a mistake in the build of the sub-IS but I cannot find it for days. > > > > Best regards, > > Nicolas > > -- > > Nicolas Tardieu > > Ing PhD Computational Mechanics > > EDF - R&D Dpt ERMES > > PARIS-SACLAY, FRANCE > > > > Ce message et toutes les pi?ces jointes (ci-apr?s le 'Message') sont ?tablis ? l'intention exclusive des destinataires et les informations qui y figurent sont strictement confidentielles. Toute utilisation de ce Message non conforme ? sa destination, toute diffusion ou toute publication totale ou partielle, est interdite sauf autorisation expresse. > > Si vous n'?tes pas le destinataire de ce Message, il vous est interdit de le copier, de le faire suivre, de le divulguer ou d'en utiliser tout ou partie. Si vous avez re?u ce Message par erreur, merci de le supprimer de votre syst?me, ainsi que toutes ses copies, et de n'en garder aucune trace sur quelque support que ce soit. Nous vous remercions ?galement d'en avertir imm?diatement l'exp?diteur par retour du message. > > Il est impossible de garantir que les communications par messagerie ?lectronique arrivent en temps utile, sont s?curis?es ou d?nu?es de toute erreur ou virus. > > ____________________________________________________ > > This message and any attachments (the 'Message') are intended solely for the addressees. The information contained in this Message is confidential. Any use of information contained in this Message not in accord with its purpose, any dissemination or disclosure, either whole or partial, is prohibited except formal approval. > > If you are not the addressee, you may not copy, forward, disclose or use any part of it. If you have received this message in error, please delete it and all copies from your system and notify the sender immediately by return message. > > E-mail communication cannot be guaranteed to be timely secure, error or virus-free. > > > > > Ce message et toutes les pi?ces jointes (ci-apr?s le 'Message') sont ?tablis ? l'intention exclusive des destinataires et les informations qui y figurent sont strictement confidentielles. Toute utilisation de ce Message non conforme ? sa destination, toute diffusion ou toute publication totale ou partielle, est interdite sauf autorisation expresse. > Si vous n'?tes pas le destinataire de ce Message, il vous est interdit de le copier, de le faire suivre, de le divulguer ou d'en utiliser tout ou partie. Si vous avez re?u ce Message par erreur, merci de le supprimer de votre syst?me, ainsi que toutes ses copies, et de n'en garder aucune trace sur quelque support que ce soit. Nous vous remercions ?galement d'en avertir imm?diatement l'exp?diteur par retour du message. > Il est impossible de garantir que les communications par messagerie ?lectronique arrivent en temps utile, sont s?curis?es ou d?nu?es de toute erreur ou virus. > ____________________________________________________ > This message and any attachments (the 'Message') are intended solely for the addressees. The information contained in this Message is confidential. 
Any use of information contained in this Message not in accord with its purpose, any dissemination or disclosure, either whole or partial, is prohibited except formal approval. > If you are not the addressee, you may not copy, forward, disclose or use any part of it. If you have received this message in error, please delete it and all copies from your system and notify the sender immediately by return message. > E-mail communication cannot be guaranteed to be timely secure, error or virus-free. -------------- next part -------------- An HTML attachment was scrubbed... URL: From chenlonglong0099 at 163.com Wed Oct 19 10:30:54 2022 From: chenlonglong0099 at 163.com (Jackie Chan) Date: Wed, 19 Oct 2022 23:30:54 +0800 (CST) Subject: [petsc-users] Some questions about KSP type and PC type selection Message-ID: <2bcde0e5.691c.183f0de7611.Coremail.chenlonglong0099@163.com> Dear All? I hope you're having a nice day. In finite element problems, the stiffness matrix and load vector are constructed to calculate the displacement vector using DMCreateMatrix and DMCreateGlobalVector, respectively. For some reason, I need to make sure that the displacements of nodes on the opposite edges of two-dimensional structured grid domain satisfy relative displacement condition. For example, the displacements of two points with natural coordinates (x,Ymin) and (x,Ymax), i.e. the points are located on the upper and lower edges of 2D grid domain and have the same x-coordinates, are equal respectively. To achieve this, I need to add particularly big numbers to specific entries in stiffness matrix. In this way, the positions of big numbers are usually far away from each other and belong to different processes. I have tried many KSP types and PC settings to solve displacement vector. However, the final results are mostly incorrect. The best solver type I have tried is cg, but it still has some problems like excessive time consuming and convergence steps. So, for this problem, what kind of KSP type and PC type are suitable? Or is there a way to speed up the calculation process? Thanks, Jackie Chan -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Wed Oct 19 12:19:20 2022 From: knepley at gmail.com (Matthew Knepley) Date: Wed, 19 Oct 2022 13:19:20 -0400 Subject: [petsc-users] Some questions about KSP type and PC type selection In-Reply-To: <2bcde0e5.691c.183f0de7611.Coremail.chenlonglong0099@163.com> References: <2bcde0e5.691c.183f0de7611.Coremail.chenlonglong0099@163.com> Message-ID: On Wed, Oct 19, 2022 at 1:04 PM Jackie Chan wrote: > Dear All? > > I hope you're having a nice day. > In finite element problems, the stiffness matrix and load vector are > constructed to calculate the displacement vector using DMCreateMatrix and > DMCreateGlobalVector, respectively. For some reason, I need to make sure > that the displacements of nodes on the opposite edges of two-dimensional > structured grid domain satisfy relative displacement condition. For > example, the displacements of two points with natural coordinates (x,Ymin) > and (x,Ymax), i.e. the points are located on the upper and lower edges > of 2D grid domain and have the same x-coordinates, are equal respectively. > To achieve this, I need to add particularly big numbers to specific entries > in stiffness matrix. In this way, the positions of big numbers are usually > far away from each other and belong to different processes. I have tried > many KSP types and PC settings to solve displacement vector. 
However, the > final results are mostly incorrect. The best solver type I have tried is > cg, but it still has some problems like excessive time consuming and > convergence steps. So, for this problem, what kind of KSP type and > PC type are suitable? Or is there a way to speed up the calculation process? > Do you really just want a periodic domain? It seems like you should just make those identified nodes the same. Thanks, Matt > Thanks, > Jackie Chan > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From pierre at joliv.et Wed Oct 19 15:22:28 2022 From: pierre at joliv.et (Pierre Jolivet) Date: Wed, 19 Oct 2022 22:22:28 +0200 Subject: [petsc-users] Trouble with ISEmbed In-Reply-To: References: Message-ID: Sorry, I?m not very proficient in petsc4py, and there are a bunch of interfaces missing, e.g., ISShift(), so it may not be optimal, but I hope you?ll understand. First, you?ll need to regenerate the .bin by uncommenting the proper part of the code. That is because you were initially generating a 20x20 matrix, with 4 fields per unknown. That?s 5 unknowns, and so, with two processes, 10 rows per process is not consistent as 10/4 is not an integer ? I don?t know how to force, in petsc4py, the local size to 12 on process #0 and 8 on process #1. The modified code generates a 16x16 matrices so it remains consistent. If you then run the first part of the program, you?ll get both B_uu and B_pp from B instead of A, with one, two, or four processes. Again, that should work for arbitrary number of processes, you just need to be careful that your local dimensions are consistent with the number of fields. Thanks, Pierre > On 19 Oct 2022, at 5:01 PM, Pierre Jolivet wrote: > > > >> On 19 Oct 2022, at 4:32 PM, TARDIEU Nicolas > wrote: >> >> Dear Pierre, >> >> Thank you very much for your answer. I have the same explanation as you for the code I sent. >> But what I would like to do is the following : I have the full matrix A with fields u, p and t (which are interlaced in the real application). I want to extract B=A(u+p, u+p). *Then* I would like to extract the (u, u) block from B - let us call it B_uu. >> In fact, B_uu=A_uu but I really need to do the extraction from B. >> And I am missing something since I have to play with different numberings when switching the IS from A to B. >> >> Is it clear enough ???? > > That?s cristal clear. > If the fields are interlaced, that?s actually easier to do, because you preserve the distribution, and there is less data movement. > I?ll try to fix your code in the case where the fields are interlaced if now one gives you another answer in the meantime. > > Thanks, > Pierre > >> Regards, >> Nicolas >> -- >> Nicolas Tardieu >> Ing PhD Computational Mechanics >> EDF - R&D Dpt ERMES >> PARIS-SACLAY, FRANCE >> De : pierre at joliv.et > >> Envoy? : mercredi 19 octobre 2022 14:51 >> ? : TARDIEU Nicolas > >> Cc : petsc-users at mcs.anl.gov > >> Objet : Re: [petsc-users] Trouble with ISEmbed >> >> On two processes, you have a different distribution for u and u+p. 
>> IS Object: 2 MPI processes >> type: general >> [0] Number of indices in set 5 >> [0] 0 0 >> [0] 1 1 >> [0] 2 2 >> [0] 3 3 >> [0] 4 4 >> [1] Number of indices in set 5 >> [1] 0 5 >> [1] 1 6 >> [1] 2 7 >> [1] 3 8 >> [1] 4 9 >> IS Object: 2 MPI processes >> type: general >> [0] Number of indices in set 8 >> [0] 0 0 >> [0] 1 1 >> [0] 2 2 >> [0] 3 3 >> [0] 4 4 >> [0] 5 5 >> [0] 6 6 >> [0] 7 7 >> [1] Number of indices in set 7 >> [1] 0 8 >> [1] 1 9 >> [1] 2 10 >> [1] 3 11 >> [1] 4 12 >> [1] 5 13 >> [1] 6 14 >> ISEmbed() only works on local indices, so when you embed u into u+p, on the second process, you miss the row/column indices 5, 6, 7 of B = A(u+p, u+p). >> Thus, you end up with a matrix of dimension size(u) - 3 = 10 - 3 = 7, with just the row/column indices 8 and 9 being selected by the second process. >> What is it that you want to do exactly? Play with ISEmbed(), or get A(u, u) without using A but B instead? >> >> Thanks, >> Pierre >> >> > On 19 Oct 2022, at 12:00 PM, TARDIEU Nicolas via petsc-users > wrote: >> > >> > Dear PETSc Team, >> > >> > I am trying to use IS embeding in parallel. >> > In order to (try to) understand how it works, I have built a simple example, attached to this email. >> > >> > I consider a 20X20 matrix. The dof (u, p, t) in global numbering are the following : >> > u: 0..9 p: 10..14 t: 15..19 >> > >> > I have defined 4 IS to describe the dof u, p, t and the agglomeration of u and p, called up. >> > I first extract the submatrix matrix(up,up), then I would like to extract from it the (u,u) block. >> > >> > The example runs OK in sequential but I do not obtain the (u,u) block on 2 processes. >> > >> > I have a mistake in the build of the sub-IS but I cannot find it for days. >> > >> > Best regards, >> > Nicolas >> > -- >> > Nicolas Tardieu >> > Ing PhD Computational Mechanics >> > EDF - R&D Dpt ERMES >> > PARIS-SACLAY, FRANCE >> > >> > Ce message et toutes les pi?ces jointes (ci-apr?s le 'Message') sont ?tablis ? l'intention exclusive des destinataires et les informations qui y figurent sont strictement confidentielles. Toute utilisation de ce Message non conforme ? sa destination, toute diffusion ou toute publication totale ou partielle, est interdite sauf autorisation expresse. >> > Si vous n'?tes pas le destinataire de ce Message, il vous est interdit de le copier, de le faire suivre, de le divulguer ou d'en utiliser tout ou partie. Si vous avez re?u ce Message par erreur, merci de le supprimer de votre syst?me, ainsi que toutes ses copies, et de n'en garder aucune trace sur quelque support que ce soit. Nous vous remercions ?galement d'en avertir imm?diatement l'exp?diteur par retour du message. >> > Il est impossible de garantir que les communications par messagerie ?lectronique arrivent en temps utile, sont s?curis?es ou d?nu?es de toute erreur ou virus. >> > ____________________________________________________ >> > This message and any attachments (the 'Message') are intended solely for the addressees. The information contained in this Message is confidential. Any use of information contained in this Message not in accord with its purpose, any dissemination or disclosure, either whole or partial, is prohibited except formal approval. >> > If you are not the addressee, you may not copy, forward, disclose or use any part of it. If you have received this message in error, please delete it and all copies from your system and notify the sender immediately by return message. >> > E-mail communication cannot be guaranteed to be timely secure, error or virus-free. 
-------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: CPR.py Type: text/x-python-script Size: 2124 bytes Desc: not available URL: -------------- next part -------------- An HTML attachment was scrubbed... URL: From yuanxi at advancesoft.jp Wed Oct 19 19:50:46 2022 From: yuanxi at advancesoft.jp (=?UTF-8?B?6KKB54WV?=) Date: Thu, 20 Oct 2022 09:50:46 +0900 Subject: [petsc-users] how to reuse Mumps factorization In-Reply-To: <92F8CDFD-572A-4889-9F06-07EBE429765A@petsc.dev> References: <92F8CDFD-572A-4889-9F06-07EBE429765A@petsc.dev> Message-ID: Got it. Thanks for your detailed explanation. YUAN 2022年10月19日(水) 23:58 Barry Smith : > > Every time a matrix entry gets changed, PETSc tracks these changes, so > for the next KSP solve it by default repeats the numerical factorization if > the matrix has changed. Otherwise it reuses the still current > factorization. > > If you are calling KSP directly, you can call > KSPSetReusePreconditioner() to prevent KSP from automatically performing a > new factorization, so it will use the out-of-date preconditioner, but if you > use a KSPType of, for example, KSPGMRES, it will still solve the linear > system correctly, just taking some iterations. Reusing the preconditioner > can be faster if the matrix does not change too much, since a numerical > factorization takes a lot of time. > > If you use SNES you can control "lagging" the preconditioner > with SNESSetLagPreconditioner(). > > Barry > > > On Oct 19, 2022, at 9:15 AM, Matthew Knepley wrote: > > On Wed, Oct 19, 2022 at 9:13 AM 袁煕 wrote: > >> Hello, >> >> I am using Mumps to solve a problem with multiple time steps. The matrix >> structure does not change but its value may or may not change during >> those steps. 
That means I should reuse the symbolic factorization but >> recall numeric factorization when needed. >> >> I have found the following answer to a similar question >> https://lists.mcs.anl.gov/pipermail/petsc-users/2020-August/041846.html >> >> which says "it automatically uses the same factorization", but I don't >> know if it includes numerical factorization also. >> >> My question is: >> 1. Is numeric factorization done automatically? If not >> > > Yes. > > Thanks, > > Matt > > >> 2. Could I control when numeric factorization should be done and how to >> do it? >> >> Much thanks >> >> YUAN >> > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > > https://www.cse.buffalo.edu/~knepley/ > > > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From nicolas.tardieu at edf.fr Thu Oct 20 04:45:01 2022 From: nicolas.tardieu at edf.fr (TARDIEU Nicolas) Date: Thu, 20 Oct 2022 09:45:01 +0000 Subject: [petsc-users] Trouble with ISEmbed In-Reply-To: References: Message-ID: Dear Pierre, You fixed the problem! Thank you warmly for your precious help. Best regards, Nicolas -- Nicolas Tardieu Ing PhD Computational Mechanics EDF - R&D Dpt ERMES PARIS-SACLAY, FRANCE ________________________________ De : pierre at joliv.et Envoyé : mercredi 19 octobre 2022 22:22 À : TARDIEU Nicolas Cc : petsc-users at mcs.anl.gov Objet : Re: [petsc-users] Trouble with ISEmbed Sorry, I'm not very proficient in petsc4py, and there are a bunch of interfaces missing, e.g., ISShift(), so it may not be optimal, but I hope you'll understand. First, you'll need to regenerate the .bin by uncommenting the proper part of the code. That is because you were initially generating a 20x20 matrix, with 4 fields per unknown. That's 5 unknowns, and so, with two processes, 10 rows per process is not consistent as 10/4 is not an integer... I don't know how to force, in petsc4py, the local size to 12 on process #0 and 8 on process #1. The modified code generates a 16x16 matrix so it remains consistent. If you then run the first part of the program, you'll get both B_uu and B_pp from B instead of A, with one, two, or four processes. Again, that should work for an arbitrary number of processes, you just need to be careful that your local dimensions are consistent with the number of fields. Thanks, Pierre On 19 Oct 2022, at 5:01 PM, Pierre Jolivet > wrote: On 19 Oct 2022, at 4:32 PM, TARDIEU Nicolas > wrote: Dear Pierre, Thank you very much for your answer. I have the same explanation as you for the code I sent. But what I would like to do is the following: I have the full matrix A with fields u, p and t (which are interlaced in the real application). I want to extract B=A(u+p, u+p). *Then* I would like to extract the (u, u) block from B - let us call it B_uu. In fact, B_uu=A_uu but I really need to do the extraction from B. And I am missing something since I have to play with different numberings when switching the IS from A to B. Is it clear enough? That's crystal clear. If the fields are interlaced, that's actually easier to do, because you preserve the distribution, and there is less data movement. I'll try to fix your code in the case where the fields are interlaced if no one gives you another answer in the meantime. 
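For the factorization-reuse question above, here is a minimal sketch in C of the flow Barry describes, assuming the matrix A, the vectors b and x, and the step count nsteps already exist elsewhere (these names are placeholders, not taken from the thread):

#include <petscksp.h>

/* Direct solve with MUMPS over many time steps: the symbolic factorization
   happens once, and KSPSolve() redoes the numeric factorization by default
   whenever the values of A have changed since the previous solve.          */
static PetscErrorCode SolveSteps(Mat A, Vec b, Vec x, PetscInt nsteps)
{
  KSP ksp;
  PC  pc;

  PetscFunctionBeginUser;
  PetscCall(KSPCreate(PETSC_COMM_WORLD, &ksp));
  PetscCall(KSPSetOperators(ksp, A, A));
  PetscCall(KSPSetType(ksp, KSPPREONLY));        /* pure direct solve */
  PetscCall(KSPGetPC(ksp, &pc));
  PetscCall(PCSetType(pc, PCLU));
  PetscCall(PCFactorSetMatSolverType(pc, MATSOLVERMUMPS));
  PetscCall(KSPSetFromOptions(ksp));
  for (PetscInt step = 0; step < nsteps; ++step) {
    /* ... update the values of A and b here (same nonzero structure) ... */
    PetscCall(KSPSolve(ksp, b, x));
  }
  /* To freeze a (possibly out-of-date) factorization instead, as Barry
     suggests, pair KSPSetReusePreconditioner(ksp, PETSC_TRUE) with an
     iterative KSPType such as KSPGMRES so the solve stays correct.         */
  PetscCall(KSPDestroy(&ksp));
  PetscFunctionReturn(0);
}

With SNES (or TS on top of it), the corresponding control is SNESSetLagPreconditioner(), as Barry mentions above.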
Thanks, Pierre Regards, Nicolas -- Nicolas Tardieu Ing PhD Computational Mechanics EDF - R&D Dpt ERMES PARIS-SACLAY, FRANCE ________________________________ De : pierre at joliv.et > Envoy? : mercredi 19 octobre 2022 14:51 ? : TARDIEU Nicolas > Cc : petsc-users at mcs.anl.gov > Objet : Re: [petsc-users] Trouble with ISEmbed On two processes, you have a different distribution for u and u+p. IS Object: 2 MPI processes type: general [0] Number of indices in set 5 [0] 0 0 [0] 1 1 [0] 2 2 [0] 3 3 [0] 4 4 [1] Number of indices in set 5 [1] 0 5 [1] 1 6 [1] 2 7 [1] 3 8 [1] 4 9 IS Object: 2 MPI processes type: general [0] Number of indices in set 8 [0] 0 0 [0] 1 1 [0] 2 2 [0] 3 3 [0] 4 4 [0] 5 5 [0] 6 6 [0] 7 7 [1] Number of indices in set 7 [1] 0 8 [1] 1 9 [1] 2 10 [1] 3 11 [1] 4 12 [1] 5 13 [1] 6 14 ISEmbed() only works on local indices, so when you embed u into u+p, on the second process, you miss the row/column indices 5, 6, 7 of B = A(u+p, u+p). Thus, you end up with a matrix of dimension size(u) - 3 = 10 - 3 = 7, with just the row/column indices 8 and 9 being selected by the second process. What is it that you want to do exactly? Play with ISEmbed(), or get A(u, u) without using A but B instead? Thanks, Pierre > On 19 Oct 2022, at 12:00 PM, TARDIEU Nicolas via petsc-users > wrote: > > Dear PETSc Team, > > I am trying to use IS embeding in parallel. > In order to (try to) understand how it works, I have built a simple example, attached to this email. > > I consider a 20X20 matrix. The dof (u, p, t) in global numbering are the following : > u: 0..9 p: 10..14 t: 15..19 > > I have defined 4 IS to describe the dof u, p, t and the agglomeration of u and p, called up. > I first extract the submatrix matrix(up,up), then I would like to extract from it the (u,u) block. > > The example runs OK in sequential but I do not obtain the (u,u) block on 2 processes. > > I have a mistake in the build of the sub-IS but I cannot find it for days. > > Best regards, > Nicolas > -- > Nicolas Tardieu > Ing PhD Computational Mechanics > EDF - R&D Dpt ERMES > PARIS-SACLAY, FRANCE > > Ce message et toutes les pi?ces jointes (ci-apr?s le 'Message') sont ?tablis ? l'intention exclusive des destinataires et les informations qui y figurent sont strictement confidentielles. Toute utilisation de ce Message non conforme ? sa destination, toute diffusion ou toute publication totale ou partielle, est interdite sauf autorisation expresse. > Si vous n'?tes pas le destinataire de ce Message, il vous est interdit de le copier, de le faire suivre, de le divulguer ou d'en utiliser tout ou partie. Si vous avez re?u ce Message par erreur, merci de le supprimer de votre syst?me, ainsi que toutes ses copies, et de n'en garder aucune trace sur quelque support que ce soit. Nous vous remercions ?galement d'en avertir imm?diatement l'exp?diteur par retour du message. > Il est impossible de garantir que les communications par messagerie ?lectronique arrivent en temps utile, sont s?curis?es ou d?nu?es de toute erreur ou virus. > ____________________________________________________ > This message and any attachments (the 'Message') are intended solely for the addressees. The information contained in this Message is confidential. Any use of information contained in this Message not in accord with its purpose, any dissemination or disclosure, either whole or partial, is prohibited except formal approval. > If you are not the addressee, you may not copy, forward, disclose or use any part of it. 
-------------- next part -------------- An HTML attachment was scrubbed... URL: From yc17470 at connect.um.edu.mo Thu Oct 20 08:42:32 2022 From: yc17470 at connect.um.edu.mo (Gong Yujie) Date: Thu, 20 Oct 2022 13:42:32 +0000 Subject: [petsc-users] DMPlex adding boundary without PETSc discretization Message-ID: Dear development team, I'm trying to write a code to solve a partial differential equation. I didn't use PETSc's discretization. I have a question about the implementation of the boundary condition (Dirichlet boundary condition). When implementing the Dirichlet boundary condition. I need to set the corresponding rows in the Jacobian matrix diagonal 1 and others 0. I created the matrix use DMCreateMatrix. Can I add the Dirichlet boundary condition to the DMPlex mesh so that the Jacobian matrix won't contain these Dirichlet boundary rows? If yes, is there a function that can set the Dirichlet boundary points' value? Best Regards, Jerry -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Thu Oct 20 08:47:47 2022 From: knepley at gmail.com (Matthew Knepley) Date: Thu, 20 Oct 2022 09:47:47 -0400 Subject: [petsc-users] DMPlex adding boundary without PETSc discretization In-Reply-To: References: Message-ID: On Thu, Oct 20, 2022 at 9:42 AM Gong Yujie wrote: > Dear development team, > > I'm trying to write a code to solve a partial differential equation. I > didn't use PETSc's discretization. I have a question about the > implementation of the boundary condition (Dirichlet boundary condition). > When implementing the Dirichlet boundary condition. I need to set the > corresponding rows in the Jacobian matrix diagonal 1 and others 0. I > created the matrix use DMCreateMatrix. > For this, you can use https://petsc.org/main/docs/manualpages/Mat/MatZeroRows/ https://petsc.org/main/docs/manualpages/Mat/MatZeroRowsColumns/ > Can I add the Dirichlet boundary condition to the DMPlex mesh so that the > Jacobian matrix won't contain these Dirichlet boundary rows? If yes, is > there a function that can set the Dirichlet boundary points' value? > If you are not using the PETSc discretization, that means you gave a PetscSection to the DMPlex so that it could make the Mat. You can add constraints to that Section in order to eliminate some unknowns. Thanks, Matt > Best Regards, > Jerry > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From kavousi at mines.edu Thu Oct 20 17:45:37 2022 From: kavousi at mines.edu (Sepideh Kavousi) Date: Thu, 20 Oct 2022 22:45:37 +0000 Subject: [petsc-users] Periodic boundary condition Message-ID: Hello, I want to solve my 5 PDEs based on finite difference method using periodic BC in x-direction and non-periodic in y-direction but I run into error (Segmentation Violation, probably memory access out of range). For this, I discretize my equation in FormFunction function. My PDE discretization in (i,j) node needs data on (i+1,j), (i+2,j), (i-1,j), (i-2,j), (i,j+1), (i,j+2), (i,j-1), (i,j-2) points. In my previous codes that the x-direction was non-periodic (no flux) boundary condition, I: i) implemented the no flux BC for i=0 and i=Nx-1, ii) set i+2= Nx-1 in discretizing (Nx-2,j) and i+2= 0 in discretizing (1,j) iii) discretized my equation for i=1..Nx-2. 
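For the Dirichlet-boundary-condition question above (DMPlex without a PETSc discretization), a minimal sketch in C of the MatZeroRowsColumns() route Matt points to, assuming the global indices of the constrained rows and a vector holding the boundary values are already at hand (the names J, b, x, nbc and bcRows are placeholders, not from the original message):

#include <petscmat.h>

/* Zero the listed rows and columns of the assembled Jacobian J, put 1.0 on
   their diagonal, and adjust the right-hand side b using the boundary
   values stored in x at those rows.                                        */
static PetscErrorCode ApplyDirichlet(Mat J, Vec b, Vec x, PetscInt nbc, const PetscInt bcRows[])
{
  PetscFunctionBeginUser;
  PetscCall(MatZeroRowsColumns(J, nbc, bcRows, 1.0, x, b));
  /* MatZeroRows() does the same without touching the columns, which is
     cheaper but destroys symmetry of the matrix.                           */
  PetscFunctionReturn(0);
}

The alternative Matt mentions is to mark those dofs as constrained in the PetscSection (e.g. with PetscSectionSetConstraintDof() before DMSetLocalSection()), so the Dirichlet rows never appear in the matrix that DMCreateMatrix() builds.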
I am not sure how I should do the periodic BC. From the following discussions (https://lists.mcs.anl.gov/pipermail/petsc-users/2012-May/013476.html and https://lists.mcs.anl.gov/pipermail/petsc-users/2016-May/029273.html), I guess I should not do step (i) (stated above) for the x-boundaries and just do step (iii) for i=0..Nx-1. If I just focus on solving 2 of the PDEs which does need data on (i+2,j), (i-2,j), (i,j+2), (i,j-2) points for discretizing equation in (i,j) node, I still run into error: Running with Valgrind (just 1 processor) gave the following file. I did not find any information which gives me hint on the error source. Can you please help me to find the error? Best, Sepideh ==236074== Conditional jump or move depends on uninitialised value(s) ==236074== at 0x4C29E39: malloc (vg_replace_malloc.c:309) ==236074== by 0x1B79E59B: MPID_Init (mpid_init.c:1649) ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) ==236074== by 0x40219D: main (one.c:335) ==236074== ==236074== Conditional jump or move depends on uninitialised value(s) ==236074== at 0x2183B805: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218323C1: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218341C7: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x400F9C2: _dl_init (in /usr/lib64/ld-2.17.so) ==236074== by 0x401459D: dl_open_worker (in /usr/lib64/ld-2.17.so) ==236074== by 0x400F7D3: _dl_catch_error (in /usr/lib64/ld-2.17.so) ==236074== by 0x4013B8A: _dl_open (in /usr/lib64/ld-2.17.so) ==236074== by 0x1AEA4FAA: dlopen_doit (in /usr/lib64/libdl-2.17.so) ==236074== by 0x400F7D3: _dl_catch_error (in /usr/lib64/ld-2.17.so) ==236074== by 0x1AEA55AC: _dlerror_run (in /usr/lib64/libdl-2.17.so) ==236074== by 0x1AEA5040: dlopen@@GLIBC_2.2.5 (in /usr/lib64/libdl-2.17.so) ==236074== by 0x1B8198DC: MPID_nem_ofi_init (ofi_init.c:158) ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) ==236074== by 0x40219D: main (one.c:335) ==236074== ==236074== Conditional jump or move depends on uninitialised value(s) ==236074== at 0x2183B810: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218323C1: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218341C7: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x400F9C2: _dl_init (in /usr/lib64/ld-2.17.so) ==236074== by 0x401459D: dl_open_worker (in /usr/lib64/ld-2.17.so) ==236074== by 0x400F7D3: _dl_catch_error (in /usr/lib64/ld-2.17.so) ==236074== by 0x4013B8A: _dl_open (in /usr/lib64/ld-2.17.so) ==236074== by 0x1AEA4FAA: dlopen_doit (in /usr/lib64/libdl-2.17.so) ==236074== by 0x400F7D3: _dl_catch_error (in /usr/lib64/ld-2.17.so) ==236074== by 0x1AEA55AC: _dlerror_run (in /usr/lib64/libdl-2.17.so) ==236074== by 0x1AEA5040: dlopen@@GLIBC_2.2.5 (in /usr/lib64/libdl-2.17.so) ==236074== by 0x1B8198DC: MPID_nem_ofi_init (ofi_init.c:158) ==236074== by 0x1B7B5C55: ??? 
(mpid_nem_init.c:231) ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) ==236074== by 0x40219D: main (one.c:335) ==236074== ==236074== Conditional jump or move depends on uninitialised value(s) ==236074== at 0x218323C8: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218341C7: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x400F9C2: _dl_init (in /usr/lib64/ld-2.17.so) ==236074== by 0x401459D: dl_open_worker (in /usr/lib64/ld-2.17.so) ==236074== by 0x400F7D3: _dl_catch_error (in /usr/lib64/ld-2.17.so) ==236074== by 0x4013B8A: _dl_open (in /usr/lib64/ld-2.17.so) ==236074== by 0x1AEA4FAA: dlopen_doit (in /usr/lib64/libdl-2.17.so) ==236074== by 0x400F7D3: _dl_catch_error (in /usr/lib64/ld-2.17.so) ==236074== by 0x1AEA55AC: _dlerror_run (in /usr/lib64/libdl-2.17.so) ==236074== by 0x1AEA5040: dlopen@@GLIBC_2.2.5 (in /usr/lib64/libdl-2.17.so) ==236074== by 0x1B8198DC: MPID_nem_ofi_init (ofi_init.c:158) ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) ==236074== by 0x40219D: main (one.c:335) ==236074== ==236074== Use of uninitialised value of size 8 ==236074== at 0x218323CF: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218341C7: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x400F9C2: _dl_init (in /usr/lib64/ld-2.17.so) ==236074== by 0x401459D: dl_open_worker (in /usr/lib64/ld-2.17.so) ==236074== by 0x400F7D3: _dl_catch_error (in /usr/lib64/ld-2.17.so) ==236074== by 0x4013B8A: _dl_open (in /usr/lib64/ld-2.17.so) ==236074== by 0x1AEA4FAA: dlopen_doit (in /usr/lib64/libdl-2.17.so) ==236074== by 0x400F7D3: _dl_catch_error (in /usr/lib64/ld-2.17.so) ==236074== by 0x1AEA55AC: _dlerror_run (in /usr/lib64/libdl-2.17.so) ==236074== by 0x1AEA5040: dlopen@@GLIBC_2.2.5 (in /usr/lib64/libdl-2.17.so) ==236074== by 0x1B8198DC: MPID_nem_ofi_init (ofi_init.c:158) ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) ==236074== by 0x40219D: main (one.c:335) ==236074== ==236074== Use of uninitialised value of size 8 ==236074== at 0x218323E5: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218341C7: ??? 
(in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x400F9C2: _dl_init (in /usr/lib64/ld-2.17.so) ==236074== by 0x401459D: dl_open_worker (in /usr/lib64/ld-2.17.so) ==236074== by 0x400F7D3: _dl_catch_error (in /usr/lib64/ld-2.17.so) ==236074== by 0x4013B8A: _dl_open (in /usr/lib64/ld-2.17.so) ==236074== by 0x1AEA4FAA: dlopen_doit (in /usr/lib64/libdl-2.17.so) ==236074== by 0x400F7D3: _dl_catch_error (in /usr/lib64/ld-2.17.so) ==236074== by 0x1AEA55AC: _dlerror_run (in /usr/lib64/libdl-2.17.so) ==236074== by 0x1AEA5040: dlopen@@GLIBC_2.2.5 (in /usr/lib64/libdl-2.17.so) ==236074== by 0x1B8198DC: MPID_nem_ofi_init (ofi_init.c:158) ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) ==236074== by 0x40219D: main (one.c:335) ==236074== ==236074== Conditional jump or move depends on uninitialised value(s) ==236074== at 0x2183B805: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218323C1: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218343EC: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x2180E4F3: psm2_init (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x2112D7E6: psmx2_getinfo (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x210AC753: fi_getinfo@@FABRIC_1.2 (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x1B819AB7: MPID_nem_ofi_init (ofi_init.c:245) ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) ==236074== by 0x40219D: main (one.c:335) ==236074== ==236074== Conditional jump or move depends on uninitialised value(s) ==236074== at 0x2183B810: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218323C1: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218343EC: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x2180E4F3: psm2_init (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x2112D7E6: psmx2_getinfo (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x210AC753: fi_getinfo@@FABRIC_1.2 (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x1B819AB7: MPID_nem_ofi_init (ofi_init.c:245) ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) ==236074== by 0x40219D: main (one.c:335) ==236074== ==236074== Conditional jump or move depends on uninitialised value(s) ==236074== at 0x218323C8: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218343EC: ??? 
(in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x2180E4F3: psm2_init (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x2112D7E6: psmx2_getinfo (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x210AC753: fi_getinfo@@FABRIC_1.2 (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x1B819AB7: MPID_nem_ofi_init (ofi_init.c:245) ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) ==236074== by 0x40219D: main (one.c:335) ==236074== ==236074== Use of uninitialised value of size 8 ==236074== at 0x218323CF: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218343EC: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x2180E4F3: psm2_init (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x2112D7E6: psmx2_getinfo (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x210AC753: fi_getinfo@@FABRIC_1.2 (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x1B819AB7: MPID_nem_ofi_init (ofi_init.c:245) ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) ==236074== by 0x40219D: main (one.c:335) ==236074== ==236074== Use of uninitialised value of size 8 ==236074== at 0x218323E5: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218343EC: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x2180E4F3: psm2_init (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x2112D7E6: psmx2_getinfo (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x210AC753: fi_getinfo@@FABRIC_1.2 (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x1B819AB7: MPID_nem_ofi_init (ofi_init.c:245) ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) ==236074== by 0x40219D: main (one.c:335) ==236074== ==236074== Conditional jump or move depends on uninitialised value(s) ==236074== at 0x2183B69A: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218371EC: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x21837077: hfi_get_port_lid (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x21836F9A: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x21834872: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x217F7F5D: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x217FDC29: ??? 
(in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x217FCA91: psm2_ep_open (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x2112FB19: psmx2_trx_ctxt_alloc (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x21138089: psmx2_ep_open (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x1B81A6A1: fi_endpoint (fi_endpoint.h:155) ==236074== by 0x1B81A6A1: MPID_nem_ofi_init (ofi_init.c:377) ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) ==236074== by 0x40219D: main (one.c:335) ==236074== ==236074== Conditional jump or move depends on uninitialised value(s) ==236074== at 0x2183B7B8: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218371EC: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x21837077: hfi_get_port_lid (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x21836F9A: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x21834872: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x217F7F5D: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x217FDC29: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x217FCA91: psm2_ep_open (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x2112FB19: psmx2_trx_ctxt_alloc (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x21138089: psmx2_ep_open (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x1B81A6A1: fi_endpoint (fi_endpoint.h:155) ==236074== by 0x1B81A6A1: MPID_nem_ofi_init (ofi_init.c:377) ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) ==236074== by 0x40219D: main (one.c:335) ==236074== ==236074== Conditional jump or move depends on uninitialised value(s) ==236074== at 0x2183B69A: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218371EC: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x21837077: hfi_get_port_lid (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x217F88C8: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x217FDC29: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x217FCA91: psm2_ep_open (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x2112FB19: psmx2_trx_ctxt_alloc (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x21138089: psmx2_ep_open (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x1B81A6A1: fi_endpoint (fi_endpoint.h:155) ==236074== by 0x1B81A6A1: MPID_nem_ofi_init (ofi_init.c:377) ==236074== by 0x1B7B5C55: ??? 
(mpid_nem_init.c:231) ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) ==236074== by 0x40219D: main (one.c:335) ==236074== ==236074== Conditional jump or move depends on uninitialised value(s) ==236074== at 0x2183B7B8: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218371EC: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x21837077: hfi_get_port_lid (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x217F88C8: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x217FDC29: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x217FCA91: psm2_ep_open (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x2112FB19: psmx2_trx_ctxt_alloc (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x21138089: psmx2_ep_open (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x1B81A6A1: fi_endpoint (fi_endpoint.h:155) ==236074== by 0x1B81A6A1: MPID_nem_ofi_init (ofi_init.c:377) ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) ==236074== by 0x40219D: main (one.c:335) ==236074== ==236074== Conditional jump or move depends on uninitialised value(s) ==236074== at 0x2183B69A: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218371EC: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x21837077: hfi_get_port_lid (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x217F916B: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x217FDC29: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x217FCA91: psm2_ep_open (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x2112FB19: psmx2_trx_ctxt_alloc (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x21138089: psmx2_ep_open (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x1B81A6A1: fi_endpoint (fi_endpoint.h:155) ==236074== by 0x1B81A6A1: MPID_nem_ofi_init (ofi_init.c:377) ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) ==236074== by 0x40219D: main (one.c:335) ==236074== ==236074== Conditional jump or move depends on uninitialised value(s) ==236074== at 0x2183B7B8: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218371EC: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x21837077: hfi_get_port_lid (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x217F916B: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x217FDC29: ??? 
(in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x217FCA91: psm2_ep_open (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x2112FB19: psmx2_trx_ctxt_alloc (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x21138089: psmx2_ep_open (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x1B81A6A1: fi_endpoint (fi_endpoint.h:155) ==236074== by 0x1B81A6A1: MPID_nem_ofi_init (ofi_init.c:377) ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) ==236074== by 0x40219D: main (one.c:335) ==236074== ==236074== Conditional jump or move depends on uninitialised value(s) ==236074== at 0x1B1DA260: __I_MPI___intel_sse2_strncmp (in /opt/intel/compilers_and_libraries_2018.2.199/linux/mpi/intel64/lib/libmpifort.so.12.0) ==236074== by 0x1B8CFBA1: ??? (simple_pmi.c:2376) ==236074== by 0x1B8CBDAD: PMIi_InitIfSingleton (simple_pmi.c:2883) ==236074== by 0x1B8CBDAD: iPMI_KVS_Get (simple_pmi.c:751) ==236074== by 0x1B7CCC1E: ??? (mpidi_pg.c:949) ==236074== by 0x1B817EAA: MPID_nem_ofi_post_init (ofi_init.c:1736) ==236074== by 0x1B7B3575: MPID_nem_init_post (mpid_nem_init.c:1421) ==236074== by 0x1B5806E3: MPIDI_CH3_Init (ch3_init.c:146) ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) ==236074== by 0x40219D: main (one.c:335) ==236074== ==236074== Conditional jump or move depends on uninitialised value(s) ==236074== at 0x1B1DA383: __I_MPI___intel_sse2_strncmp (in /opt/intel/compilers_and_libraries_2018.2.199/linux/mpi/intel64/lib/libmpifort.so.12.0) ==236074== by 0x1B8CFBA1: ??? (simple_pmi.c:2376) ==236074== by 0x1B8CBDAD: PMIi_InitIfSingleton (simple_pmi.c:2883) ==236074== by 0x1B8CBDAD: iPMI_KVS_Get (simple_pmi.c:751) ==236074== by 0x1B7CCC1E: ??? 
(mpidi_pg.c:949) ==236074== by 0x1B817EAA: MPID_nem_ofi_post_init (ofi_init.c:1736) ==236074== by 0x1B7B3575: MPID_nem_init_post (mpid_nem_init.c:1421) ==236074== by 0x1B5806E3: MPIDI_CH3_Init (ch3_init.c:146) ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) ==236074== by 0x40219D: main (one.c:335) ==236074== ==236074== Conditional jump or move depends on uninitialised value(s) ==236074== at 0x1E48032E: __intel_sse4_strcpy (in /opt/intel/compilers_and_libraries_2018.2.199/linux/compiler/lib/intel64_lin/libirc.so) ==236074== by 0x51FD8BE: PetscStrcpy (str.c:354) ==236074== by 0x51FD7A3: PetscStrallocpy (str.c:188) ==236074== by 0x52A39CE: PetscEventRegLogRegister (eventlog.c:313) ==236074== by 0x527D89A: PetscLogEventRegister (plog.c:693) ==236074== by 0x6A56A20: PCBDDCInitializePackage (bddc.c:3115) ==236074== by 0x6E1A515: PCInitializePackage (dlregisksp.c:59) ==236074== by 0x6DB1A86: PCCreate (precon.c:382) ==236074== by 0x6E05167: KSPGetPC (itfunc.c:1837) ==236074== by 0x6E0FC5C: KSPSetDM (iterativ.c:1150) ==236074== by 0x6FDD27B: SNESSetDM (snes.c:5402) ==236074== by 0x70B85F7: TSGetSNES (ts.c:2914) ==236074== by 0x70BE430: TSSetDM (ts.c:4949) ==236074== by 0x402496: main (one.c:378) ==236074== ==236074== Conditional jump or move depends on uninitialised value(s) ==236074== at 0x1E4782BA: __intel_ssse3_strncpy (in /opt/intel/compilers_and_libraries_2018.2.199/linux/compiler/lib/intel64_lin/libirc.so) ==236074== by 0x51FFD24: PetscStrncpy (str.c:392) ==236074== by 0x51FEB03: PetscStrreplace (str.c:1142) ==236074== by 0x52C9958: PetscViewerFileSetName (filev.c:659) ==236074== by 0x52B743B: PetscViewerVTKOpen (vtkv.c:279) ==236074== by 0x70C76E6: TSMonitorSolutionVTK (ts.c:5580) ==236074== by 0x40313C: FormFunction (one.c:120) ==236074== by 0x7066531: TSComputeIFunction_DMDA (dmdats.c:82) ==236074== by 0x70BA5EF: TSComputeIFunction (ts.c:857) ==236074== by 0x711E2DC: SNESTSFormFunction_BDF (bdf.c:368) ==236074== by 0x70C6E46: SNESTSFormFunction (ts.c:5014) ==236074== by 0x6FDC8A6: SNESComputeFunction (snes.c:2383) ==236074== by 0x7023556: SNESSolve_NEWTONTR (tr.c:297) ==236074== by 0x6FD160F: SNESSolve (snes.c:4569) ==236074== by 0x711917E: PetscMemcpy (bdf.c:223) ==236074== by 0x711917E: PetscCitationsRegister (petscsys.h:2689) ==236074== by 0x711917E: TSStep_BDF.A (bdf.c:265) ==236074== by 0x70C363A: TSStep (ts.c:3757) ==236074== by 0x70C1999: TSSolve (ts.c:4154) ==236074== by 0x402594: main (one.c:391) ==236074== ==236074== Conditional jump or move depends on uninitialised value(s) ==236074== at 0x1E4782BA: __intel_ssse3_strncpy (in /opt/intel/compilers_and_libraries_2018.2.199/linux/compiler/lib/intel64_lin/libirc.so) ==236074== by 0x51FFD24: PetscStrncpy (str.c:392) ==236074== by 0x51FEB03: PetscStrreplace (str.c:1142) ==236074== by 0x5224E4B: PetscFOpen (mpiuopen.c:52) ==236074== by 0x63A074B: DMDAVTKWriteAll_VTS.A (grvtk.c:72) ==236074== by 0x639A589: DMDAVTKWriteAll (grvtk.c:545) ==236074== by 0x52B66F3: PetscViewerFlush_VTK (vtkv.c:100) ==236074== by 0x52CFAAE: PetscViewerFlush (flush.c:26) ==236074== by 0x52CEA95: PetscViewerDestroy (view.c:113) ==236074== by 0x70C7717: TSMonitorSolutionVTK (ts.c:5582) ==236074== by 0x40313C: FormFunction (one.c:120) ==236074== by 0x7066531: TSComputeIFunction_DMDA (dmdats.c:82) ==236074== by 0x70BA5EF: TSComputeIFunction (ts.c:857) ==236074== by 0x711E2DC: 
SNESTSFormFunction_BDF (bdf.c:368) ==236074== by 0x70C6E46: SNESTSFormFunction (ts.c:5014) ==236074== by 0x6FDC8A6: SNESComputeFunction (snes.c:2383) ==236074== by 0x7023556: SNESSolve_NEWTONTR (tr.c:297) ==236074== by 0x6FD160F: SNESSolve (snes.c:4569) ==236074== by 0x711917E: PetscMemcpy (bdf.c:223) ==236074== by 0x711917E: PetscCitationsRegister (petscsys.h:2689) ==236074== by 0x711917E: TSStep_BDF.A (bdf.c:265) ==236074== by 0x70C363A: TSStep (ts.c:3757) ==236074== ==236074== Conditional jump or move depends on uninitialised value(s) ==236074== at 0x5F10977: MatFDColoringSetUpBlocked_AIJ_Private (fdaij.c:146) ==236074== by 0x5F10977: MatFDColoringSetUp_SeqXAIJ.A (fdaij.c:284) ==236074== by 0x585892A: MatFDColoringSetUp (fdmatrix.c:242) ==236074== by 0x6FE5037: SNESComputeJacobianDefaultColor (snesj2.c:79) ==236074== by 0x6FC8E4E: SNESComputeJacobian (snes.c:2717) ==236074== by 0x70236E7: SNESSolve_NEWTONTR (tr.c:324) ==236074== by 0x6FD160F: SNESSolve (snes.c:4569) ==236074== by 0x711917E: PetscMemcpy (bdf.c:223) ==236074== by 0x711917E: PetscCitationsRegister (petscsys.h:2689) ==236074== by 0x711917E: TSStep_BDF.A (bdf.c:265) ==236074== by 0x70C363A: TSStep (ts.c:3757) ==236074== by 0x70C1999: TSSolve (ts.c:4154) ==236074== by 0x402594: main (one.c:391) ==236074== ==236074== Invalid write of size 4 ==236074== at 0x5F10983: MatFDColoringSetUpBlocked_AIJ_Private (fdaij.c:150) ==236074== by 0x5F10983: MatFDColoringSetUp_SeqXAIJ.A (fdaij.c:284) ==236074== by 0x585892A: MatFDColoringSetUp (fdmatrix.c:242) ==236074== by 0x6FE5037: SNESComputeJacobianDefaultColor (snesj2.c:79) ==236074== by 0x6FC8E4E: SNESComputeJacobian (snes.c:2717) ==236074== by 0x70236E7: SNESSolve_NEWTONTR (tr.c:324) ==236074== by 0x6FD160F: SNESSolve (snes.c:4569) ==236074== by 0x711917E: PetscMemcpy (bdf.c:223) ==236074== by 0x711917E: PetscCitationsRegister (petscsys.h:2689) ==236074== by 0x711917E: TSStep_BDF.A (bdf.c:265) ==236074== by 0x70C363A: TSStep (ts.c:3757) ==236074== by 0x70C1999: TSSolve (ts.c:4154) ==236074== by 0x402594: main (one.c:391) ==236074== Address 0x3a94fa80 is 0 bytes after a block of size 73,960,000 alloc'd ==236074== at 0x4C2C480: memalign (vg_replace_malloc.c:909) ==236074== by 0x522FFE2: PetscMallocAlign (mal.c:52) ==236074== by 0x52305F9: PetscMallocA (mal.c:418) ==236074== by 0x5F10778: MatFDColoringSetUpBlocked_AIJ_Private (fdaij.c:125) ==236074== by 0x5F10778: MatFDColoringSetUp_SeqXAIJ.A (fdaij.c:284) ==236074== by 0x585892A: MatFDColoringSetUp (fdmatrix.c:242) ==236074== by 0x6FE5037: SNESComputeJacobianDefaultColor (snesj2.c:79) ==236074== by 0x6FC8E4E: SNESComputeJacobian (snes.c:2717) ==236074== by 0x70236E7: SNESSolve_NEWTONTR (tr.c:324) ==236074== by 0x6FD160F: SNESSolve (snes.c:4569) ==236074== by 0x711917E: PetscMemcpy (bdf.c:223) ==236074== by 0x711917E: PetscCitationsRegister (petscsys.h:2689) ==236074== by 0x711917E: TSStep_BDF.A (bdf.c:265) ==236074== by 0x70C363A: TSStep (ts.c:3757) ==236074== by 0x70C1999: TSSolve (ts.c:4154) ==236074== by 0x402594: main (one.c:391) ==236074== ==236074== Invalid write of size 8 ==236074== at 0x5F10991: MatFDColoringSetUpBlocked_AIJ_Private (fdaij.c:151) ==236074== by 0x5F10991: MatFDColoringSetUp_SeqXAIJ.A (fdaij.c:284) ==236074== by 0x585892A: MatFDColoringSetUp (fdmatrix.c:242) ==236074== by 0x6FE5037: SNESComputeJacobianDefaultColor (snesj2.c:79) ==236074== by 0x6FC8E4E: SNESComputeJacobian (snes.c:2717) ==236074== by 0x70236E7: SNESSolve_NEWTONTR (tr.c:324) ==236074== by 0x6FD160F: SNESSolve (snes.c:4569) ==236074== by 0x711917E: 
PetscMemcpy (bdf.c:223) ==236074== by 0x711917E: PetscCitationsRegister (petscsys.h:2689) ==236074== by 0x711917E: TSStep_BDF.A (bdf.c:265) ==236074== by 0x70C363A: TSStep (ts.c:3757) ==236074== by 0x70C1999: TSSolve (ts.c:4154) ==236074== by 0x402594: main (one.c:391) ==236074== Address 0x3a94fa88 is 8 bytes after a block of size 73,960,000 alloc'd ==236074== at 0x4C2C480: memalign (vg_replace_malloc.c:909) ==236074== by 0x522FFE2: PetscMallocAlign (mal.c:52) ==236074== by 0x52305F9: PetscMallocA (mal.c:418) ==236074== by 0x5F10778: MatFDColoringSetUpBlocked_AIJ_Private (fdaij.c:125) ==236074== by 0x5F10778: MatFDColoringSetUp_SeqXAIJ.A (fdaij.c:284) ==236074== by 0x585892A: MatFDColoringSetUp (fdmatrix.c:242) ==236074== by 0x6FE5037: SNESComputeJacobianDefaultColor (snesj2.c:79) ==236074== by 0x6FC8E4E: SNESComputeJacobian (snes.c:2717) ==236074== by 0x70236E7: SNESSolve_NEWTONTR (tr.c:324) ==236074== by 0x6FD160F: SNESSolve (snes.c:4569) ==236074== by 0x711917E: PetscMemcpy (bdf.c:223) ==236074== by 0x711917E: PetscCitationsRegister (petscsys.h:2689) ==236074== by 0x711917E: TSStep_BDF.A (bdf.c:265) ==236074== by 0x70C363A: TSStep (ts.c:3757) ==236074== by 0x70C1999: TSSolve (ts.c:4154) ==236074== by 0x402594: main (one.c:391) ==236074== Sent from Mail for Windows -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Thu Oct 20 18:39:18 2022 From: knepley at gmail.com (Matthew Knepley) Date: Thu, 20 Oct 2022 19:39:18 -0400 Subject: [petsc-users] Periodic boundary condition In-Reply-To: References: Message-ID: On Thu, Oct 20, 2022 at 6:48 PM Sepideh Kavousi wrote: > Hello, > > I want to solve my 5 PDEs based on finite difference method using > periodic BC in x-direction and non-periodic in y-direction but I run into > error (Segmentation Violation, probably memory access out of range). > > For this, I discretize my equation in FormFunction function. My PDE > discretization in (i,j) node needs data on (i+1,j), (i+2,j), (i-1,j), > (i-2,j), (i,j+1), (i,j+2), (i,j-1), (i,j-2) points. > > In my previous codes that the x-direction was non-periodic (no flux) > boundary condition, I: > > i) implemented the no flux BC for i=0 and i=Nx-1, > > ii) set i+2= Nx-1 in discretizing (Nx-2,j) and i+2= 0 > in discretizing (1,j) > > iii) discretized my equation for i=1..Nx-2. > > I am not sure how I should do the periodic BC. From the following > discussions ( > https://lists.mcs.anl.gov/pipermail/petsc-users/2012-May/013476.html and > https://lists.mcs.anl.gov/pipermail/petsc-users/2016-May/029273.html), I > guess I should not do step (i) (stated above) for the x-boundaries and just > do step (iii) for i=0..Nx-1. If I just focus on solving 2 of the PDEs which > does need data on (i+2,j), (i-2,j), (i,j+2), (i,j-2) points for > discretizing equation in (i,j) node, I still run into error: > > Running with Valgrind (just 1 processor) gave the following file. I did > not find any information which gives me hint on the error source. > > Can you please help me to find the error? > It sounds like you are accessing the array outside your declared stencil. Do you have anything we can run? 
Thanks, Matt > Best, > > Sepideh > > > > ==236074== Conditional jump or move depends on uninitialised value(s) > > ==236074== at 0x4C29E39: malloc (vg_replace_malloc.c:309) > > ==236074== by 0x1B79E59B: MPID_Init (mpid_init.c:1649) > > ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) > > ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) > > ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) > > ==236074== by 0x40219D: main (one.c:335) > > ==236074== > > ==236074== Conditional jump or move depends on uninitialised value(s) > > ==236074== at 0x2183B805: ??? (in /usr/lib64/libpsm2.so.2.2) > > ==236074== by 0x218323C1: ??? (in /usr/lib64/libpsm2.so.2.2) > > ==236074== by 0x218341C7: ??? (in /usr/lib64/libpsm2.so.2.2) > > ==236074== by 0x400F9C2: _dl_init (in /usr/lib64/ld-2.17.so) > > ==236074== by 0x401459D: dl_open_worker (in /usr/lib64/ld-2.17.so) > > ==236074== by 0x400F7D3: _dl_catch_error (in /usr/lib64/ld-2.17.so) > > ==236074== by 0x4013B8A: _dl_open (in /usr/lib64/ld-2.17.so) > > ==236074== by 0x1AEA4FAA: dlopen_doit (in /usr/lib64/libdl-2.17.so) > > ==236074== by 0x400F7D3: _dl_catch_error (in /usr/lib64/ld-2.17.so) > > ==236074== by 0x1AEA55AC: _dlerror_run (in /usr/lib64/libdl-2.17.so) > > ==236074== by 0x1AEA5040: dlopen@@GLIBC_2.2.5 (in /usr/lib64/ > libdl-2.17.so) > > ==236074== by 0x1B8198DC: MPID_nem_ofi_init (ofi_init.c:158) > > ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) > > ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) > > ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) > > ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) > > ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) > > ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) > > ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) > > ==236074== by 0x40219D: main (one.c:335) > > ==236074== > > ==236074== Conditional jump or move depends on uninitialised value(s) > > ==236074== at 0x2183B810: ??? (in /usr/lib64/libpsm2.so.2.2) > > ==236074== by 0x218323C1: ??? (in /usr/lib64/libpsm2.so.2.2) > > ==236074== by 0x218341C7: ??? (in /usr/lib64/libpsm2.so.2.2) > > ==236074== by 0x400F9C2: _dl_init (in /usr/lib64/ld-2.17.so) > > ==236074== by 0x401459D: dl_open_worker (in /usr/lib64/ld-2.17.so) > > ==236074== by 0x400F7D3: _dl_catch_error (in /usr/lib64/ld-2.17.so) > > ==236074== by 0x4013B8A: _dl_open (in /usr/lib64/ld-2.17.so) > > ==236074== by 0x1AEA4FAA: dlopen_doit (in /usr/lib64/libdl-2.17.so) > > ==236074== by 0x400F7D3: _dl_catch_error (in /usr/lib64/ld-2.17.so) > > ==236074== by 0x1AEA55AC: _dlerror_run (in /usr/lib64/libdl-2.17.so) > > ==236074== by 0x1AEA5040: dlopen@@GLIBC_2.2.5 (in /usr/lib64/ > libdl-2.17.so) > > ==236074== by 0x1B8198DC: MPID_nem_ofi_init (ofi_init.c:158) > > ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) > > ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) > > ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) > > ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) > > ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) > > ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) > > ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) > > ==236074== by 0x40219D: main (one.c:335) > > ==236074== > > ==236074== Conditional jump or move depends on uninitialised value(s) > > ==236074== at 0x218323C8: ??? (in /usr/lib64/libpsm2.so.2.2) > > ==236074== by 0x218341C7: ??? 
(in /usr/lib64/libpsm2.so.2.2) > > ==236074== by 0x400F9C2: _dl_init (in /usr/lib64/ld-2.17.so) > > ==236074== by 0x401459D: dl_open_worker (in /usr/lib64/ld-2.17.so) > > ==236074== by 0x400F7D3: _dl_catch_error (in /usr/lib64/ld-2.17.so) > > ==236074== by 0x4013B8A: _dl_open (in /usr/lib64/ld-2.17.so) > > ==236074== by 0x1AEA4FAA: dlopen_doit (in /usr/lib64/libdl-2.17.so) > > ==236074== by 0x400F7D3: _dl_catch_error (in /usr/lib64/ld-2.17.so) > > ==236074== by 0x1AEA55AC: _dlerror_run (in /usr/lib64/libdl-2.17.so) > > ==236074== by 0x1AEA5040: dlopen@@GLIBC_2.2.5 (in /usr/lib64/ > libdl-2.17.so) > > ==236074== by 0x1B8198DC: MPID_nem_ofi_init (ofi_init.c:158) > > ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) > > ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) > > ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) > > ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) > > ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) > > ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) > > ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) > > ==236074== by 0x40219D: main (one.c:335) > > ==236074== > > ==236074== Use of uninitialised value of size 8 > > ==236074== at 0x218323CF: ??? (in /usr/lib64/libpsm2.so.2.2) > > ==236074== by 0x218341C7: ??? (in /usr/lib64/libpsm2.so.2.2) > > ==236074== by 0x400F9C2: _dl_init (in /usr/lib64/ld-2.17.so) > > ==236074== by 0x401459D: dl_open_worker (in /usr/lib64/ld-2.17.so) > > ==236074== by 0x400F7D3: _dl_catch_error (in /usr/lib64/ld-2.17.so) > > ==236074== by 0x4013B8A: _dl_open (in /usr/lib64/ld-2.17.so) > > ==236074== by 0x1AEA4FAA: dlopen_doit (in /usr/lib64/libdl-2.17.so) > > ==236074== by 0x400F7D3: _dl_catch_error (in /usr/lib64/ld-2.17.so) > > ==236074== by 0x1AEA55AC: _dlerror_run (in /usr/lib64/libdl-2.17.so) > > ==236074== by 0x1AEA5040: dlopen@@GLIBC_2.2.5 (in /usr/lib64/ > libdl-2.17.so) > > ==236074== by 0x1B8198DC: MPID_nem_ofi_init (ofi_init.c:158) > > ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) > > ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) > > ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) > > ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) > > ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) > > ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) > > ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) > > ==236074== by 0x40219D: main (one.c:335) > > ==236074== > > ==236074== Use of uninitialised value of size 8 > > ==236074== at 0x218323E5: ??? (in /usr/lib64/libpsm2.so.2.2) > > ==236074== by 0x218341C7: ??? (in /usr/lib64/libpsm2.so.2.2) > > ==236074== by 0x400F9C2: _dl_init (in /usr/lib64/ld-2.17.so) > > ==236074== by 0x401459D: dl_open_worker (in /usr/lib64/ld-2.17.so) > > ==236074== by 0x400F7D3: _dl_catch_error (in /usr/lib64/ld-2.17.so) > > ==236074== by 0x4013B8A: _dl_open (in /usr/lib64/ld-2.17.so) > > ==236074== by 0x1AEA4FAA: dlopen_doit (in /usr/lib64/libdl-2.17.so) > > ==236074== by 0x400F7D3: _dl_catch_error (in /usr/lib64/ld-2.17.so) > > ==236074== by 0x1AEA55AC: _dlerror_run (in /usr/lib64/libdl-2.17.so) > > ==236074== by 0x1AEA5040: dlopen@@GLIBC_2.2.5 (in /usr/lib64/ > libdl-2.17.so) > > ==236074== by 0x1B8198DC: MPID_nem_ofi_init (ofi_init.c:158) > > ==236074== by 0x1B7B5C55: ??? 
(mpid_nem_init.c:231) > > ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) > > ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) > > ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) > > ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) > > ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) > > ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) > > ==236074== by 0x40219D: main (one.c:335) > > ==236074== > > ==236074== Conditional jump or move depends on uninitialised value(s) > > ==236074== at 0x2183B805: ??? (in /usr/lib64/libpsm2.so.2.2) > > ==236074== by 0x218323C1: ??? (in /usr/lib64/libpsm2.so.2.2) > > ==236074== by 0x218343EC: ??? (in /usr/lib64/libpsm2.so.2.2) > > ==236074== by 0x2180E4F3: psm2_init (in /usr/lib64/libpsm2.so.2.2) > > ==236074== by 0x2112D7E6: psmx2_getinfo (in > /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) > > ==236074== by 0x210AC753: fi_getinfo@@FABRIC_1.2 (in > /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) > > ==236074== by 0x1B819AB7: MPID_nem_ofi_init (ofi_init.c:245) > > ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) > > ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) > > ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) > > ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) > > ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) > > ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) > > ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) > > ==236074== by 0x40219D: main (one.c:335) > > ==236074== > > ==236074== Conditional jump or move depends on uninitialised value(s) > > ==236074== at 0x2183B810: ??? (in /usr/lib64/libpsm2.so.2.2) > > ==236074== by 0x218323C1: ??? (in /usr/lib64/libpsm2.so.2.2) > > ==236074== by 0x218343EC: ??? (in /usr/lib64/libpsm2.so.2.2) > > ==236074== by 0x2180E4F3: psm2_init (in /usr/lib64/libpsm2.so.2.2) > > ==236074== by 0x2112D7E6: psmx2_getinfo (in > /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) > > ==236074== by 0x210AC753: fi_getinfo@@FABRIC_1.2 (in > /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) > > ==236074== by 0x1B819AB7: MPID_nem_ofi_init (ofi_init.c:245) > > ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) > > ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) > > ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) > > ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) > > ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) > > ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) > > ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) > > ==236074== by 0x40219D: main (one.c:335) > > ==236074== > > ==236074== Conditional jump or move depends on uninitialised value(s) > > ==236074== at 0x218323C8: ??? (in /usr/lib64/libpsm2.so.2.2) > > ==236074== by 0x218343EC: ??? (in /usr/lib64/libpsm2.so.2.2) > > ==236074== by 0x2180E4F3: psm2_init (in /usr/lib64/libpsm2.so.2.2) > > ==236074== by 0x2112D7E6: psmx2_getinfo (in > /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) > > ==236074== by 0x210AC753: fi_getinfo@@FABRIC_1.2 (in > /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) > > ==236074== by 0x1B819AB7: MPID_nem_ofi_init (ofi_init.c:245) > > ==236074== by 0x1B7B5C55: ??? 
(mpid_nem_init.c:231) > > ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) > > ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) > > ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) > > ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) > > ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) > > ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) > > ==236074== by 0x40219D: main (one.c:335) > > ==236074== > > ==236074== Use of uninitialised value of size 8 > > ==236074== at 0x218323CF: ??? (in /usr/lib64/libpsm2.so.2.2) > > ==236074== by 0x218343EC: ??? (in /usr/lib64/libpsm2.so.2.2) > > ==236074== by 0x2180E4F3: psm2_init (in /usr/lib64/libpsm2.so.2.2) > > ==236074== by 0x2112D7E6: psmx2_getinfo (in > /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) > > ==236074== by 0x210AC753: fi_getinfo@@FABRIC_1.2 (in > /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) > > ==236074== by 0x1B819AB7: MPID_nem_ofi_init (ofi_init.c:245) > > ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) > > ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) > > ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) > > ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) > > ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) > > ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) > > ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) > > ==236074== by 0x40219D: main (one.c:335) > > ==236074== > > ==236074== Use of uninitialised value of size 8 > > ==236074== at 0x218323E5: ??? (in /usr/lib64/libpsm2.so.2.2) > > ==236074== by 0x218343EC: ??? (in /usr/lib64/libpsm2.so.2.2) > > ==236074== by 0x2180E4F3: psm2_init (in /usr/lib64/libpsm2.so.2.2) > > ==236074== by 0x2112D7E6: psmx2_getinfo (in > /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) > > ==236074== by 0x210AC753: fi_getinfo@@FABRIC_1.2 (in > /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) > > ==236074== by 0x1B819AB7: MPID_nem_ofi_init (ofi_init.c:245) > > ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) > > ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) > > ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) > > ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) > > ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) > > ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) > > ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) > > ==236074== by 0x40219D: main (one.c:335) > > ==236074== > > ==236074== Conditional jump or move depends on uninitialised value(s) > > ==236074== at 0x2183B69A: ??? (in /usr/lib64/libpsm2.so.2.2) > > ==236074== by 0x218371EC: ??? (in /usr/lib64/libpsm2.so.2.2) > > ==236074== by 0x21837077: hfi_get_port_lid (in > /usr/lib64/libpsm2.so.2.2) > > ==236074== by 0x21836F9A: ??? (in /usr/lib64/libpsm2.so.2.2) > > ==236074== by 0x21834872: ??? (in /usr/lib64/libpsm2.so.2.2) > > ==236074== by 0x217F7F5D: ??? (in /usr/lib64/libpsm2.so.2.2) > > ==236074== by 0x217FDC29: ??? (in /usr/lib64/libpsm2.so.2.2) > > ==236074== by 0x217FCA91: psm2_ep_open (in /usr/lib64/libpsm2.so.2.2) > > ==236074== by 0x2112FB19: psmx2_trx_ctxt_alloc (in > /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) > > ==236074== by 0x21138089: psmx2_ep_open (in > /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) > > ==236074== by 0x1B81A6A1: fi_endpoint (fi_endpoint.h:155) > > ==236074== by 0x1B81A6A1: MPID_nem_ofi_init (ofi_init.c:377) > > ==236074== by 0x1B7B5C55: ??? 
(mpid_nem_init.c:231) > > ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) > > ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) > > ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) > > ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) > > ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) > > ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) > > ==236074== by 0x40219D: main (one.c:335) > > ==236074== > > ==236074== Conditional jump or move depends on uninitialised value(s) > > ==236074== at 0x2183B7B8: ??? (in /usr/lib64/libpsm2.so.2.2) > > ==236074== by 0x218371EC: ??? (in /usr/lib64/libpsm2.so.2.2) > > ==236074== by 0x21837077: hfi_get_port_lid (in > /usr/lib64/libpsm2.so.2.2) > > ==236074== by 0x21836F9A: ??? (in /usr/lib64/libpsm2.so.2.2) > > ==236074== by 0x21834872: ??? (in /usr/lib64/libpsm2.so.2.2) > > ==236074== by 0x217F7F5D: ??? (in /usr/lib64/libpsm2.so.2.2) > > ==236074== by 0x217FDC29: ??? (in /usr/lib64/libpsm2.so.2.2) > > ==236074== by 0x217FCA91: psm2_ep_open (in /usr/lib64/libpsm2.so.2.2) > > ==236074== by 0x2112FB19: psmx2_trx_ctxt_alloc (in > /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) > > ==236074== by 0x21138089: psmx2_ep_open (in > /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) > > ==236074== by 0x1B81A6A1: fi_endpoint (fi_endpoint.h:155) > > ==236074== by 0x1B81A6A1: MPID_nem_ofi_init (ofi_init.c:377) > > ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) > > ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) > > ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) > > ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) > > ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) > > ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) > > ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) > > ==236074== by 0x40219D: main (one.c:335) > > ==236074== > > ==236074== Conditional jump or move depends on uninitialised value(s) > > ==236074== at 0x2183B69A: ??? (in /usr/lib64/libpsm2.so.2.2) > > ==236074== by 0x218371EC: ??? (in /usr/lib64/libpsm2.so.2.2) > > ==236074== by 0x21837077: hfi_get_port_lid (in > /usr/lib64/libpsm2.so.2.2) > > ==236074== by 0x217F88C8: ??? (in /usr/lib64/libpsm2.so.2.2) > > ==236074== by 0x217FDC29: ??? (in /usr/lib64/libpsm2.so.2.2) > > ==236074== by 0x217FCA91: psm2_ep_open (in /usr/lib64/libpsm2.so.2.2) > > ==236074== by 0x2112FB19: psmx2_trx_ctxt_alloc (in > /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) > > ==236074== by 0x21138089: psmx2_ep_open (in > /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) > > ==236074== by 0x1B81A6A1: fi_endpoint (fi_endpoint.h:155) > > ==236074== by 0x1B81A6A1: MPID_nem_ofi_init (ofi_init.c:377) > > ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) > > ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) > > ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) > > ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) > > ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) > > ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) > > ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) > > ==236074== by 0x40219D: main (one.c:335) > > ==236074== > > ==236074== Conditional jump or move depends on uninitialised value(s) > > ==236074== at 0x2183B7B8: ??? (in /usr/lib64/libpsm2.so.2.2) > > ==236074== by 0x218371EC: ??? 
(in /usr/lib64/libpsm2.so.2.2) > > ==236074== by 0x21837077: hfi_get_port_lid (in > /usr/lib64/libpsm2.so.2.2) > > ==236074== by 0x217F88C8: ??? (in /usr/lib64/libpsm2.so.2.2) > > ==236074== by 0x217FDC29: ??? (in /usr/lib64/libpsm2.so.2.2) > > ==236074== by 0x217FCA91: psm2_ep_open (in /usr/lib64/libpsm2.so.2.2) > > ==236074== by 0x2112FB19: psmx2_trx_ctxt_alloc (in > /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) > > ==236074== by 0x21138089: psmx2_ep_open (in > /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) > > ==236074== by 0x1B81A6A1: fi_endpoint (fi_endpoint.h:155) > > ==236074== by 0x1B81A6A1: MPID_nem_ofi_init (ofi_init.c:377) > > ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) > > ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) > > ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) > > ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) > > ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) > > ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) > > ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) > > ==236074== by 0x40219D: main (one.c:335) > > ==236074== > > ==236074== Conditional jump or move depends on uninitialised value(s) > > ==236074== at 0x2183B69A: ??? (in /usr/lib64/libpsm2.so.2.2) > > ==236074== by 0x218371EC: ??? (in /usr/lib64/libpsm2.so.2.2) > > ==236074== by 0x21837077: hfi_get_port_lid (in > /usr/lib64/libpsm2.so.2.2) > > ==236074== by 0x217F916B: ??? (in /usr/lib64/libpsm2.so.2.2) > > ==236074== by 0x217FDC29: ??? (in /usr/lib64/libpsm2.so.2.2) > > ==236074== by 0x217FCA91: psm2_ep_open (in /usr/lib64/libpsm2.so.2.2) > > ==236074== by 0x2112FB19: psmx2_trx_ctxt_alloc (in > /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) > > ==236074== by 0x21138089: psmx2_ep_open (in > /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) > > ==236074== by 0x1B81A6A1: fi_endpoint (fi_endpoint.h:155) > > ==236074== by 0x1B81A6A1: MPID_nem_ofi_init (ofi_init.c:377) > > ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) > > ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) > > ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) > > ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) > > ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) > > ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) > > ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) > > ==236074== by 0x40219D: main (one.c:335) > > ==236074== > > ==236074== Conditional jump or move depends on uninitialised value(s) > > ==236074== at 0x2183B7B8: ??? (in /usr/lib64/libpsm2.so.2.2) > > ==236074== by 0x218371EC: ??? (in /usr/lib64/libpsm2.so.2.2) > > ==236074== by 0x21837077: hfi_get_port_lid (in > /usr/lib64/libpsm2.so.2.2) > > ==236074== by 0x217F916B: ??? (in /usr/lib64/libpsm2.so.2.2) > > ==236074== by 0x217FDC29: ??? (in /usr/lib64/libpsm2.so.2.2) > > ==236074== by 0x217FCA91: psm2_ep_open (in /usr/lib64/libpsm2.so.2.2) > > ==236074== by 0x2112FB19: psmx2_trx_ctxt_alloc (in > /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) > > ==236074== by 0x21138089: psmx2_ep_open (in > /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) > > ==236074== by 0x1B81A6A1: fi_endpoint (fi_endpoint.h:155) > > ==236074== by 0x1B81A6A1: MPID_nem_ofi_init (ofi_init.c:377) > > ==236074== by 0x1B7B5C55: ??? 
(mpid_nem_init.c:231) > > ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) > > ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) > > ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) > > ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) > > ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) > > ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) > > ==236074== by 0x40219D: main (one.c:335) > > ==236074== > > ==236074== Conditional jump or move depends on uninitialised value(s) > > ==236074== at 0x1B1DA260: __I_MPI___intel_sse2_strncmp (in > /opt/intel/compilers_and_libraries_2018.2.199/linux/mpi/intel64/lib/libmpifort.so.12.0) > > ==236074== by 0x1B8CFBA1: ??? (simple_pmi.c:2376) > > ==236074== by 0x1B8CBDAD: PMIi_InitIfSingleton (simple_pmi.c:2883) > > ==236074== by 0x1B8CBDAD: iPMI_KVS_Get (simple_pmi.c:751) > > ==236074== by 0x1B7CCC1E: ??? (mpidi_pg.c:949) > > ==236074== by 0x1B817EAA: MPID_nem_ofi_post_init (ofi_init.c:1736) > > ==236074== by 0x1B7B3575: MPID_nem_init_post (mpid_nem_init.c:1421) > > ==236074== by 0x1B5806E3: MPIDI_CH3_Init (ch3_init.c:146) > > ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) > > ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) > > ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) > > ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) > > ==236074== by 0x40219D: main (one.c:335) > > ==236074== > > ==236074== Conditional jump or move depends on uninitialised value(s) > > ==236074== at 0x1B1DA383: __I_MPI___intel_sse2_strncmp (in > /opt/intel/compilers_and_libraries_2018.2.199/linux/mpi/intel64/lib/libmpifort.so.12.0) > > ==236074== by 0x1B8CFBA1: ??? (simple_pmi.c:2376) > > ==236074== by 0x1B8CBDAD: PMIi_InitIfSingleton (simple_pmi.c:2883) > > ==236074== by 0x1B8CBDAD: iPMI_KVS_Get (simple_pmi.c:751) > > ==236074== by 0x1B7CCC1E: ??? 
(mpidi_pg.c:949) > > ==236074== by 0x1B817EAA: MPID_nem_ofi_post_init (ofi_init.c:1736) > > ==236074== by 0x1B7B3575: MPID_nem_init_post (mpid_nem_init.c:1421) > > ==236074== by 0x1B5806E3: MPIDI_CH3_Init (ch3_init.c:146) > > ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) > > ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) > > ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) > > ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) > > ==236074== by 0x40219D: main (one.c:335) > > ==236074== > > ==236074== Conditional jump or move depends on uninitialised value(s) > > ==236074== at 0x1E48032E: __intel_sse4_strcpy (in > /opt/intel/compilers_and_libraries_2018.2.199/linux/compiler/lib/intel64_lin/libirc.so) > > ==236074== by 0x51FD8BE: PetscStrcpy (str.c:354) > > ==236074== by 0x51FD7A3: PetscStrallocpy (str.c:188) > > ==236074== by 0x52A39CE: PetscEventRegLogRegister (eventlog.c:313) > > ==236074== by 0x527D89A: PetscLogEventRegister (plog.c:693) > > ==236074== by 0x6A56A20: PCBDDCInitializePackage (bddc.c:3115) > > ==236074== by 0x6E1A515: PCInitializePackage (dlregisksp.c:59) > > ==236074== by 0x6DB1A86: PCCreate (precon.c:382) > > ==236074== by 0x6E05167: KSPGetPC (itfunc.c:1837) > > ==236074== by 0x6E0FC5C: KSPSetDM (iterativ.c:1150) > > ==236074== by 0x6FDD27B: SNESSetDM (snes.c:5402) > > ==236074== by 0x70B85F7: TSGetSNES (ts.c:2914) > > ==236074== by 0x70BE430: TSSetDM (ts.c:4949) > > ==236074== by 0x402496: main (one.c:378) > > ==236074== > > ==236074== Conditional jump or move depends on uninitialised value(s) > > ==236074== at 0x1E4782BA: __intel_ssse3_strncpy (in > /opt/intel/compilers_and_libraries_2018.2.199/linux/compiler/lib/intel64_lin/libirc.so) > > ==236074== by 0x51FFD24: PetscStrncpy (str.c:392) > > ==236074== by 0x51FEB03: PetscStrreplace (str.c:1142) > > ==236074== by 0x52C9958: PetscViewerFileSetName (filev.c:659) > > ==236074== by 0x52B743B: PetscViewerVTKOpen (vtkv.c:279) > > ==236074== by 0x70C76E6: TSMonitorSolutionVTK (ts.c:5580) > > ==236074== by 0x40313C: FormFunction (one.c:120) > > ==236074== by 0x7066531: TSComputeIFunction_DMDA (dmdats.c:82) > > ==236074== by 0x70BA5EF: TSComputeIFunction (ts.c:857) > > ==236074== by 0x711E2DC: SNESTSFormFunction_BDF (bdf.c:368) > > ==236074== by 0x70C6E46: SNESTSFormFunction (ts.c:5014) > > ==236074== by 0x6FDC8A6: SNESComputeFunction (snes.c:2383) > > ==236074== by 0x7023556: SNESSolve_NEWTONTR (tr.c:297) > > ==236074== by 0x6FD160F: SNESSolve (snes.c:4569) > > ==236074== by 0x711917E: PetscMemcpy (bdf.c:223) > > ==236074== by 0x711917E: PetscCitationsRegister (petscsys.h:2689) > > ==236074== by 0x711917E: TSStep_BDF.A (bdf.c:265) > > ==236074== by 0x70C363A: TSStep (ts.c:3757) > > ==236074== by 0x70C1999: TSSolve (ts.c:4154) > > ==236074== by 0x402594: main (one.c:391) > > ==236074== > > ==236074== Conditional jump or move depends on uninitialised value(s) > > ==236074== at 0x1E4782BA: __intel_ssse3_strncpy (in > /opt/intel/compilers_and_libraries_2018.2.199/linux/compiler/lib/intel64_lin/libirc.so) > > ==236074== by 0x51FFD24: PetscStrncpy (str.c:392) > > ==236074== by 0x51FEB03: PetscStrreplace (str.c:1142) > > ==236074== by 0x5224E4B: PetscFOpen (mpiuopen.c:52) > > ==236074== by 0x63A074B: DMDAVTKWriteAll_VTS.A (grvtk.c:72) > > ==236074== by 0x639A589: DMDAVTKWriteAll (grvtk.c:545) > > ==236074== by 0x52B66F3: PetscViewerFlush_VTK (vtkv.c:100) > > ==236074== by 0x52CFAAE: PetscViewerFlush (flush.c:26) > > ==236074== by 0x52CEA95: PetscViewerDestroy (view.c:113) > > ==236074== by 
0x70C7717: TSMonitorSolutionVTK (ts.c:5582) > > ==236074== by 0x40313C: FormFunction (one.c:120) > > ==236074== by 0x7066531: TSComputeIFunction_DMDA (dmdats.c:82) > > ==236074== by 0x70BA5EF: TSComputeIFunction (ts.c:857) > > ==236074== by 0x711E2DC: SNESTSFormFunction_BDF (bdf.c:368) > > ==236074== by 0x70C6E46: SNESTSFormFunction (ts.c:5014) > > ==236074== by 0x6FDC8A6: SNESComputeFunction (snes.c:2383) > > ==236074== by 0x7023556: SNESSolve_NEWTONTR (tr.c:297) > > ==236074== by 0x6FD160F: SNESSolve (snes.c:4569) > > ==236074== by 0x711917E: PetscMemcpy (bdf.c:223) > > ==236074== by 0x711917E: PetscCitationsRegister (petscsys.h:2689) > > ==236074== by 0x711917E: TSStep_BDF.A (bdf.c:265) > > ==236074== by 0x70C363A: TSStep (ts.c:3757) > > ==236074== > > ==236074== Conditional jump or move depends on uninitialised value(s) > > ==236074== at 0x5F10977: MatFDColoringSetUpBlocked_AIJ_Private > (fdaij.c:146) > > ==236074== by 0x5F10977: MatFDColoringSetUp_SeqXAIJ.A (fdaij.c:284) > > ==236074== by 0x585892A: MatFDColoringSetUp (fdmatrix.c:242) > > ==236074== by 0x6FE5037: SNESComputeJacobianDefaultColor (snesj2.c:79) > > ==236074== by 0x6FC8E4E: SNESComputeJacobian (snes.c:2717) > > ==236074== by 0x70236E7: SNESSolve_NEWTONTR (tr.c:324) > > ==236074== by 0x6FD160F: SNESSolve (snes.c:4569) > > ==236074== by 0x711917E: PetscMemcpy (bdf.c:223) > > ==236074== by 0x711917E: PetscCitationsRegister (petscsys.h:2689) > > ==236074== by 0x711917E: TSStep_BDF.A (bdf.c:265) > > ==236074== by 0x70C363A: TSStep (ts.c:3757) > > ==236074== by 0x70C1999: TSSolve (ts.c:4154) > > ==236074== by 0x402594: main (one.c:391) > > ==236074== > > ==236074== Invalid write of size 4 > > ==236074== at 0x5F10983: MatFDColoringSetUpBlocked_AIJ_Private > (fdaij.c:150) > > ==236074== by 0x5F10983: MatFDColoringSetUp_SeqXAIJ.A (fdaij.c:284) > > ==236074== by 0x585892A: MatFDColoringSetUp (fdmatrix.c:242) > > ==236074== by 0x6FE5037: SNESComputeJacobianDefaultColor (snesj2.c:79) > > ==236074== by 0x6FC8E4E: SNESComputeJacobian (snes.c:2717) > > ==236074== by 0x70236E7: SNESSolve_NEWTONTR (tr.c:324) > > ==236074== by 0x6FD160F: SNESSolve (snes.c:4569) > > ==236074== by 0x711917E: PetscMemcpy (bdf.c:223) > > ==236074== by 0x711917E: PetscCitationsRegister (petscsys.h:2689) > > ==236074== by 0x711917E: TSStep_BDF.A (bdf.c:265) > > ==236074== by 0x70C363A: TSStep (ts.c:3757) > > ==236074== by 0x70C1999: TSSolve (ts.c:4154) > > ==236074== by 0x402594: main (one.c:391) > > ==236074== Address 0x3a94fa80 is 0 bytes after a block of size 73,960,000 > alloc'd > > ==236074== at 0x4C2C480: memalign (vg_replace_malloc.c:909) > > ==236074== by 0x522FFE2: PetscMallocAlign (mal.c:52) > > ==236074== by 0x52305F9: PetscMallocA (mal.c:418) > > ==236074== by 0x5F10778: MatFDColoringSetUpBlocked_AIJ_Private > (fdaij.c:125) > > ==236074== by 0x5F10778: MatFDColoringSetUp_SeqXAIJ.A (fdaij.c:284) > > ==236074== by 0x585892A: MatFDColoringSetUp (fdmatrix.c:242) > > ==236074== by 0x6FE5037: SNESComputeJacobianDefaultColor (snesj2.c:79) > > ==236074== by 0x6FC8E4E: SNESComputeJacobian (snes.c:2717) > > ==236074== by 0x70236E7: SNESSolve_NEWTONTR (tr.c:324) > > ==236074== by 0x6FD160F: SNESSolve (snes.c:4569) > > ==236074== by 0x711917E: PetscMemcpy (bdf.c:223) > > ==236074== by 0x711917E: PetscCitationsRegister (petscsys.h:2689) > > ==236074== by 0x711917E: TSStep_BDF.A (bdf.c:265) > > ==236074== by 0x70C363A: TSStep (ts.c:3757) > > ==236074== by 0x70C1999: TSSolve (ts.c:4154) > > ==236074== by 0x402594: main (one.c:391) > > ==236074== > > ==236074== 
Invalid write of size 8 > > ==236074== at 0x5F10991: MatFDColoringSetUpBlocked_AIJ_Private > (fdaij.c:151) > > ==236074== by 0x5F10991: MatFDColoringSetUp_SeqXAIJ.A (fdaij.c:284) > > ==236074== by 0x585892A: MatFDColoringSetUp (fdmatrix.c:242) > > ==236074== by 0x6FE5037: SNESComputeJacobianDefaultColor (snesj2.c:79) > > ==236074== by 0x6FC8E4E: SNESComputeJacobian (snes.c:2717) > > ==236074== by 0x70236E7: SNESSolve_NEWTONTR (tr.c:324) > > ==236074== by 0x6FD160F: SNESSolve (snes.c:4569) > > ==236074== by 0x711917E: PetscMemcpy (bdf.c:223) > > ==236074== by 0x711917E: PetscCitationsRegister (petscsys.h:2689) > > ==236074== by 0x711917E: TSStep_BDF.A (bdf.c:265) > > ==236074== by 0x70C363A: TSStep (ts.c:3757) > > ==236074== by 0x70C1999: TSSolve (ts.c:4154) > > ==236074== by 0x402594: main (one.c:391) > > ==236074== Address 0x3a94fa88 is 8 bytes after a block of size 73,960,000 > alloc'd > > ==236074== at 0x4C2C480: memalign (vg_replace_malloc.c:909) > > ==236074== by 0x522FFE2: PetscMallocAlign (mal.c:52) > > ==236074== by 0x52305F9: PetscMallocA (mal.c:418) > > ==236074== by 0x5F10778: MatFDColoringSetUpBlocked_AIJ_Private > (fdaij.c:125) > > ==236074== by 0x5F10778: MatFDColoringSetUp_SeqXAIJ.A (fdaij.c:284) > > ==236074== by 0x585892A: MatFDColoringSetUp (fdmatrix.c:242) > > ==236074== by 0x6FE5037: SNESComputeJacobianDefaultColor (snesj2.c:79) > > ==236074== by 0x6FC8E4E: SNESComputeJacobian (snes.c:2717) > > ==236074== by 0x70236E7: SNESSolve_NEWTONTR (tr.c:324) > > ==236074== by 0x6FD160F: SNESSolve (snes.c:4569) > > ==236074== by 0x711917E: PetscMemcpy (bdf.c:223) > > ==236074== by 0x711917E: PetscCitationsRegister (petscsys.h:2689) > > ==236074== by 0x711917E: TSStep_BDF.A (bdf.c:265) > > ==236074== by 0x70C363A: TSStep (ts.c:3757) > > ==236074== by 0x70C1999: TSSolve (ts.c:4154) > > ==236074== by 0x402594: main (one.c:391) > > ==236074== > > > > > > Sent from Mail for > Windows > > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/
-------------- next part --------------
An HTML attachment was scrubbed... URL: 
From kavousi at mines.edu Thu Oct 20 18:48:57 2022
From: kavousi at mines.edu (Sepideh Kavousi)
Date: Thu, 20 Oct 2022 23:48:57 +0000
Subject: [petsc-users] [External] Re: Periodic boundary condition
In-Reply-To: References: Message-ID: 
I appreciate your help. Please find attached. I am solving the phase-field solidification equations + Navier-Stokes. The unknowns are velocity in x (vx), velocity in y (vy), pressure (pp), order parameter (p), and concentration field (U). I have postponed the solution of Navier-Stokes and just focused on solving for p and U.
Best, sepideh
Sent from Mail for Windows
From: Matthew Knepley Sent: Thursday, October 20, 2022 7:39 PM To: Sepideh Kavousi Cc: petsc-users at mcs.anl.gov Subject: [External] Re: [petsc-users] Periodic boundary condition
On Thu, Oct 20, 2022 at 6:48 PM Sepideh Kavousi > wrote: Hello, I want to solve my 5 PDEs based on finite difference method using periodic BC in x-direction and non-periodic in y-direction but I run into error (Segmentation Violation, probably memory access out of range). For this, I discretize my equation in FormFunction function. My PDE discretization in (i,j) node needs data on (i+1,j), (i+2,j), (i-1,j), (i-2,j), (i,j+1), (i,j+2), (i,j-1), (i,j-2) points.
In my previous codes that the x-direction was non-periodic (no flux) boundary condition, I: i) implemented the no flux BC for i=0 and i=Nx-1, ii) set i+2= Nx-1 in discretizing (Nx-2,j) and i+2= 0 in discretizing (1,j) iii) discretized my equation for i=1..Nx-2. I am not sure how I should do the periodic BC. From the following discussions (https://lists.mcs.anl.gov/pipermail/petsc-users/2012-May/013476.html and https://lists.mcs.anl.gov/pipermail/petsc-users/2016-May/029273.html), I guess I should not do step (i) (stated above) for the x-boundaries and just do step (iii) for i=0..Nx-1. If I just focus on solving 2 of the PDEs which does need data on (i+2,j), (i-2,j), (i,j+2), (i,j-2) points for discretizing equation in (i,j) node, I still run into error: Running with Valgrind (just 1 processor) gave the following file. I did not find any information which gives me hint on the error source. Can you please help me to find the error? It sounds like you are accessing the array outside your declared stencil. Do you have anything we can run? Thanks, Matt Best, Sepideh ==236074== Conditional jump or move depends on uninitialised value(s) ==236074== at 0x4C29E39: malloc (vg_replace_malloc.c:309) ==236074== by 0x1B79E59B: MPID_Init (mpid_init.c:1649) ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) ==236074== by 0x40219D: main (one.c:335) ==236074== ==236074== Conditional jump or move depends on uninitialised value(s) ==236074== at 0x2183B805: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218323C1: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218341C7: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x400F9C2: _dl_init (in /usr/lib64/ld-2.17.so) ==236074== by 0x401459D: dl_open_worker (in /usr/lib64/ld-2.17.so) ==236074== by 0x400F7D3: _dl_catch_error (in /usr/lib64/ld-2.17.so) ==236074== by 0x4013B8A: _dl_open (in /usr/lib64/ld-2.17.so) ==236074== by 0x1AEA4FAA: dlopen_doit (in /usr/lib64/libdl-2.17.so) ==236074== by 0x400F7D3: _dl_catch_error (in /usr/lib64/ld-2.17.so) ==236074== by 0x1AEA55AC: _dlerror_run (in /usr/lib64/libdl-2.17.so) ==236074== by 0x1AEA5040: dlopen@@GLIBC_2.2.5 (in /usr/lib64/libdl-2.17.so) ==236074== by 0x1B8198DC: MPID_nem_ofi_init (ofi_init.c:158) ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) ==236074== by 0x40219D: main (one.c:335) ==236074== ==236074== Conditional jump or move depends on uninitialised value(s) ==236074== at 0x2183B810: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218323C1: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218341C7: ??? 
(in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x400F9C2: _dl_init (in /usr/lib64/ld-2.17.so) ==236074== by 0x401459D: dl_open_worker (in /usr/lib64/ld-2.17.so) ==236074== by 0x400F7D3: _dl_catch_error (in /usr/lib64/ld-2.17.so) ==236074== by 0x4013B8A: _dl_open (in /usr/lib64/ld-2.17.so) ==236074== by 0x1AEA4FAA: dlopen_doit (in /usr/lib64/libdl-2.17.so) ==236074== by 0x400F7D3: _dl_catch_error (in /usr/lib64/ld-2.17.so) ==236074== by 0x1AEA55AC: _dlerror_run (in /usr/lib64/libdl-2.17.so) ==236074== by 0x1AEA5040: dlopen@@GLIBC_2.2.5 (in /usr/lib64/libdl-2.17.so) ==236074== by 0x1B8198DC: MPID_nem_ofi_init (ofi_init.c:158) ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) ==236074== by 0x40219D: main (one.c:335) ==236074== ==236074== Conditional jump or move depends on uninitialised value(s) ==236074== at 0x218323C8: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218341C7: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x400F9C2: _dl_init (in /usr/lib64/ld-2.17.so) ==236074== by 0x401459D: dl_open_worker (in /usr/lib64/ld-2.17.so) ==236074== by 0x400F7D3: _dl_catch_error (in /usr/lib64/ld-2.17.so) ==236074== by 0x4013B8A: _dl_open (in /usr/lib64/ld-2.17.so) ==236074== by 0x1AEA4FAA: dlopen_doit (in /usr/lib64/libdl-2.17.so) ==236074== by 0x400F7D3: _dl_catch_error (in /usr/lib64/ld-2.17.so) ==236074== by 0x1AEA55AC: _dlerror_run (in /usr/lib64/libdl-2.17.so) ==236074== by 0x1AEA5040: dlopen@@GLIBC_2.2.5 (in /usr/lib64/libdl-2.17.so) ==236074== by 0x1B8198DC: MPID_nem_ofi_init (ofi_init.c:158) ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) ==236074== by 0x40219D: main (one.c:335) ==236074== ==236074== Use of uninitialised value of size 8 ==236074== at 0x218323CF: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218341C7: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x400F9C2: _dl_init (in /usr/lib64/ld-2.17.so) ==236074== by 0x401459D: dl_open_worker (in /usr/lib64/ld-2.17.so) ==236074== by 0x400F7D3: _dl_catch_error (in /usr/lib64/ld-2.17.so) ==236074== by 0x4013B8A: _dl_open (in /usr/lib64/ld-2.17.so) ==236074== by 0x1AEA4FAA: dlopen_doit (in /usr/lib64/libdl-2.17.so) ==236074== by 0x400F7D3: _dl_catch_error (in /usr/lib64/ld-2.17.so) ==236074== by 0x1AEA55AC: _dlerror_run (in /usr/lib64/libdl-2.17.so) ==236074== by 0x1AEA5040: dlopen@@GLIBC_2.2.5 (in /usr/lib64/libdl-2.17.so) ==236074== by 0x1B8198DC: MPID_nem_ofi_init (ofi_init.c:158) ==236074== by 0x1B7B5C55: ??? 
(mpid_nem_init.c:231) ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) ==236074== by 0x40219D: main (one.c:335) ==236074== ==236074== Use of uninitialised value of size 8 ==236074== at 0x218323E5: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218341C7: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x400F9C2: _dl_init (in /usr/lib64/ld-2.17.so) ==236074== by 0x401459D: dl_open_worker (in /usr/lib64/ld-2.17.so) ==236074== by 0x400F7D3: _dl_catch_error (in /usr/lib64/ld-2.17.so) ==236074== by 0x4013B8A: _dl_open (in /usr/lib64/ld-2.17.so) ==236074== by 0x1AEA4FAA: dlopen_doit (in /usr/lib64/libdl-2.17.so) ==236074== by 0x400F7D3: _dl_catch_error (in /usr/lib64/ld-2.17.so) ==236074== by 0x1AEA55AC: _dlerror_run (in /usr/lib64/libdl-2.17.so) ==236074== by 0x1AEA5040: dlopen@@GLIBC_2.2.5 (in /usr/lib64/libdl-2.17.so) ==236074== by 0x1B8198DC: MPID_nem_ofi_init (ofi_init.c:158) ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) ==236074== by 0x40219D: main (one.c:335) ==236074== ==236074== Conditional jump or move depends on uninitialised value(s) ==236074== at 0x2183B805: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218323C1: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218343EC: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x2180E4F3: psm2_init (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x2112D7E6: psmx2_getinfo (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x210AC753: fi_getinfo@@FABRIC_1.2 (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x1B819AB7: MPID_nem_ofi_init (ofi_init.c:245) ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) ==236074== by 0x40219D: main (one.c:335) ==236074== ==236074== Conditional jump or move depends on uninitialised value(s) ==236074== at 0x2183B810: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218323C1: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218343EC: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x2180E4F3: psm2_init (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x2112D7E6: psmx2_getinfo (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x210AC753: fi_getinfo@@FABRIC_1.2 (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x1B819AB7: MPID_nem_ofi_init (ofi_init.c:245) ==236074== by 0x1B7B5C55: ??? 
(mpid_nem_init.c:231) ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) ==236074== by 0x40219D: main (one.c:335) ==236074== ==236074== Conditional jump or move depends on uninitialised value(s) ==236074== at 0x218323C8: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218343EC: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x2180E4F3: psm2_init (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x2112D7E6: psmx2_getinfo (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x210AC753: fi_getinfo@@FABRIC_1.2 (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x1B819AB7: MPID_nem_ofi_init (ofi_init.c:245) ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) ==236074== by 0x40219D: main (one.c:335) ==236074== ==236074== Use of uninitialised value of size 8 ==236074== at 0x218323CF: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218343EC: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x2180E4F3: psm2_init (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x2112D7E6: psmx2_getinfo (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x210AC753: fi_getinfo@@FABRIC_1.2 (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x1B819AB7: MPID_nem_ofi_init (ofi_init.c:245) ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) ==236074== by 0x40219D: main (one.c:335) ==236074== ==236074== Use of uninitialised value of size 8 ==236074== at 0x218323E5: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218343EC: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x2180E4F3: psm2_init (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x2112D7E6: psmx2_getinfo (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x210AC753: fi_getinfo@@FABRIC_1.2 (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x1B819AB7: MPID_nem_ofi_init (ofi_init.c:245) ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) ==236074== by 0x40219D: main (one.c:335) ==236074== ==236074== Conditional jump or move depends on uninitialised value(s) ==236074== at 0x2183B69A: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218371EC: ??? 
(in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x21837077: hfi_get_port_lid (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x21836F9A: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x21834872: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x217F7F5D: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x217FDC29: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x217FCA91: psm2_ep_open (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x2112FB19: psmx2_trx_ctxt_alloc (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x21138089: psmx2_ep_open (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x1B81A6A1: fi_endpoint (fi_endpoint.h:155) ==236074== by 0x1B81A6A1: MPID_nem_ofi_init (ofi_init.c:377) ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) ==236074== by 0x40219D: main (one.c:335) ==236074== ==236074== Conditional jump or move depends on uninitialised value(s) ==236074== at 0x2183B7B8: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218371EC: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x21837077: hfi_get_port_lid (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x21836F9A: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x21834872: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x217F7F5D: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x217FDC29: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x217FCA91: psm2_ep_open (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x2112FB19: psmx2_trx_ctxt_alloc (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x21138089: psmx2_ep_open (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x1B81A6A1: fi_endpoint (fi_endpoint.h:155) ==236074== by 0x1B81A6A1: MPID_nem_ofi_init (ofi_init.c:377) ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) ==236074== by 0x40219D: main (one.c:335) ==236074== ==236074== Conditional jump or move depends on uninitialised value(s) ==236074== at 0x2183B69A: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218371EC: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x21837077: hfi_get_port_lid (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x217F88C8: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x217FDC29: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x217FCA91: psm2_ep_open (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x2112FB19: psmx2_trx_ctxt_alloc (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x21138089: psmx2_ep_open (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x1B81A6A1: fi_endpoint (fi_endpoint.h:155) ==236074== by 0x1B81A6A1: MPID_nem_ofi_init (ofi_init.c:377) ==236074== by 0x1B7B5C55: ??? 
(mpid_nem_init.c:231) ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) ==236074== by 0x40219D: main (one.c:335) ==236074== ==236074== Conditional jump or move depends on uninitialised value(s) ==236074== at 0x2183B7B8: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218371EC: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x21837077: hfi_get_port_lid (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x217F88C8: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x217FDC29: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x217FCA91: psm2_ep_open (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x2112FB19: psmx2_trx_ctxt_alloc (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x21138089: psmx2_ep_open (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x1B81A6A1: fi_endpoint (fi_endpoint.h:155) ==236074== by 0x1B81A6A1: MPID_nem_ofi_init (ofi_init.c:377) ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) ==236074== by 0x40219D: main (one.c:335) ==236074== ==236074== Conditional jump or move depends on uninitialised value(s) ==236074== at 0x2183B69A: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218371EC: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x21837077: hfi_get_port_lid (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x217F916B: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x217FDC29: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x217FCA91: psm2_ep_open (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x2112FB19: psmx2_trx_ctxt_alloc (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x21138089: psmx2_ep_open (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x1B81A6A1: fi_endpoint (fi_endpoint.h:155) ==236074== by 0x1B81A6A1: MPID_nem_ofi_init (ofi_init.c:377) ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) ==236074== by 0x40219D: main (one.c:335) ==236074== ==236074== Conditional jump or move depends on uninitialised value(s) ==236074== at 0x2183B7B8: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218371EC: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x21837077: hfi_get_port_lid (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x217F916B: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x217FDC29: ??? 
(in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x217FCA91: psm2_ep_open (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x2112FB19: psmx2_trx_ctxt_alloc (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x21138089: psmx2_ep_open (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x1B81A6A1: fi_endpoint (fi_endpoint.h:155) ==236074== by 0x1B81A6A1: MPID_nem_ofi_init (ofi_init.c:377) ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) ==236074== by 0x40219D: main (one.c:335) ==236074== ==236074== Conditional jump or move depends on uninitialised value(s) ==236074== at 0x1B1DA260: __I_MPI___intel_sse2_strncmp (in /opt/intel/compilers_and_libraries_2018.2.199/linux/mpi/intel64/lib/libmpifort.so.12.0) ==236074== by 0x1B8CFBA1: ??? (simple_pmi.c:2376) ==236074== by 0x1B8CBDAD: PMIi_InitIfSingleton (simple_pmi.c:2883) ==236074== by 0x1B8CBDAD: iPMI_KVS_Get (simple_pmi.c:751) ==236074== by 0x1B7CCC1E: ??? (mpidi_pg.c:949) ==236074== by 0x1B817EAA: MPID_nem_ofi_post_init (ofi_init.c:1736) ==236074== by 0x1B7B3575: MPID_nem_init_post (mpid_nem_init.c:1421) ==236074== by 0x1B5806E3: MPIDI_CH3_Init (ch3_init.c:146) ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) ==236074== by 0x40219D: main (one.c:335) ==236074== ==236074== Conditional jump or move depends on uninitialised value(s) ==236074== at 0x1B1DA383: __I_MPI___intel_sse2_strncmp (in /opt/intel/compilers_and_libraries_2018.2.199/linux/mpi/intel64/lib/libmpifort.so.12.0) ==236074== by 0x1B8CFBA1: ??? (simple_pmi.c:2376) ==236074== by 0x1B8CBDAD: PMIi_InitIfSingleton (simple_pmi.c:2883) ==236074== by 0x1B8CBDAD: iPMI_KVS_Get (simple_pmi.c:751) ==236074== by 0x1B7CCC1E: ??? 
(mpidi_pg.c:949) ==236074== by 0x1B817EAA: MPID_nem_ofi_post_init (ofi_init.c:1736) ==236074== by 0x1B7B3575: MPID_nem_init_post (mpid_nem_init.c:1421) ==236074== by 0x1B5806E3: MPIDI_CH3_Init (ch3_init.c:146) ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) ==236074== by 0x40219D: main (one.c:335) ==236074== ==236074== Conditional jump or move depends on uninitialised value(s) ==236074== at 0x1E48032E: __intel_sse4_strcpy (in /opt/intel/compilers_and_libraries_2018.2.199/linux/compiler/lib/intel64_lin/libirc.so) ==236074== by 0x51FD8BE: PetscStrcpy (str.c:354) ==236074== by 0x51FD7A3: PetscStrallocpy (str.c:188) ==236074== by 0x52A39CE: PetscEventRegLogRegister (eventlog.c:313) ==236074== by 0x527D89A: PetscLogEventRegister (plog.c:693) ==236074== by 0x6A56A20: PCBDDCInitializePackage (bddc.c:3115) ==236074== by 0x6E1A515: PCInitializePackage (dlregisksp.c:59) ==236074== by 0x6DB1A86: PCCreate (precon.c:382) ==236074== by 0x6E05167: KSPGetPC (itfunc.c:1837) ==236074== by 0x6E0FC5C: KSPSetDM (iterativ.c:1150) ==236074== by 0x6FDD27B: SNESSetDM (snes.c:5402) ==236074== by 0x70B85F7: TSGetSNES (ts.c:2914) ==236074== by 0x70BE430: TSSetDM (ts.c:4949) ==236074== by 0x402496: main (one.c:378) ==236074== ==236074== Conditional jump or move depends on uninitialised value(s) ==236074== at 0x1E4782BA: __intel_ssse3_strncpy (in /opt/intel/compilers_and_libraries_2018.2.199/linux/compiler/lib/intel64_lin/libirc.so) ==236074== by 0x51FFD24: PetscStrncpy (str.c:392) ==236074== by 0x51FEB03: PetscStrreplace (str.c:1142) ==236074== by 0x52C9958: PetscViewerFileSetName (filev.c:659) ==236074== by 0x52B743B: PetscViewerVTKOpen (vtkv.c:279) ==236074== by 0x70C76E6: TSMonitorSolutionVTK (ts.c:5580) ==236074== by 0x40313C: FormFunction (one.c:120) ==236074== by 0x7066531: TSComputeIFunction_DMDA (dmdats.c:82) ==236074== by 0x70BA5EF: TSComputeIFunction (ts.c:857) ==236074== by 0x711E2DC: SNESTSFormFunction_BDF (bdf.c:368) ==236074== by 0x70C6E46: SNESTSFormFunction (ts.c:5014) ==236074== by 0x6FDC8A6: SNESComputeFunction (snes.c:2383) ==236074== by 0x7023556: SNESSolve_NEWTONTR (tr.c:297) ==236074== by 0x6FD160F: SNESSolve (snes.c:4569) ==236074== by 0x711917E: PetscMemcpy (bdf.c:223) ==236074== by 0x711917E: PetscCitationsRegister (petscsys.h:2689) ==236074== by 0x711917E: TSStep_BDF.A (bdf.c:265) ==236074== by 0x70C363A: TSStep (ts.c:3757) ==236074== by 0x70C1999: TSSolve (ts.c:4154) ==236074== by 0x402594: main (one.c:391) ==236074== ==236074== Conditional jump or move depends on uninitialised value(s) ==236074== at 0x1E4782BA: __intel_ssse3_strncpy (in /opt/intel/compilers_and_libraries_2018.2.199/linux/compiler/lib/intel64_lin/libirc.so) ==236074== by 0x51FFD24: PetscStrncpy (str.c:392) ==236074== by 0x51FEB03: PetscStrreplace (str.c:1142) ==236074== by 0x5224E4B: PetscFOpen (mpiuopen.c:52) ==236074== by 0x63A074B: DMDAVTKWriteAll_VTS.A (grvtk.c:72) ==236074== by 0x639A589: DMDAVTKWriteAll (grvtk.c:545) ==236074== by 0x52B66F3: PetscViewerFlush_VTK (vtkv.c:100) ==236074== by 0x52CFAAE: PetscViewerFlush (flush.c:26) ==236074== by 0x52CEA95: PetscViewerDestroy (view.c:113) ==236074== by 0x70C7717: TSMonitorSolutionVTK (ts.c:5582) ==236074== by 0x40313C: FormFunction (one.c:120) ==236074== by 0x7066531: TSComputeIFunction_DMDA (dmdats.c:82) ==236074== by 0x70BA5EF: TSComputeIFunction (ts.c:857) ==236074== by 0x711E2DC: 
SNESTSFormFunction_BDF (bdf.c:368) ==236074== by 0x70C6E46: SNESTSFormFunction (ts.c:5014) ==236074== by 0x6FDC8A6: SNESComputeFunction (snes.c:2383) ==236074== by 0x7023556: SNESSolve_NEWTONTR (tr.c:297) ==236074== by 0x6FD160F: SNESSolve (snes.c:4569) ==236074== by 0x711917E: PetscMemcpy (bdf.c:223) ==236074== by 0x711917E: PetscCitationsRegister (petscsys.h:2689) ==236074== by 0x711917E: TSStep_BDF.A (bdf.c:265) ==236074== by 0x70C363A: TSStep (ts.c:3757) ==236074== ==236074== Conditional jump or move depends on uninitialised value(s) ==236074== at 0x5F10977: MatFDColoringSetUpBlocked_AIJ_Private (fdaij.c:146) ==236074== by 0x5F10977: MatFDColoringSetUp_SeqXAIJ.A (fdaij.c:284) ==236074== by 0x585892A: MatFDColoringSetUp (fdmatrix.c:242) ==236074== by 0x6FE5037: SNESComputeJacobianDefaultColor (snesj2.c:79) ==236074== by 0x6FC8E4E: SNESComputeJacobian (snes.c:2717) ==236074== by 0x70236E7: SNESSolve_NEWTONTR (tr.c:324) ==236074== by 0x6FD160F: SNESSolve (snes.c:4569) ==236074== by 0x711917E: PetscMemcpy (bdf.c:223) ==236074== by 0x711917E: PetscCitationsRegister (petscsys.h:2689) ==236074== by 0x711917E: TSStep_BDF.A (bdf.c:265) ==236074== by 0x70C363A: TSStep (ts.c:3757) ==236074== by 0x70C1999: TSSolve (ts.c:4154) ==236074== by 0x402594: main (one.c:391) ==236074== ==236074== Invalid write of size 4 ==236074== at 0x5F10983: MatFDColoringSetUpBlocked_AIJ_Private (fdaij.c:150) ==236074== by 0x5F10983: MatFDColoringSetUp_SeqXAIJ.A (fdaij.c:284) ==236074== by 0x585892A: MatFDColoringSetUp (fdmatrix.c:242) ==236074== by 0x6FE5037: SNESComputeJacobianDefaultColor (snesj2.c:79) ==236074== by 0x6FC8E4E: SNESComputeJacobian (snes.c:2717) ==236074== by 0x70236E7: SNESSolve_NEWTONTR (tr.c:324) ==236074== by 0x6FD160F: SNESSolve (snes.c:4569) ==236074== by 0x711917E: PetscMemcpy (bdf.c:223) ==236074== by 0x711917E: PetscCitationsRegister (petscsys.h:2689) ==236074== by 0x711917E: TSStep_BDF.A (bdf.c:265) ==236074== by 0x70C363A: TSStep (ts.c:3757) ==236074== by 0x70C1999: TSSolve (ts.c:4154) ==236074== by 0x402594: main (one.c:391) ==236074== Address 0x3a94fa80 is 0 bytes after a block of size 73,960,000 alloc'd ==236074== at 0x4C2C480: memalign (vg_replace_malloc.c:909) ==236074== by 0x522FFE2: PetscMallocAlign (mal.c:52) ==236074== by 0x52305F9: PetscMallocA (mal.c:418) ==236074== by 0x5F10778: MatFDColoringSetUpBlocked_AIJ_Private (fdaij.c:125) ==236074== by 0x5F10778: MatFDColoringSetUp_SeqXAIJ.A (fdaij.c:284) ==236074== by 0x585892A: MatFDColoringSetUp (fdmatrix.c:242) ==236074== by 0x6FE5037: SNESComputeJacobianDefaultColor (snesj2.c:79) ==236074== by 0x6FC8E4E: SNESComputeJacobian (snes.c:2717) ==236074== by 0x70236E7: SNESSolve_NEWTONTR (tr.c:324) ==236074== by 0x6FD160F: SNESSolve (snes.c:4569) ==236074== by 0x711917E: PetscMemcpy (bdf.c:223) ==236074== by 0x711917E: PetscCitationsRegister (petscsys.h:2689) ==236074== by 0x711917E: TSStep_BDF.A (bdf.c:265) ==236074== by 0x70C363A: TSStep (ts.c:3757) ==236074== by 0x70C1999: TSSolve (ts.c:4154) ==236074== by 0x402594: main (one.c:391) ==236074== ==236074== Invalid write of size 8 ==236074== at 0x5F10991: MatFDColoringSetUpBlocked_AIJ_Private (fdaij.c:151) ==236074== by 0x5F10991: MatFDColoringSetUp_SeqXAIJ.A (fdaij.c:284) ==236074== by 0x585892A: MatFDColoringSetUp (fdmatrix.c:242) ==236074== by 0x6FE5037: SNESComputeJacobianDefaultColor (snesj2.c:79) ==236074== by 0x6FC8E4E: SNESComputeJacobian (snes.c:2717) ==236074== by 0x70236E7: SNESSolve_NEWTONTR (tr.c:324) ==236074== by 0x6FD160F: SNESSolve (snes.c:4569) ==236074== by 0x711917E: 
PetscMemcpy (bdf.c:223) ==236074== by 0x711917E: PetscCitationsRegister (petscsys.h:2689) ==236074== by 0x711917E: TSStep_BDF.A (bdf.c:265) ==236074== by 0x70C363A: TSStep (ts.c:3757) ==236074== by 0x70C1999: TSSolve (ts.c:4154) ==236074== by 0x402594: main (one.c:391) ==236074== Address 0x3a94fa88 is 8 bytes after a block of size 73,960,000 alloc'd ==236074== at 0x4C2C480: memalign (vg_replace_malloc.c:909) ==236074== by 0x522FFE2: PetscMallocAlign (mal.c:52) ==236074== by 0x52305F9: PetscMallocA (mal.c:418) ==236074== by 0x5F10778: MatFDColoringSetUpBlocked_AIJ_Private (fdaij.c:125) ==236074== by 0x5F10778: MatFDColoringSetUp_SeqXAIJ.A (fdaij.c:284) ==236074== by 0x585892A: MatFDColoringSetUp (fdmatrix.c:242) ==236074== by 0x6FE5037: SNESComputeJacobianDefaultColor (snesj2.c:79) ==236074== by 0x6FC8E4E: SNESComputeJacobian (snes.c:2717) ==236074== by 0x70236E7: SNESSolve_NEWTONTR (tr.c:324) ==236074== by 0x6FD160F: SNESSolve (snes.c:4569) ==236074== by 0x711917E: PetscMemcpy (bdf.c:223) ==236074== by 0x711917E: PetscCitationsRegister (petscsys.h:2689) ==236074== by 0x711917E: TSStep_BDF.A (bdf.c:265) ==236074== by 0x70C363A: TSStep (ts.c:3757) ==236074== by 0x70C1999: TSSolve (ts.c:4154) ==236074== by 0x402594: main (one.c:391) ==236074==
Sent from Mail for Windows
-- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/
-------------- next part --------------
An HTML attachment was scrubbed... URL: 
-------------- next part --------------
An embedded and charset-unspecified text was scrubbed... Name: common.c URL: 
-------------- next part --------------
An embedded and charset-unspecified text was scrubbed... Name: common.h URL: 
-------------- next part --------------
A non-text attachment was scrubbed... Name: makefile Type: application/octet-stream Size: 239 bytes Desc: makefile URL: 
-------------- next part --------------
An embedded and charset-unspecified text was scrubbed... Name: one.c URL: 
From zhaog6 at lsec.cc.ac.cn Thu Oct 20 20:35:38 2022
From: zhaog6 at lsec.cc.ac.cn (=?UTF-8?B?6LW15Yia?=)
Date: Fri, 21 Oct 2022 09:35:38 +0800 (GMT+08:00)
Subject: [petsc-users] An issue of Interior-Point Methods in TAO
In-Reply-To: <1DFAEEDE-F238-49EB-B3A9-F8ACED478DB0@petsc.dev>
References: <52f60c40.cfe0.183ef42e3f1.Coremail.zhaog6@lsec.cc.ac.cn> <1DFAEEDE-F238-49EB-B3A9-F8ACED478DB0@petsc.dev>
Message-ID: 
I see. Thank you for your prompt reply.
Best Regards,
Gang
> -----Original Message-----
> From: "Barry Smith" 
> Sent: 2022-10-19 22:50:16 (Wednesday)
> To: "赵刚" , "Munson, Todd" 
> Cc: petsc-users at mcs.anl.gov
> Subject: Re: [petsc-users] An issue of Interior-Point Methods in TAO
> 
> 
> It looks like it was started as a general framework for interior point methods but never finished?
> 
> There is a note on its manual page "This algorithm is more of a place-holder for future constrained optimization algorithms and should not yet be used for large problems or production code."
> 
> You are welcome to look at the source code and play with it but you should consider it unfinished and unsupported.
> 
> Barry
> 
> 
> > On Oct 19, 2022, at 4:01 AM, 赵刚 wrote:
> > 
> > Dear PETSc/TAO team,
> > 
> > I am using an interior-point method in TAO to solve a quadratic programming problem with bound constraints. I noticed that TAO includes three interior-point methods: the Mehrotra Predictor-Corrector Method (bqpip), the Primal-Dual Interior-Point Method (pdipm), and "ipm". I'd like to ask which interior-point method is implemented by "-tao_type ipm", thank you.
> > 
> > 
> > Best Regards,
> > Gang
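Since the question above concerns a bound-constrained QP and "ipm" is the unfinished framework Barry describes, a minimal sketch of driving one of the supported bound-constrained types instead is given below. This is an outline only: SolveBoundedQP and FormObjGrad are illustrative names (not from the thread), and the setter names used here (TaoSetSolution, TaoSetObjectiveAndGradient) follow the PETSc 3.17+ interface and differ slightly in older releases.

/* Sketch only: bound-constrained QP  min 1/2 x'Qx + c'x  s.t.  lb <= x <= ub  with a
   supported TAO type. FormObjGrad is an assumed user callback returning f(x) and grad f(x). */
#include <petsctao.h>

extern PetscErrorCode FormObjGrad(Tao, Vec, PetscReal *, Vec, void *); /* assumed to exist */

PetscErrorCode SolveBoundedQP(Vec x, Vec lb, Vec ub, void *user)
{
  Tao tao;

  PetscFunctionBeginUser;
  PetscCall(TaoCreate(PETSC_COMM_WORLD, &tao));
  PetscCall(TaoSetType(tao, TAOBQPIP));         /* interior-point QP solver; blmvm is another bound-constrained option */
  PetscCall(TaoSetSolution(tao, x));            /* initial guess, overwritten with the solution */
  PetscCall(TaoSetObjectiveAndGradient(tao, NULL, FormObjGrad, user));
  PetscCall(TaoSetVariableBounds(tao, lb, ub)); /* the bound constraints lb <= x <= ub */
  PetscCall(TaoSetFromOptions(tao));            /* honors -tao_type, -tao_monitor, ... */
  PetscCall(TaoSolve(tao));
  PetscCall(TaoDestroy(&tao));
  PetscFunctionReturn(0);
}

The type can also be chosen at run time, e.g. ./app -tao_type bqpip -tao_monitor, so nothing needs to be hard-coded.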
From bsmith at petsc.dev Thu Oct 20 21:26:45 2022
From: bsmith at petsc.dev (Barry Smith)
Date: Thu, 20 Oct 2022 22:26:45 -0400
Subject: [petsc-users] Periodic boundary condition
In-Reply-To: References: Message-ID: 
Some of the valgrind information does not appear to make sense. PetscMemcpy() is not calling SNESSolve(), so I suspect there must be some serious corruption of something leading to this impossible stack trace
==236074== by 0x6FD160F: SNESSolve (snes.c:4569)
==236074== by 0x711917E: PetscMemcpy (bdf.c:223)
From
==236074== Conditional jump or move depends on uninitialised value(s)
==236074== at 0x5F10977: MatFDColoringSetUpBlocked_AIJ_Private (fdaij.c:146)
==236074== by 0x5F10977: MatFDColoringSetUp_SeqXAIJ.A (fdaij.c:284)
==236074== by 0x585892A: MatFDColoringSetUp (fdmatrix.c:242)
==236074== by 0x6FE5037: SNESComputeJacobianDefaultColor (snesj2.c:79)
==236074== by 0x6FC8E4E: SNESComputeJacobian (snes.c:2717)
==236074== by 0x70236E7: SNESSolve_NEWTONTR (tr.c:324)
==236074== by 0x6FD160F: SNESSolve (snes.c:4569)
==236074== by 0x711917E: PetscMemcpy (bdf.c:223)
==236074== by 0x711917E: PetscCitationsRegister (petscsys.h:2689)
==236074== by 0x711917E: TSStep_BDF.A (bdf.c:265)
==236074== by 0x70C363A: TSStep (ts.c:3757)
==236074== by 0x70C1999: TSSolve (ts.c:4154)
==236074== by 0x402594: main (one.c:391)
I suggest you run with -malloc_debug instead of valgrind and see if any errors are reported. If so, you can add the macro CHKMEMQ; inside your function evaluation where you write to memory to see if anything is writing to the wrong location. For example, wherever you assign aF, such as aF[j][i].vx=(x3+x4+x5+x6+x7+x8+x9-x1-x2)*user->hx; this can help you determine the exact line number where you are writing to the wrong location and determine what might be the cause.
> On Oct 20, 2022, at 6:45 PM, Sepideh Kavousi wrote:
> 
> Hello,
> I want to solve my 5 PDEs based on finite difference method using periodic BC in x-direction and non-periodic in y-direction but I run into error (Segmentation Violation, probably memory access out of range).
> For this, I discretize my equation in FormFunction function. My PDE discretization in (i,j) node needs data on (i+1,j), (i+2,j), (i-1,j), (i-2,j), (i,j+1), (i,j+2), (i,j-1), (i,j-2) points.
> In my previous codes that the x-direction was non-periodic (no flux) boundary condition, I:
> i) implemented the no flux BC for i=0 and i=Nx-1,
> ii) set i+2= Nx-1 in discretizing (Nx-2,j) and i+2= 0 in discretizing (1,j)
> iii) discretized my equation for i=1..Nx-2.
> I am not sure how I should do the periodic BC. From the following discussions (https://lists.mcs.anl.gov/pipermail/petsc-users/2012-May/013476.html and https://lists.mcs.anl.gov/pipermail/petsc-users/2016-May/029273.html), I guess I should not do step (i) (stated above) for the x-boundaries and just do step (iii) for i=0..Nx-1. If I just focus on solving 2 of the PDEs which does need data on (i+2,j), (i-2,j), (i,j+2), (i,j-2) points for discretizing equation in (i,j) node, I still run into error:
> Running with Valgrind (just 1 processor) gave the following file. I did not find any information which gives me hint on the error source.
> Can you please help me to find the error?
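A minimal sketch of the two points being made in this thread, assuming a DMDA-based code of the shape described above (the field names, Nx/Ny, and the stencil expression are placeholders, not the attached one.c): a DMDA that is periodic in x with a box stencil of width 2 already supplies the wrapped ghost values for the i+/-2 accesses, so the x-boundaries need no special casing, and CHKMEMQ placed next to the aF writes (run with -malloc_debug) localizes an out-of-range write.

/* Sketch only: 5 dof per node, periodic in x, non-periodic in y, stencil width 2 so
   aY[j][i-2] .. aY[j][i+2] are legal ghosted reads for every owned i. */
#include <petscdmda.h>
#include <petscts.h>

typedef struct { PetscScalar vx, vy, pp, p, U; } Field;

static PetscErrorCode CreateGrid(PetscInt Nx, PetscInt Ny, DM *da)
{
  PetscFunctionBeginUser;
  PetscCall(DMDACreate2d(PETSC_COMM_WORLD,
                         DM_BOUNDARY_PERIODIC, /* x: ghost values wrap around automatically      */
                         DM_BOUNDARY_NONE,     /* y: non-periodic, j-boundaries handled by user  */
                         DMDA_STENCIL_BOX, Nx, Ny, PETSC_DECIDE, PETSC_DECIDE,
                         5,                    /* dof: vx, vy, pp, p, U                           */
                         2,                    /* stencil width: permits i+/-2 and j+/-2 accesses */
                         NULL, NULL, da));
  PetscCall(DMSetFromOptions(*da));
  PetscCall(DMSetUp(*da));
  PetscFunctionReturn(0);
}

/* Shaped like a DMDATSSetIFunctionLocal() callback; only a placeholder for the p equation. */
static PetscErrorCode FormIFunctionLocal(DMDALocalInfo *info, PetscReal t, Field **aY, Field **aYdot, Field **aF, void *ctx)
{
  PetscInt i, j;

  PetscFunctionBeginUser;
  for (j = info->ys; j < info->ys + info->ym; j++) {
    for (i = info->xs; i < info->xs + info->xm; i++) { /* every owned i: no x special case needed */
      /* i+/-2 are valid reads because x is periodic with stencil width 2; rows within 2 of the
         y-boundaries still need the one-sided (no-flux) treatment before using j+/-2. */
      aF[j][i].p = aYdot[j][i].p - (aY[j][i+2].p + aY[j][i-2].p - 2.0 * aY[j][i].p); /* placeholder */
      CHKMEMQ; /* with -malloc_debug this validates the heap here, so a bad write is caught at once */
    }
  }
  PetscFunctionReturn(0);
}

With this layout, the special casing in step (i) above is only needed at the j-boundaries; the i loop runs over every owned column and the periodic wrap is handled entirely by the DMDA ghost updates.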
> Best, > Sepideh > > ==236074== Conditional jump or move depends on uninitialised value(s) > ==236074== at 0x4C29E39: malloc (vg_replace_malloc.c:309) > ==236074== by 0x1B79E59B: MPID_Init (mpid_init.c:1649) > ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) > ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) > ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) > ==236074== by 0x40219D: main (one.c:335) > ==236074== > ==236074== Conditional jump or move depends on uninitialised value(s) > ==236074== at 0x2183B805: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x218323C1: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x218341C7: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x400F9C2: _dl_init (in /usr/lib64/ld-2.17.so) > ==236074== by 0x401459D: dl_open_worker (in /usr/lib64/ld-2.17.so) > ==236074== by 0x400F7D3: _dl_catch_error (in /usr/lib64/ld-2.17.so) > ==236074== by 0x4013B8A: _dl_open (in /usr/lib64/ld-2.17.so) > ==236074== by 0x1AEA4FAA: dlopen_doit (in /usr/lib64/libdl-2.17.so) > ==236074== by 0x400F7D3: _dl_catch_error (in /usr/lib64/ld-2.17.so) > ==236074== by 0x1AEA55AC: _dlerror_run (in /usr/lib64/libdl-2.17.so) > ==236074== by 0x1AEA5040: dlopen@@GLIBC_2.2.5 (in /usr/lib64/libdl-2.17.so) > ==236074== by 0x1B8198DC: MPID_nem_ofi_init (ofi_init.c:158) > ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) > ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) > ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) > ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) > ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) > ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) > ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) > ==236074== by 0x40219D: main (one.c:335) > ==236074== > ==236074== Conditional jump or move depends on uninitialised value(s) > ==236074== at 0x2183B810: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x218323C1: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x218341C7: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x400F9C2: _dl_init (in /usr/lib64/ld-2.17.so) > ==236074== by 0x401459D: dl_open_worker (in /usr/lib64/ld-2.17.so) > ==236074== by 0x400F7D3: _dl_catch_error (in /usr/lib64/ld-2.17.so) > ==236074== by 0x4013B8A: _dl_open (in /usr/lib64/ld-2.17.so) > ==236074== by 0x1AEA4FAA: dlopen_doit (in /usr/lib64/libdl-2.17.so) > ==236074== by 0x400F7D3: _dl_catch_error (in /usr/lib64/ld-2.17.so) > ==236074== by 0x1AEA55AC: _dlerror_run (in /usr/lib64/libdl-2.17.so) > ==236074== by 0x1AEA5040: dlopen@@GLIBC_2.2.5 (in /usr/lib64/libdl-2.17.so) > ==236074== by 0x1B8198DC: MPID_nem_ofi_init (ofi_init.c:158) > ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) > ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) > ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) > ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) > ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) > ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) > ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) > ==236074== by 0x40219D: main (one.c:335) > ==236074== > ==236074== Conditional jump or move depends on uninitialised value(s) > ==236074== at 0x218323C8: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x218341C7: ??? 
(in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x400F9C2: _dl_init (in /usr/lib64/ld-2.17.so) > ==236074== by 0x401459D: dl_open_worker (in /usr/lib64/ld-2.17.so) > ==236074== by 0x400F7D3: _dl_catch_error (in /usr/lib64/ld-2.17.so) > ==236074== by 0x4013B8A: _dl_open (in /usr/lib64/ld-2.17.so) > ==236074== by 0x1AEA4FAA: dlopen_doit (in /usr/lib64/libdl-2.17.so) > ==236074== by 0x400F7D3: _dl_catch_error (in /usr/lib64/ld-2.17.so) > ==236074== by 0x1AEA55AC: _dlerror_run (in /usr/lib64/libdl-2.17.so) > ==236074== by 0x1AEA5040: dlopen@@GLIBC_2.2.5 (in /usr/lib64/libdl-2.17.so) > ==236074== by 0x1B8198DC: MPID_nem_ofi_init (ofi_init.c:158) > ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) > ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) > ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) > ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) > ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) > ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) > ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) > ==236074== by 0x40219D: main (one.c:335) > ==236074== > ==236074== Use of uninitialised value of size 8 > ==236074== at 0x218323CF: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x218341C7: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x400F9C2: _dl_init (in /usr/lib64/ld-2.17.so) > ==236074== by 0x401459D: dl_open_worker (in /usr/lib64/ld-2.17.so) > ==236074== by 0x400F7D3: _dl_catch_error (in /usr/lib64/ld-2.17.so) > ==236074== by 0x4013B8A: _dl_open (in /usr/lib64/ld-2.17.so) > ==236074== by 0x1AEA4FAA: dlopen_doit (in /usr/lib64/libdl-2.17.so) > ==236074== by 0x400F7D3: _dl_catch_error (in /usr/lib64/ld-2.17.so) > ==236074== by 0x1AEA55AC: _dlerror_run (in /usr/lib64/libdl-2.17.so) > ==236074== by 0x1AEA5040: dlopen@@GLIBC_2.2.5 (in /usr/lib64/libdl-2.17.so) > ==236074== by 0x1B8198DC: MPID_nem_ofi_init (ofi_init.c:158) > ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) > ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) > ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) > ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) > ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) > ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) > ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) > ==236074== by 0x40219D: main (one.c:335) > ==236074== > ==236074== Use of uninitialised value of size 8 > ==236074== at 0x218323E5: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x218341C7: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x400F9C2: _dl_init (in /usr/lib64/ld-2.17.so) > ==236074== by 0x401459D: dl_open_worker (in /usr/lib64/ld-2.17.so) > ==236074== by 0x400F7D3: _dl_catch_error (in /usr/lib64/ld-2.17.so) > ==236074== by 0x4013B8A: _dl_open (in /usr/lib64/ld-2.17.so) > ==236074== by 0x1AEA4FAA: dlopen_doit (in /usr/lib64/libdl-2.17.so) > ==236074== by 0x400F7D3: _dl_catch_error (in /usr/lib64/ld-2.17.so) > ==236074== by 0x1AEA55AC: _dlerror_run (in /usr/lib64/libdl-2.17.so) > ==236074== by 0x1AEA5040: dlopen@@GLIBC_2.2.5 (in /usr/lib64/libdl-2.17.so) > ==236074== by 0x1B8198DC: MPID_nem_ofi_init (ofi_init.c:158) > ==236074== by 0x1B7B5C55: ??? 
(mpid_nem_init.c:231) > ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) > ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) > ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) > ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) > ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) > ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) > ==236074== by 0x40219D: main (one.c:335) > ==236074== > ==236074== Conditional jump or move depends on uninitialised value(s) > ==236074== at 0x2183B805: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x218323C1: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x218343EC: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x2180E4F3: psm2_init (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x2112D7E6: psmx2_getinfo (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) > ==236074== by 0x210AC753: fi_getinfo@@FABRIC_1.2 (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) > ==236074== by 0x1B819AB7: MPID_nem_ofi_init (ofi_init.c:245) > ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) > ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) > ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) > ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) > ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) > ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) > ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) > ==236074== by 0x40219D: main (one.c:335) > ==236074== > ==236074== Conditional jump or move depends on uninitialised value(s) > ==236074== at 0x2183B810: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x218323C1: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x218343EC: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x2180E4F3: psm2_init (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x2112D7E6: psmx2_getinfo (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) > ==236074== by 0x210AC753: fi_getinfo@@FABRIC_1.2 (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) > ==236074== by 0x1B819AB7: MPID_nem_ofi_init (ofi_init.c:245) > ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) > ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) > ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) > ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) > ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) > ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) > ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) > ==236074== by 0x40219D: main (one.c:335) > ==236074== > ==236074== Conditional jump or move depends on uninitialised value(s) > ==236074== at 0x218323C8: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x218343EC: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x2180E4F3: psm2_init (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x2112D7E6: psmx2_getinfo (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) > ==236074== by 0x210AC753: fi_getinfo@@FABRIC_1.2 (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) > ==236074== by 0x1B819AB7: MPID_nem_ofi_init (ofi_init.c:245) > ==236074== by 0x1B7B5C55: ??? 
(mpid_nem_init.c:231) > ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) > ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) > ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) > ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) > ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) > ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) > ==236074== by 0x40219D: main (one.c:335) > ==236074== > ==236074== Use of uninitialised value of size 8 > ==236074== at 0x218323CF: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x218343EC: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x2180E4F3: psm2_init (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x2112D7E6: psmx2_getinfo (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) > ==236074== by 0x210AC753: fi_getinfo@@FABRIC_1.2 (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) > ==236074== by 0x1B819AB7: MPID_nem_ofi_init (ofi_init.c:245) > ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) > ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) > ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) > ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) > ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) > ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) > ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) > ==236074== by 0x40219D: main (one.c:335) > ==236074== > ==236074== Use of uninitialised value of size 8 > ==236074== at 0x218323E5: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x218343EC: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x2180E4F3: psm2_init (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x2112D7E6: psmx2_getinfo (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) > ==236074== by 0x210AC753: fi_getinfo@@FABRIC_1.2 (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) > ==236074== by 0x1B819AB7: MPID_nem_ofi_init (ofi_init.c:245) > ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) > ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) > ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) > ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) > ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) > ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) > ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) > ==236074== by 0x40219D: main (one.c:335) > ==236074== > ==236074== Conditional jump or move depends on uninitialised value(s) > ==236074== at 0x2183B69A: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x218371EC: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x21837077: hfi_get_port_lid (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x21836F9A: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x21834872: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x217F7F5D: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x217FDC29: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x217FCA91: psm2_ep_open (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x2112FB19: psmx2_trx_ctxt_alloc (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) > ==236074== by 0x21138089: psmx2_ep_open (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) > ==236074== by 0x1B81A6A1: fi_endpoint (fi_endpoint.h:155) > ==236074== by 0x1B81A6A1: MPID_nem_ofi_init (ofi_init.c:377) > ==236074== by 0x1B7B5C55: ??? 
(mpid_nem_init.c:231) > ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) > ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) > ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) > ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) > ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) > ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) > ==236074== by 0x40219D: main (one.c:335) > ==236074== > ==236074== Conditional jump or move depends on uninitialised value(s) > ==236074== at 0x2183B7B8: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x218371EC: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x21837077: hfi_get_port_lid (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x21836F9A: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x21834872: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x217F7F5D: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x217FDC29: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x217FCA91: psm2_ep_open (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x2112FB19: psmx2_trx_ctxt_alloc (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) > ==236074== by 0x21138089: psmx2_ep_open (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) > ==236074== by 0x1B81A6A1: fi_endpoint (fi_endpoint.h:155) > ==236074== by 0x1B81A6A1: MPID_nem_ofi_init (ofi_init.c:377) > ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) > ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) > ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) > ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) > ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) > ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) > ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) > ==236074== by 0x40219D: main (one.c:335) > ==236074== > ==236074== Conditional jump or move depends on uninitialised value(s) > ==236074== at 0x2183B69A: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x218371EC: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x21837077: hfi_get_port_lid (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x217F88C8: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x217FDC29: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x217FCA91: psm2_ep_open (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x2112FB19: psmx2_trx_ctxt_alloc (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) > ==236074== by 0x21138089: psmx2_ep_open (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) > ==236074== by 0x1B81A6A1: fi_endpoint (fi_endpoint.h:155) > ==236074== by 0x1B81A6A1: MPID_nem_ofi_init (ofi_init.c:377) > ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) > ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) > ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) > ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) > ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) > ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) > ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) > ==236074== by 0x40219D: main (one.c:335) > ==236074== > ==236074== Conditional jump or move depends on uninitialised value(s) > ==236074== at 0x2183B7B8: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x218371EC: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x21837077: hfi_get_port_lid (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x217F88C8: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x217FDC29: ??? 
(in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x217FCA91: psm2_ep_open (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x2112FB19: psmx2_trx_ctxt_alloc (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) > ==236074== by 0x21138089: psmx2_ep_open (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) > ==236074== by 0x1B81A6A1: fi_endpoint (fi_endpoint.h:155) > ==236074== by 0x1B81A6A1: MPID_nem_ofi_init (ofi_init.c:377) > ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) > ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) > ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) > ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) > ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) > ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) > ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) > ==236074== by 0x40219D: main (one.c:335) > ==236074== > ==236074== Conditional jump or move depends on uninitialised value(s) > ==236074== at 0x2183B69A: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x218371EC: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x21837077: hfi_get_port_lid (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x217F916B: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x217FDC29: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x217FCA91: psm2_ep_open (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x2112FB19: psmx2_trx_ctxt_alloc (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) > ==236074== by 0x21138089: psmx2_ep_open (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) > ==236074== by 0x1B81A6A1: fi_endpoint (fi_endpoint.h:155) > ==236074== by 0x1B81A6A1: MPID_nem_ofi_init (ofi_init.c:377) > ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) > ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) > ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) > ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) > ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) > ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) > ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) > ==236074== by 0x40219D: main (one.c:335) > ==236074== > ==236074== Conditional jump or move depends on uninitialised value(s) > ==236074== at 0x2183B7B8: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x218371EC: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x21837077: hfi_get_port_lid (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x217F916B: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x217FDC29: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x217FCA91: psm2_ep_open (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x2112FB19: psmx2_trx_ctxt_alloc (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) > ==236074== by 0x21138089: psmx2_ep_open (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) > ==236074== by 0x1B81A6A1: fi_endpoint (fi_endpoint.h:155) > ==236074== by 0x1B81A6A1: MPID_nem_ofi_init (ofi_init.c:377) > ==236074== by 0x1B7B5C55: ??? 
(mpid_nem_init.c:231) > ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) > ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) > ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) > ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) > ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) > ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) > ==236074== by 0x40219D: main (one.c:335) > ==236074== > ==236074== Conditional jump or move depends on uninitialised value(s) > ==236074== at 0x1B1DA260: __I_MPI___intel_sse2_strncmp (in /opt/intel/compilers_and_libraries_2018.2.199/linux/mpi/intel64/lib/libmpifort.so.12.0) > ==236074== by 0x1B8CFBA1: ??? (simple_pmi.c:2376) > ==236074== by 0x1B8CBDAD: PMIi_InitIfSingleton (simple_pmi.c:2883) > ==236074== by 0x1B8CBDAD: iPMI_KVS_Get (simple_pmi.c:751) > ==236074== by 0x1B7CCC1E: ??? (mpidi_pg.c:949) > ==236074== by 0x1B817EAA: MPID_nem_ofi_post_init (ofi_init.c:1736) > ==236074== by 0x1B7B3575: MPID_nem_init_post (mpid_nem_init.c:1421) > ==236074== by 0x1B5806E3: MPIDI_CH3_Init (ch3_init.c:146) > ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) > ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) > ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) > ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) > ==236074== by 0x40219D: main (one.c:335) > ==236074== > ==236074== Conditional jump or move depends on uninitialised value(s) > ==236074== at 0x1B1DA383: __I_MPI___intel_sse2_strncmp (in /opt/intel/compilers_and_libraries_2018.2.199/linux/mpi/intel64/lib/libmpifort.so.12.0) > ==236074== by 0x1B8CFBA1: ??? (simple_pmi.c:2376) > ==236074== by 0x1B8CBDAD: PMIi_InitIfSingleton (simple_pmi.c:2883) > ==236074== by 0x1B8CBDAD: iPMI_KVS_Get (simple_pmi.c:751) > ==236074== by 0x1B7CCC1E: ??? 
(mpidi_pg.c:949) > ==236074== by 0x1B817EAA: MPID_nem_ofi_post_init (ofi_init.c:1736) > ==236074== by 0x1B7B3575: MPID_nem_init_post (mpid_nem_init.c:1421) > ==236074== by 0x1B5806E3: MPIDI_CH3_Init (ch3_init.c:146) > ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) > ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) > ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) > ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) > ==236074== by 0x40219D: main (one.c:335) > ==236074== > ==236074== Conditional jump or move depends on uninitialised value(s) > ==236074== at 0x1E48032E: __intel_sse4_strcpy (in /opt/intel/compilers_and_libraries_2018.2.199/linux/compiler/lib/intel64_lin/libirc.so) > ==236074== by 0x51FD8BE: PetscStrcpy (str.c:354) > ==236074== by 0x51FD7A3: PetscStrallocpy (str.c:188) > ==236074== by 0x52A39CE: PetscEventRegLogRegister (eventlog.c:313) > ==236074== by 0x527D89A: PetscLogEventRegister (plog.c:693) > ==236074== by 0x6A56A20: PCBDDCInitializePackage (bddc.c:3115) > ==236074== by 0x6E1A515: PCInitializePackage (dlregisksp.c:59) > ==236074== by 0x6DB1A86: PCCreate (precon.c:382) > ==236074== by 0x6E05167: KSPGetPC (itfunc.c:1837) > ==236074== by 0x6E0FC5C: KSPSetDM (iterativ.c:1150) > ==236074== by 0x6FDD27B: SNESSetDM (snes.c:5402) > ==236074== by 0x70B85F7: TSGetSNES (ts.c:2914) > ==236074== by 0x70BE430: TSSetDM (ts.c:4949) > ==236074== by 0x402496: main (one.c:378) > ==236074== > ==236074== Conditional jump or move depends on uninitialised value(s) > ==236074== at 0x1E4782BA: __intel_ssse3_strncpy (in /opt/intel/compilers_and_libraries_2018.2.199/linux/compiler/lib/intel64_lin/libirc.so) > ==236074== by 0x51FFD24: PetscStrncpy (str.c:392) > ==236074== by 0x51FEB03: PetscStrreplace (str.c:1142) > ==236074== by 0x52C9958: PetscViewerFileSetName (filev.c:659) > ==236074== by 0x52B743B: PetscViewerVTKOpen (vtkv.c:279) > ==236074== by 0x70C76E6: TSMonitorSolutionVTK (ts.c:5580) > ==236074== by 0x40313C: FormFunction (one.c:120) > ==236074== by 0x7066531: TSComputeIFunction_DMDA (dmdats.c:82) > ==236074== by 0x70BA5EF: TSComputeIFunction (ts.c:857) > ==236074== by 0x711E2DC: SNESTSFormFunction_BDF (bdf.c:368) > ==236074== by 0x70C6E46: SNESTSFormFunction (ts.c:5014) > ==236074== by 0x6FDC8A6: SNESComputeFunction (snes.c:2383) > ==236074== by 0x7023556: SNESSolve_NEWTONTR (tr.c:297) > ==236074== by 0x6FD160F: SNESSolve (snes.c:4569) > ==236074== by 0x711917E: PetscMemcpy (bdf.c:223) > ==236074== by 0x711917E: PetscCitationsRegister (petscsys.h:2689) > ==236074== by 0x711917E: TSStep_BDF.A (bdf.c:265) > ==236074== by 0x70C363A: TSStep (ts.c:3757) > ==236074== by 0x70C1999: TSSolve (ts.c:4154) > ==236074== by 0x402594: main (one.c:391) > ==236074== > ==236074== Conditional jump or move depends on uninitialised value(s) > ==236074== at 0x1E4782BA: __intel_ssse3_strncpy (in /opt/intel/compilers_and_libraries_2018.2.199/linux/compiler/lib/intel64_lin/libirc.so) > ==236074== by 0x51FFD24: PetscStrncpy (str.c:392) > ==236074== by 0x51FEB03: PetscStrreplace (str.c:1142) > ==236074== by 0x5224E4B: PetscFOpen (mpiuopen.c:52) > ==236074== by 0x63A074B: DMDAVTKWriteAll_VTS.A (grvtk.c:72) > ==236074== by 0x639A589: DMDAVTKWriteAll (grvtk.c:545) > ==236074== by 0x52B66F3: PetscViewerFlush_VTK (vtkv.c:100) > ==236074== by 0x52CFAAE: PetscViewerFlush (flush.c:26) > ==236074== by 0x52CEA95: PetscViewerDestroy (view.c:113) > ==236074== by 0x70C7717: TSMonitorSolutionVTK (ts.c:5582) > ==236074== by 0x40313C: FormFunction (one.c:120) > ==236074== by 0x7066531: 
TSComputeIFunction_DMDA (dmdats.c:82) > ==236074== by 0x70BA5EF: TSComputeIFunction (ts.c:857) > ==236074== by 0x711E2DC: SNESTSFormFunction_BDF (bdf.c:368) > ==236074== by 0x70C6E46: SNESTSFormFunction (ts.c:5014) > ==236074== by 0x6FDC8A6: SNESComputeFunction (snes.c:2383) > ==236074== by 0x7023556: SNESSolve_NEWTONTR (tr.c:297) > ==236074== by 0x6FD160F: SNESSolve (snes.c:4569) > ==236074== by 0x711917E: PetscMemcpy (bdf.c:223) > ==236074== by 0x711917E: PetscCitationsRegister (petscsys.h:2689) > ==236074== by 0x711917E: TSStep_BDF.A (bdf.c:265) > ==236074== by 0x70C363A: TSStep (ts.c:3757) > ==236074== > ==236074== Conditional jump or move depends on uninitialised value(s) > ==236074== at 0x5F10977: MatFDColoringSetUpBlocked_AIJ_Private (fdaij.c:146) > ==236074== by 0x5F10977: MatFDColoringSetUp_SeqXAIJ.A (fdaij.c:284) > ==236074== by 0x585892A: MatFDColoringSetUp (fdmatrix.c:242) > ==236074== by 0x6FE5037: SNESComputeJacobianDefaultColor (snesj2.c:79) > ==236074== by 0x6FC8E4E: SNESComputeJacobian (snes.c:2717) > ==236074== by 0x70236E7: SNESSolve_NEWTONTR (tr.c:324) > ==236074== by 0x6FD160F: SNESSolve (snes.c:4569) > ==236074== by 0x711917E: PetscMemcpy (bdf.c:223) > ==236074== by 0x711917E: PetscCitationsRegister (petscsys.h:2689) > ==236074== by 0x711917E: TSStep_BDF.A (bdf.c:265) > ==236074== by 0x70C363A: TSStep (ts.c:3757) > ==236074== by 0x70C1999: TSSolve (ts.c:4154) > ==236074== by 0x402594: main (one.c:391) > ==236074== > ==236074== Invalid write of size 4 > ==236074== at 0x5F10983: MatFDColoringSetUpBlocked_AIJ_Private (fdaij.c:150) > ==236074== by 0x5F10983: MatFDColoringSetUp_SeqXAIJ.A (fdaij.c:284) > ==236074== by 0x585892A: MatFDColoringSetUp (fdmatrix.c:242) > ==236074== by 0x6FE5037: SNESComputeJacobianDefaultColor (snesj2.c:79) > ==236074== by 0x6FC8E4E: SNESComputeJacobian (snes.c:2717) > ==236074== by 0x70236E7: SNESSolve_NEWTONTR (tr.c:324) > ==236074== by 0x6FD160F: SNESSolve (snes.c:4569) > ==236074== by 0x711917E: PetscMemcpy (bdf.c:223) > ==236074== by 0x711917E: PetscCitationsRegister (petscsys.h:2689) > ==236074== by 0x711917E: TSStep_BDF.A (bdf.c:265) > ==236074== by 0x70C363A: TSStep (ts.c:3757) > ==236074== by 0x70C1999: TSSolve (ts.c:4154) > ==236074== by 0x402594: main (one.c:391) > ==236074== Address 0x3a94fa80 is 0 bytes after a block of size 73,960,000 alloc'd > ==236074== at 0x4C2C480: memalign (vg_replace_malloc.c:909) > ==236074== by 0x522FFE2: PetscMallocAlign (mal.c:52) > ==236074== by 0x52305F9: PetscMallocA (mal.c:418) > ==236074== by 0x5F10778: MatFDColoringSetUpBlocked_AIJ_Private (fdaij.c:125) > ==236074== by 0x5F10778: MatFDColoringSetUp_SeqXAIJ.A (fdaij.c:284) > ==236074== by 0x585892A: MatFDColoringSetUp (fdmatrix.c:242) > ==236074== by 0x6FE5037: SNESComputeJacobianDefaultColor (snesj2.c:79) > ==236074== by 0x6FC8E4E: SNESComputeJacobian (snes.c:2717) > ==236074== by 0x70236E7: SNESSolve_NEWTONTR (tr.c:324) > ==236074== by 0x6FD160F: SNESSolve (snes.c:4569) > ==236074== by 0x711917E: PetscMemcpy (bdf.c:223) > ==236074== by 0x711917E: PetscCitationsRegister (petscsys.h:2689) > ==236074== by 0x711917E: TSStep_BDF.A (bdf.c:265) > ==236074== by 0x70C363A: TSStep (ts.c:3757) > ==236074== by 0x70C1999: TSSolve (ts.c:4154) > ==236074== by 0x402594: main (one.c:391) > ==236074== > ==236074== Invalid write of size 8 > ==236074== at 0x5F10991: MatFDColoringSetUpBlocked_AIJ_Private (fdaij.c:151) > ==236074== by 0x5F10991: MatFDColoringSetUp_SeqXAIJ.A (fdaij.c:284) > ==236074== by 0x585892A: MatFDColoringSetUp (fdmatrix.c:242) > ==236074== by 
0x6FE5037: SNESComputeJacobianDefaultColor (snesj2.c:79) > ==236074== by 0x6FC8E4E: SNESComputeJacobian (snes.c:2717) > ==236074== by 0x70236E7: SNESSolve_NEWTONTR (tr.c:324) > ==236074== by 0x6FD160F: SNESSolve (snes.c:4569) > ==236074== by 0x711917E: PetscMemcpy (bdf.c:223) > ==236074== by 0x711917E: PetscCitationsRegister (petscsys.h:2689) > ==236074== by 0x711917E: TSStep_BDF.A (bdf.c:265) > ==236074== by 0x70C363A: TSStep (ts.c:3757) > ==236074== by 0x70C1999: TSSolve (ts.c:4154) > ==236074== by 0x402594: main (one.c:391) > ==236074== Address 0x3a94fa88 is 8 bytes after a block of size 73,960,000 alloc'd > ==236074== at 0x4C2C480: memalign (vg_replace_malloc.c:909) > ==236074== by 0x522FFE2: PetscMallocAlign (mal.c:52) > ==236074== by 0x52305F9: PetscMallocA (mal.c:418) > ==236074== by 0x5F10778: MatFDColoringSetUpBlocked_AIJ_Private (fdaij.c:125) > ==236074== by 0x5F10778: MatFDColoringSetUp_SeqXAIJ.A (fdaij.c:284) > ==236074== by 0x585892A: MatFDColoringSetUp (fdmatrix.c:242) > ==236074== by 0x6FE5037: SNESComputeJacobianDefaultColor (snesj2.c:79) > ==236074== by 0x6FC8E4E: SNESComputeJacobian (snes.c:2717) > ==236074== by 0x70236E7: SNESSolve_NEWTONTR (tr.c:324) > ==236074== by 0x6FD160F: SNESSolve (snes.c:4569) > ==236074== by 0x711917E: PetscMemcpy (bdf.c:223) > ==236074== by 0x711917E: PetscCitationsRegister (petscsys.h:2689) > ==236074== by 0x711917E: TSStep_BDF.A (bdf.c:265) > ==236074== by 0x70C363A: TSStep (ts.c:3757) > ==236074== by 0x70C1999: TSSolve (ts.c:4154) > ==236074== by 0x402594: main (one.c:391) > ==236074== > > > Sent from Mail for Windows -------------- next part -------------- An HTML attachment was scrubbed... URL: From kavousi at mines.edu Thu Oct 20 23:32:36 2022 From: kavousi at mines.edu (Sepideh Kavousi) Date: Fri, 21 Oct 2022 04:32:36 +0000 Subject: [petsc-users] [External] Re: Periodic boundary condition In-Reply-To: References: Message-ID: Barry, I ran the code with -malloc_debug and added CHKMEMQ for all the lines inside formfunction. Following is the detail of error. Best, Sepideh [0]PETSC ERROR: ------------------------------------------------------------------------ [0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range [0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger [0]PETSC ERROR: or see https://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind [0]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors [0]PETSC ERROR: configure using --with-debugging=yes, recompile, link, and run [0]PETSC ERROR: to get more information on the crash. [0]PETSC ERROR: PetscMallocValidate: error detected at PetscError() line 401 in /home1/apps/intel18/impi18_0/petsc/3.14/src/sys/error/err.c [0]PETSC ERROR: Memory [id=0(73960000)] at address 0x2b5aed6ab050 is corrupted (probably write past end of array) [0]PETSC ERROR: Memory originally allocated in MatFDColoringSetUpBlocked_AIJ_Private() line 125 in /home1/apps/intel18/impi18_0/petsc/3.14/src/mat/impls/aij/seq/fdaij.c [0]PETSC ERROR: --------------------- Error Message -------------------------------------------------------------- [0]PETSC ERROR: Signal received [0]PETSC ERROR: See https://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting. 
[0]PETSC ERROR: Petsc Release Version 3.14.2, Dec 03, 2020 [0]PETSC ERROR: ./one.out on a skylake named c415-063.stampede2.tacc.utexas.edu by tg863649 Thu Oct 20 23:30:05 2022 [0]PETSC ERROR: Configure options --with-x=0 -with-pic --with-make-np=12 --download-petsc4py=1 --with-python-exec=/opt/apps/intel18/python2/2.7.16/bin/python2 --with-packages-build-dir=/tmp/petsc-3.14/skylake --with-mpi=1 --with-mpi-dir=/opt/intel/compilers_and_libraries_2018.2.199/linux/mpi/intel64 --with-scalar-type=real --with-shared-libraries=1 --with-precision=double --with-chaco=1 --download-chaco --with-hypre=1 --download-hypre --with-parmetis=1 --download-parmetis --with-metis=1 --download-metis --with-plapack=1 --download-plapack --with-spai=1 --download-spai --with-sundials=1 --download-sundials --with-elemental=1 --download-elemental --with-cxx-dialect=C++11 --with-parmetis=1 --download-parmetis --with-metis=1 --download-metis --with-fftw=1 --with-fftw-dir=/opt/apps/intel18/impi18_0/fftw3/3.3.8 --with-hdf5=1 --with-hdf5-dir=/opt/apps/intel18/impi18_0/phdf5/1.10.4/x86_64 --download-hpddm --download-slepc --with-mumps=1 --download-mumps --with-parmetis=1 --download-parmetis --with-metis=1 --download-metis --with-scalapack=1 --download-scalapack --with-blacs=1 --download-blacs --with-spooles=1 --download-spooles --with-suitesparse=1 --download-suitesparse --with-superlu_dist=1 --download-superlu_dist --with-superlu=1 --download-superlu --with-parmetis=1 --download-parmetis --with-metis=1 --download-metis --with-zoltan=1 --download-zoltan=1 --download-ptscotch=1 --with-debugging=no --LIBS= --with-blaslapack-dir=/opt/intel/compilers_and_libraries_2018.2.199/linux/mkl COPTFLAGS="-xCORE-AVX2 -axMIC-AVX512,CORE-AVX512 -O2 -g" FOPTFLAGS="-xCORE-AVX2 -axMIC-AVX512,CORE-AVX512 -O2 -g" CXXOPTFLAGS="-xCORE-AVX2 -axMIC-AVX512,CORE-AVX512 -O2 -g" [0]PETSC ERROR: #1 User provided function() line 0 in unknown file [0]PETSC ERROR: Checking the memory for corruption. 
[0]PETSC ERROR: PetscMallocValidate: error detected at PetscSignalHandlerDefault() line 170 in /home1/apps/intel18/impi18_0/petsc/3.14/src/sys/error/signal.c [0]PETSC ERROR: Memory [id=0(73960000)] at address 0x2b5aed6ab050 is corrupted (probably write past end of array) [0]PETSC ERROR: Memory originally allocated in MatFDColoringSetUpBlocked_AIJ_Private() line 125 in /home1/apps/intel18/impi18_0/petsc/3.14/src/mat/impls/aij/seq/fdaij.c application called MPI_Abort(MPI_COMM_WORLD, 50176059) - process 0 [unset]: readline failed Sent from Mail for Windows From: Barry Smith Sent: Thursday, October 20, 2022 10:27 PM To: Sepideh Kavousi Cc: petsc-users at mcs.anl.gov Subject: [External] Re: [petsc-users] Periodic boundary condition Some of the valgrind information does not appear to make sense PetscMemcpy() is not calling SNESSolve() so I suspect there must be some serious corruption of something to this impossible stack trace ==236074== by 0x6FD160F: SNESSolve (snes.c:4569) ==236074== by 0x711917E: PetscMemcpy (bdf.c:223) From ==236074== Conditional jump or move depends on uninitialised value(s) ==236074== at 0x5F10977: MatFDColoringSetUpBlocked_AIJ_Private (fdaij.c:146) ==236074== by 0x5F10977: MatFDColoringSetUp_SeqXAIJ.A (fdaij.c:284) ==236074== by 0x585892A: MatFDColoringSetUp (fdmatrix.c:242) ==236074== by 0x6FE5037: SNESComputeJacobianDefaultColor (snesj2.c:79) ==236074== by 0x6FC8E4E: SNESComputeJacobian (snes.c:2717) ==236074== by 0x70236E7: SNESSolve_NEWTONTR (tr.c:324) ==236074== by 0x6FD160F: SNESSolve (snes.c:4569) ==236074== by 0x711917E: PetscMemcpy (bdf.c:223) ==236074== by 0x711917E: PetscCitationsRegister (petscsys.h:2689) ==236074== by 0x711917E: TSStep_BDF.A (bdf.c:265) ==236074== by 0x70C363A: TSStep (ts.c:3757) ==236074== by 0x70C1999: TSSolve (ts.c:4154) ==236074== by 0x402594: main (one.c:391) I suggest you run with -malloc_debug instead of valgrind and see if any errors are reported. If so you can add the macro CHKMEMQ; inside your function evaluation where you write to memory to see if anything is writing to the wrong location. For example wherever you assign aF such as aF[j][i].vx=(x3+x4+x5+x6+x7+x8+x9-x1-x2)*user->hx; this can help you determine the exact line number where you are writing to the wrong location and determine what might be the cause. On Oct 20, 2022, at 6:45 PM, Sepideh Kavousi > wrote: Hello, I want to solve my 5 PDEs based on finite difference method using periodic BC in x-direction and non-periodic in y-direction but I run into error (Segmentation Violation, probably memory access out of range). For this, I discretize my equation in FormFunction function. My PDE discretization in (i,j) node needs data on (i+1,j), (i+2,j), (i-1,j), (i-2,j), (i,j+1), (i,j+2), (i,j-1), (i,j-2) points. In my previous codes that the x-direction was non-periodic (no flux) boundary condition, I: i) implemented the no flux BC for i=0 and i=Nx-1, ii) set i+2= Nx-1 in discretizing (Nx-2,j) and i+2= 0 in discretizing (1,j) iii) discretized my equation for i=1..Nx-2. I am not sure how I should do the periodic BC. From the following discussions (https://lists.mcs.anl.gov/pipermail/petsc-users/2012-May/013476.html andhttps://lists.mcs.anl.gov/pipermail/petsc-users/2016-May/029273.html), I guess I should not do step (i) (stated above) for the x-boundaries and just do step (iii) for i=0..Nx-1. 
If I just focus on solving 2 of the PDEs which does need data on (i+2,j), (i-2,j), (i,j+2), (i,j-2) points for discretizing equation in (i,j) node, I still run into error: Running with Valgrind (just 1 processor) gave the following file. I did not find any information which gives me hint on the error source. Can you please help me to find the error? Best, Sepideh ==236074== Conditional jump or move depends on uninitialised value(s) ==236074== at 0x4C29E39: malloc (vg_replace_malloc.c:309) ==236074== by 0x1B79E59B: MPID_Init (mpid_init.c:1649) ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) ==236074== by 0x40219D: main (one.c:335) ==236074== ==236074== Conditional jump or move depends on uninitialised value(s) ==236074== at 0x2183B805: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218323C1: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218341C7: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x400F9C2: _dl_init (in /usr/lib64/ld-2.17.so) ==236074== by 0x401459D: dl_open_worker (in /usr/lib64/ld-2.17.so) ==236074== by 0x400F7D3: _dl_catch_error (in /usr/lib64/ld-2.17.so) ==236074== by 0x4013B8A: _dl_open (in /usr/lib64/ld-2.17.so) ==236074== by 0x1AEA4FAA: dlopen_doit (in /usr/lib64/libdl-2.17.so) ==236074== by 0x400F7D3: _dl_catch_error (in /usr/lib64/ld-2.17.so) ==236074== by 0x1AEA55AC: _dlerror_run (in /usr/lib64/libdl-2.17.so) ==236074== by 0x1AEA5040: dlopen@@GLIBC_2.2.5 (in /usr/lib64/libdl-2.17.so) ==236074== by 0x1B8198DC: MPID_nem_ofi_init (ofi_init.c:158) ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) ==236074== by 0x40219D: main (one.c:335) ==236074== ==236074== Conditional jump or move depends on uninitialised value(s) ==236074== at 0x2183B810: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218323C1: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218341C7: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x400F9C2: _dl_init (in /usr/lib64/ld-2.17.so) ==236074== by 0x401459D: dl_open_worker (in /usr/lib64/ld-2.17.so) ==236074== by 0x400F7D3: _dl_catch_error (in /usr/lib64/ld-2.17.so) ==236074== by 0x4013B8A: _dl_open (in /usr/lib64/ld-2.17.so) ==236074== by 0x1AEA4FAA: dlopen_doit (in /usr/lib64/libdl-2.17.so) ==236074== by 0x400F7D3: _dl_catch_error (in /usr/lib64/ld-2.17.so) ==236074== by 0x1AEA55AC: _dlerror_run (in /usr/lib64/libdl-2.17.so) ==236074== by 0x1AEA5040: dlopen@@GLIBC_2.2.5 (in /usr/lib64/libdl-2.17.so) ==236074== by 0x1B8198DC: MPID_nem_ofi_init (ofi_init.c:158) ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) ==236074== by 0x40219D: main (one.c:335) ==236074== ==236074== Conditional jump or move depends on uninitialised value(s) ==236074== at 0x218323C8: ??? 
(in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218341C7: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x400F9C2: _dl_init (in /usr/lib64/ld-2.17.so) ==236074== by 0x401459D: dl_open_worker (in /usr/lib64/ld-2.17.so) ==236074== by 0x400F7D3: _dl_catch_error (in /usr/lib64/ld-2.17.so) ==236074== by 0x4013B8A: _dl_open (in /usr/lib64/ld-2.17.so) ==236074== by 0x1AEA4FAA: dlopen_doit (in /usr/lib64/libdl-2.17.so) ==236074== by 0x400F7D3: _dl_catch_error (in /usr/lib64/ld-2.17.so) ==236074== by 0x1AEA55AC: _dlerror_run (in /usr/lib64/libdl-2.17.so) ==236074== by 0x1AEA5040: dlopen@@GLIBC_2.2.5 (in /usr/lib64/libdl-2.17.so) ==236074== by 0x1B8198DC: MPID_nem_ofi_init (ofi_init.c:158) ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) ==236074== by 0x40219D: main (one.c:335) ==236074== ==236074== Use of uninitialised value of size 8 ==236074== at 0x218323CF: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218341C7: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x400F9C2: _dl_init (in /usr/lib64/ld-2.17.so) ==236074== by 0x401459D: dl_open_worker (in /usr/lib64/ld-2.17.so) ==236074== by 0x400F7D3: _dl_catch_error (in /usr/lib64/ld-2.17.so) ==236074== by 0x4013B8A: _dl_open (in /usr/lib64/ld-2.17.so) ==236074== by 0x1AEA4FAA: dlopen_doit (in /usr/lib64/libdl-2.17.so) ==236074== by 0x400F7D3: _dl_catch_error (in /usr/lib64/ld-2.17.so) ==236074== by 0x1AEA55AC: _dlerror_run (in /usr/lib64/libdl-2.17.so) ==236074== by 0x1AEA5040: dlopen@@GLIBC_2.2.5 (in /usr/lib64/libdl-2.17.so) ==236074== by 0x1B8198DC: MPID_nem_ofi_init (ofi_init.c:158) ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) ==236074== by 0x40219D: main (one.c:335) ==236074== ==236074== Use of uninitialised value of size 8 ==236074== at 0x218323E5: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218341C7: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x400F9C2: _dl_init (in /usr/lib64/ld-2.17.so) ==236074== by 0x401459D: dl_open_worker (in /usr/lib64/ld-2.17.so) ==236074== by 0x400F7D3: _dl_catch_error (in /usr/lib64/ld-2.17.so) ==236074== by 0x4013B8A: _dl_open (in /usr/lib64/ld-2.17.so) ==236074== by 0x1AEA4FAA: dlopen_doit (in /usr/lib64/libdl-2.17.so) ==236074== by 0x400F7D3: _dl_catch_error (in /usr/lib64/ld-2.17.so) ==236074== by 0x1AEA55AC: _dlerror_run (in /usr/lib64/libdl-2.17.so) ==236074== by 0x1AEA5040: dlopen@@GLIBC_2.2.5 (in /usr/lib64/libdl-2.17.so) ==236074== by 0x1B8198DC: MPID_nem_ofi_init (ofi_init.c:158) ==236074== by 0x1B7B5C55: ??? 
(mpid_nem_init.c:231) ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) ==236074== by 0x40219D: main (one.c:335) ==236074== ==236074== Conditional jump or move depends on uninitialised value(s) ==236074== at 0x2183B805: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218323C1: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218343EC: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x2180E4F3: psm2_init (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x2112D7E6: psmx2_getinfo (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x210AC753: fi_getinfo@@FABRIC_1.2 (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x1B819AB7: MPID_nem_ofi_init (ofi_init.c:245) ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) ==236074== by 0x40219D: main (one.c:335) ==236074== ==236074== Conditional jump or move depends on uninitialised value(s) ==236074== at 0x2183B810: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218323C1: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218343EC: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x2180E4F3: psm2_init (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x2112D7E6: psmx2_getinfo (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x210AC753: fi_getinfo@@FABRIC_1.2 (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x1B819AB7: MPID_nem_ofi_init (ofi_init.c:245) ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) ==236074== by 0x40219D: main (one.c:335) ==236074== ==236074== Conditional jump or move depends on uninitialised value(s) ==236074== at 0x218323C8: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218343EC: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x2180E4F3: psm2_init (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x2112D7E6: psmx2_getinfo (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x210AC753: fi_getinfo@@FABRIC_1.2 (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x1B819AB7: MPID_nem_ofi_init (ofi_init.c:245) ==236074== by 0x1B7B5C55: ??? 
(mpid_nem_init.c:231) ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) ==236074== by 0x40219D: main (one.c:335) ==236074== ==236074== Use of uninitialised value of size 8 ==236074== at 0x218323CF: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218343EC: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x2180E4F3: psm2_init (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x2112D7E6: psmx2_getinfo (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x210AC753: fi_getinfo@@FABRIC_1.2 (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x1B819AB7: MPID_nem_ofi_init (ofi_init.c:245) ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) ==236074== by 0x40219D: main (one.c:335) ==236074== ==236074== Use of uninitialised value of size 8 ==236074== at 0x218323E5: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218343EC: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x2180E4F3: psm2_init (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x2112D7E6: psmx2_getinfo (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x210AC753: fi_getinfo@@FABRIC_1.2 (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x1B819AB7: MPID_nem_ofi_init (ofi_init.c:245) ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) ==236074== by 0x40219D: main (one.c:335) ==236074== ==236074== Conditional jump or move depends on uninitialised value(s) ==236074== at 0x2183B69A: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218371EC: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x21837077: hfi_get_port_lid (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x21836F9A: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x21834872: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x217F7F5D: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x217FDC29: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x217FCA91: psm2_ep_open (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x2112FB19: psmx2_trx_ctxt_alloc (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x21138089: psmx2_ep_open (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x1B81A6A1: fi_endpoint (fi_endpoint.h:155) ==236074== by 0x1B81A6A1: MPID_nem_ofi_init (ofi_init.c:377) ==236074== by 0x1B7B5C55: ??? 
(mpid_nem_init.c:231) ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) ==236074== by 0x40219D: main (one.c:335) ==236074== ==236074== Conditional jump or move depends on uninitialised value(s) ==236074== at 0x2183B7B8: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218371EC: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x21837077: hfi_get_port_lid (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x21836F9A: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x21834872: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x217F7F5D: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x217FDC29: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x217FCA91: psm2_ep_open (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x2112FB19: psmx2_trx_ctxt_alloc (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x21138089: psmx2_ep_open (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x1B81A6A1: fi_endpoint (fi_endpoint.h:155) ==236074== by 0x1B81A6A1: MPID_nem_ofi_init (ofi_init.c:377) ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) ==236074== by 0x40219D: main (one.c:335) ==236074== ==236074== Conditional jump or move depends on uninitialised value(s) ==236074== at 0x2183B69A: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218371EC: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x21837077: hfi_get_port_lid (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x217F88C8: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x217FDC29: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x217FCA91: psm2_ep_open (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x2112FB19: psmx2_trx_ctxt_alloc (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x21138089: psmx2_ep_open (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x1B81A6A1: fi_endpoint (fi_endpoint.h:155) ==236074== by 0x1B81A6A1: MPID_nem_ofi_init (ofi_init.c:377) ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) ==236074== by 0x40219D: main (one.c:335) ==236074== ==236074== Conditional jump or move depends on uninitialised value(s) ==236074== at 0x2183B7B8: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218371EC: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x21837077: hfi_get_port_lid (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x217F88C8: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x217FDC29: ??? 
(in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x217FCA91: psm2_ep_open (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x2112FB19: psmx2_trx_ctxt_alloc (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x21138089: psmx2_ep_open (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x1B81A6A1: fi_endpoint (fi_endpoint.h:155) ==236074== by 0x1B81A6A1: MPID_nem_ofi_init (ofi_init.c:377) ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) ==236074== by 0x40219D: main (one.c:335) ==236074== ==236074== Conditional jump or move depends on uninitialised value(s) ==236074== at 0x2183B69A: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218371EC: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x21837077: hfi_get_port_lid (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x217F916B: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x217FDC29: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x217FCA91: psm2_ep_open (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x2112FB19: psmx2_trx_ctxt_alloc (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x21138089: psmx2_ep_open (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x1B81A6A1: fi_endpoint (fi_endpoint.h:155) ==236074== by 0x1B81A6A1: MPID_nem_ofi_init (ofi_init.c:377) ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) ==236074== by 0x40219D: main (one.c:335) ==236074== ==236074== Conditional jump or move depends on uninitialised value(s) ==236074== at 0x2183B7B8: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218371EC: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x21837077: hfi_get_port_lid (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x217F916B: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x217FDC29: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x217FCA91: psm2_ep_open (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x2112FB19: psmx2_trx_ctxt_alloc (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x21138089: psmx2_ep_open (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x1B81A6A1: fi_endpoint (fi_endpoint.h:155) ==236074== by 0x1B81A6A1: MPID_nem_ofi_init (ofi_init.c:377) ==236074== by 0x1B7B5C55: ??? 
(mpid_nem_init.c:231) ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) ==236074== by 0x40219D: main (one.c:335) ==236074== ==236074== Conditional jump or move depends on uninitialised value(s) ==236074== at 0x1B1DA260: __I_MPI___intel_sse2_strncmp (in /opt/intel/compilers_and_libraries_2018.2.199/linux/mpi/intel64/lib/libmpifort.so.12.0) ==236074== by 0x1B8CFBA1: ??? (simple_pmi.c:2376) ==236074== by 0x1B8CBDAD: PMIi_InitIfSingleton (simple_pmi.c:2883) ==236074== by 0x1B8CBDAD: iPMI_KVS_Get (simple_pmi.c:751) ==236074== by 0x1B7CCC1E: ??? (mpidi_pg.c:949) ==236074== by 0x1B817EAA: MPID_nem_ofi_post_init (ofi_init.c:1736) ==236074== by 0x1B7B3575: MPID_nem_init_post (mpid_nem_init.c:1421) ==236074== by 0x1B5806E3: MPIDI_CH3_Init (ch3_init.c:146) ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) ==236074== by 0x40219D: main (one.c:335) ==236074== ==236074== Conditional jump or move depends on uninitialised value(s) ==236074== at 0x1B1DA383: __I_MPI___intel_sse2_strncmp (in /opt/intel/compilers_and_libraries_2018.2.199/linux/mpi/intel64/lib/libmpifort.so.12.0) ==236074== by 0x1B8CFBA1: ??? (simple_pmi.c:2376) ==236074== by 0x1B8CBDAD: PMIi_InitIfSingleton (simple_pmi.c:2883) ==236074== by 0x1B8CBDAD: iPMI_KVS_Get (simple_pmi.c:751) ==236074== by 0x1B7CCC1E: ??? (mpidi_pg.c:949) ==236074== by 0x1B817EAA: MPID_nem_ofi_post_init (ofi_init.c:1736) ==236074== by 0x1B7B3575: MPID_nem_init_post (mpid_nem_init.c:1421) ==236074== by 0x1B5806E3: MPIDI_CH3_Init (ch3_init.c:146) ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) ==236074== by 0x40219D: main (one.c:335) ==236074== ==236074== Conditional jump or move depends on uninitialised value(s) ==236074== at 0x1E48032E: __intel_sse4_strcpy (in /opt/intel/compilers_and_libraries_2018.2.199/linux/compiler/lib/intel64_lin/libirc.so) ==236074== by 0x51FD8BE: PetscStrcpy (str.c:354) ==236074== by 0x51FD7A3: PetscStrallocpy (str.c:188) ==236074== by 0x52A39CE: PetscEventRegLogRegister (eventlog.c:313) ==236074== by 0x527D89A: PetscLogEventRegister (plog.c:693) ==236074== by 0x6A56A20: PCBDDCInitializePackage (bddc.c:3115) ==236074== by 0x6E1A515: PCInitializePackage (dlregisksp.c:59) ==236074== by 0x6DB1A86: PCCreate (precon.c:382) ==236074== by 0x6E05167: KSPGetPC (itfunc.c:1837) ==236074== by 0x6E0FC5C: KSPSetDM (iterativ.c:1150) ==236074== by 0x6FDD27B: SNESSetDM (snes.c:5402) ==236074== by 0x70B85F7: TSGetSNES (ts.c:2914) ==236074== by 0x70BE430: TSSetDM (ts.c:4949) ==236074== by 0x402496: main (one.c:378) ==236074== ==236074== Conditional jump or move depends on uninitialised value(s) ==236074== at 0x1E4782BA: __intel_ssse3_strncpy (in /opt/intel/compilers_and_libraries_2018.2.199/linux/compiler/lib/intel64_lin/libirc.so) ==236074== by 0x51FFD24: PetscStrncpy (str.c:392) ==236074== by 0x51FEB03: PetscStrreplace (str.c:1142) ==236074== by 0x52C9958: PetscViewerFileSetName (filev.c:659) 
==236074== by 0x52B743B: PetscViewerVTKOpen (vtkv.c:279) ==236074== by 0x70C76E6: TSMonitorSolutionVTK (ts.c:5580) ==236074== by 0x40313C: FormFunction (one.c:120) ==236074== by 0x7066531: TSComputeIFunction_DMDA (dmdats.c:82) ==236074== by 0x70BA5EF: TSComputeIFunction (ts.c:857) ==236074== by 0x711E2DC: SNESTSFormFunction_BDF (bdf.c:368) ==236074== by 0x70C6E46: SNESTSFormFunction (ts.c:5014) ==236074== by 0x6FDC8A6: SNESComputeFunction (snes.c:2383) ==236074== by 0x7023556: SNESSolve_NEWTONTR (tr.c:297) ==236074== by 0x6FD160F: SNESSolve (snes.c:4569) ==236074== by 0x711917E: PetscMemcpy (bdf.c:223) ==236074== by 0x711917E: PetscCitationsRegister (petscsys.h:2689) ==236074== by 0x711917E: TSStep_BDF.A (bdf.c:265) ==236074== by 0x70C363A: TSStep (ts.c:3757) ==236074== by 0x70C1999: TSSolve (ts.c:4154) ==236074== by 0x402594: main (one.c:391) ==236074== ==236074== Conditional jump or move depends on uninitialised value(s) ==236074== at 0x1E4782BA: __intel_ssse3_strncpy (in /opt/intel/compilers_and_libraries_2018.2.199/linux/compiler/lib/intel64_lin/libirc.so) ==236074== by 0x51FFD24: PetscStrncpy (str.c:392) ==236074== by 0x51FEB03: PetscStrreplace (str.c:1142) ==236074== by 0x5224E4B: PetscFOpen (mpiuopen.c:52) ==236074== by 0x63A074B: DMDAVTKWriteAll_VTS.A (grvtk.c:72) ==236074== by 0x639A589: DMDAVTKWriteAll (grvtk.c:545) ==236074== by 0x52B66F3: PetscViewerFlush_VTK (vtkv.c:100) ==236074== by 0x52CFAAE: PetscViewerFlush (flush.c:26) ==236074== by 0x52CEA95: PetscViewerDestroy (view.c:113) ==236074== by 0x70C7717: TSMonitorSolutionVTK (ts.c:5582) ==236074== by 0x40313C: FormFunction (one.c:120) ==236074== by 0x7066531: TSComputeIFunction_DMDA (dmdats.c:82) ==236074== by 0x70BA5EF: TSComputeIFunction (ts.c:857) ==236074== by 0x711E2DC: SNESTSFormFunction_BDF (bdf.c:368) ==236074== by 0x70C6E46: SNESTSFormFunction (ts.c:5014) ==236074== by 0x6FDC8A6: SNESComputeFunction (snes.c:2383) ==236074== by 0x7023556: SNESSolve_NEWTONTR (tr.c:297) ==236074== by 0x6FD160F: SNESSolve (snes.c:4569) ==236074== by 0x711917E: PetscMemcpy (bdf.c:223) ==236074== by 0x711917E: PetscCitationsRegister (petscsys.h:2689) ==236074== by 0x711917E: TSStep_BDF.A (bdf.c:265) ==236074== by 0x70C363A: TSStep (ts.c:3757) ==236074== ==236074== Conditional jump or move depends on uninitialised value(s) ==236074== at 0x5F10977: MatFDColoringSetUpBlocked_AIJ_Private (fdaij.c:146) ==236074== by 0x5F10977: MatFDColoringSetUp_SeqXAIJ.A (fdaij.c:284) ==236074== by 0x585892A: MatFDColoringSetUp (fdmatrix.c:242) ==236074== by 0x6FE5037: SNESComputeJacobianDefaultColor (snesj2.c:79) ==236074== by 0x6FC8E4E: SNESComputeJacobian (snes.c:2717) ==236074== by 0x70236E7: SNESSolve_NEWTONTR (tr.c:324) ==236074== by 0x6FD160F: SNESSolve (snes.c:4569) ==236074== by 0x711917E: PetscMemcpy (bdf.c:223) ==236074== by 0x711917E: PetscCitationsRegister (petscsys.h:2689) ==236074== by 0x711917E: TSStep_BDF.A (bdf.c:265) ==236074== by 0x70C363A: TSStep (ts.c:3757) ==236074== by 0x70C1999: TSSolve (ts.c:4154) ==236074== by 0x402594: main (one.c:391) ==236074== ==236074== Invalid write of size 4 ==236074== at 0x5F10983: MatFDColoringSetUpBlocked_AIJ_Private (fdaij.c:150) ==236074== by 0x5F10983: MatFDColoringSetUp_SeqXAIJ.A (fdaij.c:284) ==236074== by 0x585892A: MatFDColoringSetUp (fdmatrix.c:242) ==236074== by 0x6FE5037: SNESComputeJacobianDefaultColor (snesj2.c:79) ==236074== by 0x6FC8E4E: SNESComputeJacobian (snes.c:2717) ==236074== by 0x70236E7: SNESSolve_NEWTONTR (tr.c:324) ==236074== by 0x6FD160F: SNESSolve (snes.c:4569) ==236074== by 
0x711917E: PetscMemcpy (bdf.c:223) ==236074== by 0x711917E: PetscCitationsRegister (petscsys.h:2689) ==236074== by 0x711917E: TSStep_BDF.A (bdf.c:265) ==236074== by 0x70C363A: TSStep (ts.c:3757) ==236074== by 0x70C1999: TSSolve (ts.c:4154) ==236074== by 0x402594: main (one.c:391) ==236074== Address 0x3a94fa80 is 0 bytes after a block of size 73,960,000 alloc'd ==236074== at 0x4C2C480: memalign (vg_replace_malloc.c:909) ==236074== by 0x522FFE2: PetscMallocAlign (mal.c:52) ==236074== by 0x52305F9: PetscMallocA (mal.c:418) ==236074== by 0x5F10778: MatFDColoringSetUpBlocked_AIJ_Private (fdaij.c:125) ==236074== by 0x5F10778: MatFDColoringSetUp_SeqXAIJ.A (fdaij.c:284) ==236074== by 0x585892A: MatFDColoringSetUp (fdmatrix.c:242) ==236074== by 0x6FE5037: SNESComputeJacobianDefaultColor (snesj2.c:79) ==236074== by 0x6FC8E4E: SNESComputeJacobian (snes.c:2717) ==236074== by 0x70236E7: SNESSolve_NEWTONTR (tr.c:324) ==236074== by 0x6FD160F: SNESSolve (snes.c:4569) ==236074== by 0x711917E: PetscMemcpy (bdf.c:223) ==236074== by 0x711917E: PetscCitationsRegister (petscsys.h:2689) ==236074== by 0x711917E: TSStep_BDF.A (bdf.c:265) ==236074== by 0x70C363A: TSStep (ts.c:3757) ==236074== by 0x70C1999: TSSolve (ts.c:4154) ==236074== by 0x402594: main (one.c:391) ==236074== ==236074== Invalid write of size 8 ==236074== at 0x5F10991: MatFDColoringSetUpBlocked_AIJ_Private (fdaij.c:151) ==236074== by 0x5F10991: MatFDColoringSetUp_SeqXAIJ.A (fdaij.c:284) ==236074== by 0x585892A: MatFDColoringSetUp (fdmatrix.c:242) ==236074== by 0x6FE5037: SNESComputeJacobianDefaultColor (snesj2.c:79) ==236074== by 0x6FC8E4E: SNESComputeJacobian (snes.c:2717) ==236074== by 0x70236E7: SNESSolve_NEWTONTR (tr.c:324) ==236074== by 0x6FD160F: SNESSolve (snes.c:4569) ==236074== by 0x711917E: PetscMemcpy (bdf.c:223) ==236074== by 0x711917E: PetscCitationsRegister (petscsys.h:2689) ==236074== by 0x711917E: TSStep_BDF.A (bdf.c:265) ==236074== by 0x70C363A: TSStep (ts.c:3757) ==236074== by 0x70C1999: TSSolve (ts.c:4154) ==236074== by 0x402594: main (one.c:391) ==236074== Address 0x3a94fa88 is 8 bytes after a block of size 73,960,000 alloc'd ==236074== at 0x4C2C480: memalign (vg_replace_malloc.c:909) ==236074== by 0x522FFE2: PetscMallocAlign (mal.c:52) ==236074== by 0x52305F9: PetscMallocA (mal.c:418) ==236074== by 0x5F10778: MatFDColoringSetUpBlocked_AIJ_Private (fdaij.c:125) ==236074== by 0x5F10778: MatFDColoringSetUp_SeqXAIJ.A (fdaij.c:284) ==236074== by 0x585892A: MatFDColoringSetUp (fdmatrix.c:242) ==236074== by 0x6FE5037: SNESComputeJacobianDefaultColor (snesj2.c:79) ==236074== by 0x6FC8E4E: SNESComputeJacobian (snes.c:2717) ==236074== by 0x70236E7: SNESSolve_NEWTONTR (tr.c:324) ==236074== by 0x6FD160F: SNESSolve (snes.c:4569) ==236074== by 0x711917E: PetscMemcpy (bdf.c:223) ==236074== by 0x711917E: PetscCitationsRegister (petscsys.h:2689) ==236074== by 0x711917E: TSStep_BDF.A (bdf.c:265) ==236074== by 0x70C363A: TSStep (ts.c:3757) ==236074== by 0x70C1999: TSSolve (ts.c:4154) ==236074== by 0x402594: main (one.c:391) ==236074== Sent from Mail for Windows -------------- next part -------------- An HTML attachment was scrubbed... URL: From nicolas.tardieu at edf.fr Fri Oct 21 03:46:00 2022 From: nicolas.tardieu at edf.fr (TARDIEU Nicolas) Date: Fri, 21 Oct 2022 08:46:00 +0000 Subject: [petsc-users] Trouble with ISEmbed In-Reply-To: References: Message-ID: Dear Pierre, To complete my last post, in fact, the initial code (playing with the LGMap) was correct. It was my test case that was wrong. 
Once fixed according to your suggestion, everything turns out to be OK. I am nevertheless wondering if this IS embeding according to the global numbering should not be a native PTESc's feature ? Thank you again, Nicolas -- Nicolas Tardieu Ing PhD Computational Mechanics EDF - R&D Dpt ERMES PARIS-SACLAY, FRANCE ________________________________ De : TARDIEU Nicolas Envoy? : jeudi 20 octobre 2022 11:45 ? : pierre at joliv.et Cc : petsc-users at mcs.anl.gov Objet : RE: [petsc-users] Trouble with ISEmbed Dear Pierre, You fixed the problem! Thank you warmly for your precious help. Best regards, Nicolas -- Nicolas Tardieu Ing PhD Computational Mechanics EDF - R&D Dpt ERMES PARIS-SACLAY, FRANCE ________________________________ De : pierre at joliv.et Envoy? : mercredi 19 octobre 2022 22:22 ? : TARDIEU Nicolas Cc : petsc-users at mcs.anl.gov Objet : Re: [petsc-users] Trouble with ISEmbed Sorry, I?m not very proficient in petsc4py, and there are a bunch of interfaces missing, e.g., ISShift(), so it may not be optimal, but I hope you?ll understand. First, you?ll need to regenerate the .bin by uncommenting the proper part of the code. That is because you were initially generating a 20x20 matrix, with 4 fields per unknown. That?s 5 unknowns, and so, with two processes, 10 rows per process is not consistent as 10/4 is not an integer ? I don?t know how to force, in petsc4py, the local size to 12 on process #0 and 8 on process #1. The modified code generates a 16x16 matrices so it remains consistent. If you then run the first part of the program, you?ll get both B_uu and B_pp from B instead of A, with one, two, or four processes. Again, that should work for arbitrary number of processes, you just need to be careful that your local dimensions are consistent with the number of fields. Thanks, Pierre On 19 Oct 2022, at 5:01 PM, Pierre Jolivet > wrote: On 19 Oct 2022, at 4:32 PM, TARDIEU Nicolas > wrote: Dear Pierre, Thank you very much for your answer. I have the same explanation as you for the code I sent. But what I would like to do is the following : I have the full matrix A with fields u, p and t (which are interlaced in the real application). I want to extract B=A(u+p, u+p). *Then* I would like to extract the (u, u) block from B - let us call it B_uu. In fact, B_uu=A_uu but I really need to do the extraction from B. And I am missing something since I have to play with different numberings when switching the IS from A to B. Is it clear enough ???? That?s cristal clear. If the fields are interlaced, that?s actually easier to do, because you preserve the distribution, and there is less data movement. I?ll try to fix your code in the case where the fields are interlaced if now one gives you another answer in the meantime. Thanks, Pierre Regards, Nicolas -- Nicolas Tardieu Ing PhD Computational Mechanics EDF - R&D Dpt ERMES PARIS-SACLAY, FRANCE ________________________________ De : pierre at joliv.et > Envoy? : mercredi 19 octobre 2022 14:51 ? : TARDIEU Nicolas > Cc : petsc-users at mcs.anl.gov > Objet : Re: [petsc-users] Trouble with ISEmbed On two processes, you have a different distribution for u and u+p. 
IS Object: 2 MPI processes type: general [0] Number of indices in set 5 [0] 0 0 [0] 1 1 [0] 2 2 [0] 3 3 [0] 4 4 [1] Number of indices in set 5 [1] 0 5 [1] 1 6 [1] 2 7 [1] 3 8 [1] 4 9 IS Object: 2 MPI processes type: general [0] Number of indices in set 8 [0] 0 0 [0] 1 1 [0] 2 2 [0] 3 3 [0] 4 4 [0] 5 5 [0] 6 6 [0] 7 7 [1] Number of indices in set 7 [1] 0 8 [1] 1 9 [1] 2 10 [1] 3 11 [1] 4 12 [1] 5 13 [1] 6 14 ISEmbed() only works on local indices, so when you embed u into u+p, on the second process, you miss the row/column indices 5, 6, 7 of B = A(u+p, u+p). Thus, you end up with a matrix of dimension size(u) - 3 = 10 - 3 = 7, with just the row/column indices 8 and 9 being selected by the second process. What is it that you want to do exactly? Play with ISEmbed(), or get A(u, u) without using A but B instead? Thanks, Pierre > On 19 Oct 2022, at 12:00 PM, TARDIEU Nicolas via petsc-users > wrote: > > Dear PETSc Team, > > I am trying to use IS embeding in parallel. > In order to (try to) understand how it works, I have built a simple example, attached to this email. > > I consider a 20X20 matrix. The dof (u, p, t) in global numbering are the following : > u: 0..9 p: 10..14 t: 15..19 > > I have defined 4 IS to describe the dof u, p, t and the agglomeration of u and p, called up. > I first extract the submatrix matrix(up,up), then I would like to extract from it the (u,u) block. > > The example runs OK in sequential but I do not obtain the (u,u) block on 2 processes. > > I have a mistake in the build of the sub-IS but I cannot find it for days. > > Best regards, > Nicolas > -- > Nicolas Tardieu > Ing PhD Computational Mechanics > EDF - R&D Dpt ERMES > PARIS-SACLAY, FRANCE > > Ce message et toutes les pi?ces jointes (ci-apr?s le 'Message') sont ?tablis ? l'intention exclusive des destinataires et les informations qui y figurent sont strictement confidentielles. Toute utilisation de ce Message non conforme ? sa destination, toute diffusion ou toute publication totale ou partielle, est interdite sauf autorisation expresse. > Si vous n'?tes pas le destinataire de ce Message, il vous est interdit de le copier, de le faire suivre, de le divulguer ou d'en utiliser tout ou partie. Si vous avez re?u ce Message par erreur, merci de le supprimer de votre syst?me, ainsi que toutes ses copies, et de n'en garder aucune trace sur quelque support que ce soit. Nous vous remercions ?galement d'en avertir imm?diatement l'exp?diteur par retour du message. > Il est impossible de garantir que les communications par messagerie ?lectronique arrivent en temps utile, sont s?curis?es ou d?nu?es de toute erreur ou virus. > ____________________________________________________ > This message and any attachments (the 'Message') are intended solely for the addressees. The information contained in this Message is confidential. Any use of information contained in this Message not in accord with its purpose, any dissemination or disclosure, either whole or partial, is prohibited except formal approval. > If you are not the addressee, you may not copy, forward, disclose or use any part of it. If you have received this message in error, please delete it and all copies from your system and notify the sender immediately by return message. > E-mail communication cannot be guaranteed to be timely secure, error or virus-free. > Ce message et toutes les pi?ces jointes (ci-apr?s le 'Message') sont ?tablis ? l'intention exclusive des destinataires et les informations qui y figurent sont strictement confidentielles. 
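A rough C sketch of the extraction pattern discussed above (the thread's actual code is petsc4py), assuming is_u and is_up hold A's global dof numbers for the u and u+p fields and are distributed consistently across ranks (e.g. interlaced fields), since ISEmbed() only works with the locally owned indices:

#include <petscmat.h>

/* Extract B = A(up,up), then B_uu = B(u,u).  The helper name, error handling
   and the consistent-distribution assumption are illustrative, not taken
   from the thread's code. */
static PetscErrorCode ExtractBuu(Mat A, IS is_u, IS is_up, Mat *Buu)
{
  Mat             B;
  IS              is_loc, is_row;
  const PetscInt *loc;
  PetscInt        n, rstart, k, *rows;
  PetscErrorCode  ierr;

  PetscFunctionBeginUser;
  ierr = MatCreateSubMatrix(A, is_up, is_up, MAT_INITIAL_MATRIX, &B);CHKERRQ(ierr);
  /* local positions of the u dofs inside the up set */
  ierr = ISEmbed(is_u, is_up, PETSC_TRUE, &is_loc);CHKERRQ(ierr);
  /* turn those local positions into global row numbers of B */
  ierr = MatGetOwnershipRange(B, &rstart, NULL);CHKERRQ(ierr);
  ierr = ISGetLocalSize(is_loc, &n);CHKERRQ(ierr);
  ierr = ISGetIndices(is_loc, &loc);CHKERRQ(ierr);
  ierr = PetscMalloc1(n, &rows);CHKERRQ(ierr);
  for (k = 0; k < n; k++) rows[k] = loc[k] + rstart;
  ierr = ISRestoreIndices(is_loc, &loc);CHKERRQ(ierr);
  ierr = ISCreateGeneral(PetscObjectComm((PetscObject)B), n, rows, PETSC_OWN_POINTER, &is_row);CHKERRQ(ierr);
  ierr = MatCreateSubMatrix(B, is_row, is_row, MAT_INITIAL_MATRIX, Buu);CHKERRQ(ierr);
  ierr = ISDestroy(&is_loc);CHKERRQ(ierr);
  ierr = ISDestroy(&is_row);CHKERRQ(ierr);
  ierr = MatDestroy(&B);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}

The explicit offset by rstart plays the role of the ISShift() mentioned earlier in the thread; in petsc4py the same shift can be applied to the index array by hand.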
-------------- next part -------------- An HTML attachment was scrubbed... 
URL: From knepley at gmail.com Fri Oct 21 06:49:45 2022 From: knepley at gmail.com (Matthew Knepley) Date: Fri, 21 Oct 2022 07:49:45 -0400 Subject: [petsc-users] Periodic boundary condition In-Reply-To: References: Message-ID: On Thu, Oct 20, 2022 at 10:27 PM Barry Smith wrote: > > Some of the valgrind information does not appear to make sense > > PetscMemcpy() is not calling SNESSolve() so I suspect there must be some > serious corruption of something to this impossible stack trace > I ran Valgrind on it, and it looks like it could be a bug in MatFD tau/w=26225.459! initial! copy! Write output at step= 0! ==51148== Invalid write of size 4 ==51148== at 0x105D22252: MatFDColoringSetUpBlocked_AIJ_Private (fdaij.c:149) ==51148== by 0x105D2402C: MatFDColoringSetUp_SeqXAIJ (fdaij.c:281) ==51148== by 0x10678B2B2: MatFDColoringSetUp (fdmatrix.c:234) ==51148== by 0x108738AE5: SNESComputeJacobianDefaultColor (snesj2.c:95) ==51148== by 0x1086E2F31: SNESComputeJacobian (snes.c:2804) ==51148== by 0x1084A8AA0: SNESSolve_NEWTONLS (ls.c:207) ==51148== by 0x1087097B9: SNESSolve (snes.c:4689) ==51148== by 0x1088A22F7: TSTheta_SNESSolve (theta.c:174) ==51148== by 0x108888568: TSStep_Theta (theta.c:210) ==51148== by 0x1089FD12B: TSStep (ts.c:3445) ==51148== by 0x108A08112: TSSolve (ts.c:3836) ==51148== by 0x100009F83: main (one.c:392) ==51148== Address 0x16905b000 is not stack'd, malloc'd or (recently) free'd When I remove PERIODIC from the DM, this goes away. Thanks, Matt > ==236074== by 0x6FD160F: SNESSolve (snes.c:4569) > ==236074== by 0x711917E: PetscMemcpy (bdf.c:223) > > From > > ==236074== Conditional jump or move depends on uninitialised value(s) > ==236074== at 0x5F10977: MatFDColoringSetUpBlocked_AIJ_Private > (fdaij.c:146) > ==236074== by 0x5F10977: MatFDColoringSetUp_SeqXAIJ.A (fdaij.c:284) > ==236074== by 0x585892A: MatFDColoringSetUp (fdmatrix.c:242) > ==236074== by 0x6FE5037: SNESComputeJacobianDefaultColor (snesj2.c:79) > ==236074== by 0x6FC8E4E: SNESComputeJacobian (snes.c:2717) > ==236074== by 0x70236E7: SNESSolve_NEWTONTR (tr.c:324) > ==236074== by 0x6FD160F: SNESSolve (snes.c:4569) > ==236074== by 0x711917E: PetscMemcpy (bdf.c:223) > ==236074== by 0x711917E: PetscCitationsRegister (petscsys.h:2689) > ==236074== by 0x711917E: TSStep_BDF.A (bdf.c:265) > ==236074== by 0x70C363A: TSStep (ts.c:3757) > ==236074== by 0x70C1999: TSSolve (ts.c:4154) > ==236074== by 0x402594: main (one.c:391) > > I suggest you run with -malloc_debug instead of valgrind and see if any > errors are reported. If so you can add the macro CHKMEMQ; inside your > function evaluation where you write to memory to see if anything is writing > to the wrong location. For example wherever you assign aF such as > > aF[j][i].vx=(x3+x4+x5+x6+x7+x8+x9-x1-x2)*user->hx; > > this can help you determine the exact line number where you are writing to > the wrong location and determine what might be the cause. > > > On Oct 20, 2022, at 6:45 PM, Sepideh Kavousi wrote: > > Hello, > I want to solve my 5 PDEs based on finite difference method using > periodic BC in x-direction and non-periodic in y-direction but I run into > error (Segmentation Violation, probably memory access out of range). > For this, I discretize my equation in FormFunction function. My PDE > discretization in (i,j) node needs data on (i+1,j), (i+2,j), (i-1,j), > (i-2,j), (i,j+1), (i,j+2), (i,j-1), (i,j-2) points. 
> In my previous codes that the x-direction was non-periodic (no flux) > boundary condition, I: > i) implemented the no flux BC for i=0 and i=Nx-1, > ii) set i+2= Nx-1 in discretizing (Nx-2,j) and i+2= 0 > in discretizing (1,j) > iii) discretized my equation for i=1..Nx-2. > I am not sure how I should do the periodic BC. From the following > discussions ( > https://lists.mcs.anl.gov/pipermail/petsc-users/2012-May/013476.html and > https://lists.mcs.anl.gov/pipermail/petsc-users/2016-May/029273.html), I > guess I should not do step (i) (stated above) for the x-boundaries and just > do step (iii) for i=0..Nx-1. If I just focus on solving 2 of the PDEs which > does need data on (i+2,j), (i-2,j), (i,j+2), (i,j-2) points for > discretizing equation in (i,j) node, I still run into error: > Running with Valgrind (just 1 processor) gave the following file. I did > not find any information which gives me hint on the error source. > Can you please help me to find the error? > Best, > Sepideh > > ==236074== Conditional jump or move depends on uninitialised value(s) > ==236074== at 0x4C29E39: malloc (vg_replace_malloc.c:309) > ==236074== by 0x1B79E59B: MPID_Init (mpid_init.c:1649) > ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) > ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) > ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) > ==236074== by 0x40219D: main (one.c:335) > ==236074== > ==236074== Conditional jump or move depends on uninitialised value(s) > ==236074== at 0x2183B805: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x218323C1: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x218341C7: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x400F9C2: _dl_init (in /usr/lib64/ld-2.17.so) > ==236074== by 0x401459D: dl_open_worker (in /usr/lib64/ld-2.17.so) > ==236074== by 0x400F7D3: _dl_catch_error (in /usr/lib64/ld-2.17.so) > ==236074== by 0x4013B8A: _dl_open (in /usr/lib64/ld-2.17.so) > ==236074== by 0x1AEA4FAA: dlopen_doit (in /usr/lib64/libdl-2.17.so) > ==236074== by 0x400F7D3: _dl_catch_error (in /usr/lib64/ld-2.17.so) > ==236074== by 0x1AEA55AC: _dlerror_run (in /usr/lib64/libdl-2.17.so) > ==236074== by 0x1AEA5040: dlopen@@GLIBC_2.2.5 (in /usr/lib64/ > libdl-2.17.so) > ==236074== by 0x1B8198DC: MPID_nem_ofi_init (ofi_init.c:158) > ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) > ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) > ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) > ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) > ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) > ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) > ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) > ==236074== by 0x40219D: main (one.c:335) > ==236074== > ==236074== Conditional jump or move depends on uninitialised value(s) > ==236074== at 0x2183B810: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x218323C1: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x218341C7: ??? 
(in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x400F9C2: _dl_init (in /usr/lib64/ld-2.17.so) > ==236074== by 0x401459D: dl_open_worker (in /usr/lib64/ld-2.17.so) > ==236074== by 0x400F7D3: _dl_catch_error (in /usr/lib64/ld-2.17.so) > ==236074== by 0x4013B8A: _dl_open (in /usr/lib64/ld-2.17.so) > ==236074== by 0x1AEA4FAA: dlopen_doit (in /usr/lib64/libdl-2.17.so) > ==236074== by 0x400F7D3: _dl_catch_error (in /usr/lib64/ld-2.17.so) > ==236074== by 0x1AEA55AC: _dlerror_run (in /usr/lib64/libdl-2.17.so) > ==236074== by 0x1AEA5040: dlopen@@GLIBC_2.2.5 (in /usr/lib64/ > libdl-2.17.so) > ==236074== by 0x1B8198DC: MPID_nem_ofi_init (ofi_init.c:158) > ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) > ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) > ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) > ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) > ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) > ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) > ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) > ==236074== by 0x40219D: main (one.c:335) > ==236074== > ==236074== Conditional jump or move depends on uninitialised value(s) > ==236074== at 0x218323C8: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x218341C7: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x400F9C2: _dl_init (in /usr/lib64/ld-2.17.so) > ==236074== by 0x401459D: dl_open_worker (in /usr/lib64/ld-2.17.so) > ==236074== by 0x400F7D3: _dl_catch_error (in /usr/lib64/ld-2.17.so) > ==236074== by 0x4013B8A: _dl_open (in /usr/lib64/ld-2.17.so) > ==236074== by 0x1AEA4FAA: dlopen_doit (in /usr/lib64/libdl-2.17.so) > ==236074== by 0x400F7D3: _dl_catch_error (in /usr/lib64/ld-2.17.so) > ==236074== by 0x1AEA55AC: _dlerror_run (in /usr/lib64/libdl-2.17.so) > ==236074== by 0x1AEA5040: dlopen@@GLIBC_2.2.5 (in /usr/lib64/ > libdl-2.17.so) > ==236074== by 0x1B8198DC: MPID_nem_ofi_init (ofi_init.c:158) > ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) > ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) > ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) > ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) > ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) > ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) > ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) > ==236074== by 0x40219D: main (one.c:335) > ==236074== > ==236074== Use of uninitialised value of size 8 > ==236074== at 0x218323CF: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x218341C7: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x400F9C2: _dl_init (in /usr/lib64/ld-2.17.so) > ==236074== by 0x401459D: dl_open_worker (in /usr/lib64/ld-2.17.so) > ==236074== by 0x400F7D3: _dl_catch_error (in /usr/lib64/ld-2.17.so) > ==236074== by 0x4013B8A: _dl_open (in /usr/lib64/ld-2.17.so) > ==236074== by 0x1AEA4FAA: dlopen_doit (in /usr/lib64/libdl-2.17.so) > ==236074== by 0x400F7D3: _dl_catch_error (in /usr/lib64/ld-2.17.so) > ==236074== by 0x1AEA55AC: _dlerror_run (in /usr/lib64/libdl-2.17.so) > ==236074== by 0x1AEA5040: dlopen@@GLIBC_2.2.5 (in /usr/lib64/ > libdl-2.17.so) > ==236074== by 0x1B8198DC: MPID_nem_ofi_init (ofi_init.c:158) > ==236074== by 0x1B7B5C55: ??? 
(mpid_nem_init.c:231) > ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) > ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) > ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) > ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) > ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) > ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) > ==236074== by 0x40219D: main (one.c:335) > ==236074== > ==236074== Use of uninitialised value of size 8 > ==236074== at 0x218323E5: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x218341C7: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x400F9C2: _dl_init (in /usr/lib64/ld-2.17.so) > ==236074== by 0x401459D: dl_open_worker (in /usr/lib64/ld-2.17.so) > ==236074== by 0x400F7D3: _dl_catch_error (in /usr/lib64/ld-2.17.so) > ==236074== by 0x4013B8A: _dl_open (in /usr/lib64/ld-2.17.so) > ==236074== by 0x1AEA4FAA: dlopen_doit (in /usr/lib64/libdl-2.17.so) > ==236074== by 0x400F7D3: _dl_catch_error (in /usr/lib64/ld-2.17.so) > ==236074== by 0x1AEA55AC: _dlerror_run (in /usr/lib64/libdl-2.17.so) > ==236074== by 0x1AEA5040: dlopen@@GLIBC_2.2.5 (in /usr/lib64/ > libdl-2.17.so) > ==236074== by 0x1B8198DC: MPID_nem_ofi_init (ofi_init.c:158) > ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) > ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) > ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) > ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) > ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) > ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) > ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) > ==236074== by 0x40219D: main (one.c:335) > ==236074== > ==236074== Conditional jump or move depends on uninitialised value(s) > ==236074== at 0x2183B805: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x218323C1: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x218343EC: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x2180E4F3: psm2_init (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x2112D7E6: psmx2_getinfo (in > /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) > ==236074== by 0x210AC753: fi_getinfo@@FABRIC_1.2 (in > /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) > ==236074== by 0x1B819AB7: MPID_nem_ofi_init (ofi_init.c:245) > ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) > ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) > ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) > ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) > ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) > ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) > ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) > ==236074== by 0x40219D: main (one.c:335) > ==236074== > ==236074== Conditional jump or move depends on uninitialised value(s) > ==236074== at 0x2183B810: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x218323C1: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x218343EC: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x2180E4F3: psm2_init (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x2112D7E6: psmx2_getinfo (in > /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) > ==236074== by 0x210AC753: fi_getinfo@@FABRIC_1.2 (in > /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) > ==236074== by 0x1B819AB7: MPID_nem_ofi_init (ofi_init.c:245) > ==236074== by 0x1B7B5C55: ??? 
(mpid_nem_init.c:231) > ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) > ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) > ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) > ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) > ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) > ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) > ==236074== by 0x40219D: main (one.c:335) > ==236074== > ==236074== Conditional jump or move depends on uninitialised value(s) > ==236074== at 0x218323C8: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x218343EC: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x2180E4F3: psm2_init (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x2112D7E6: psmx2_getinfo (in > /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) > ==236074== by 0x210AC753: fi_getinfo@@FABRIC_1.2 (in > /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) > ==236074== by 0x1B819AB7: MPID_nem_ofi_init (ofi_init.c:245) > ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) > ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) > ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) > ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) > ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) > ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) > ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) > ==236074== by 0x40219D: main (one.c:335) > ==236074== > ==236074== Use of uninitialised value of size 8 > ==236074== at 0x218323CF: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x218343EC: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x2180E4F3: psm2_init (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x2112D7E6: psmx2_getinfo (in > /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) > ==236074== by 0x210AC753: fi_getinfo@@FABRIC_1.2 (in > /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) > ==236074== by 0x1B819AB7: MPID_nem_ofi_init (ofi_init.c:245) > ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) > ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) > ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) > ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) > ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) > ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) > ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) > ==236074== by 0x40219D: main (one.c:335) > ==236074== > ==236074== Use of uninitialised value of size 8 > ==236074== at 0x218323E5: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x218343EC: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x2180E4F3: psm2_init (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x2112D7E6: psmx2_getinfo (in > /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) > ==236074== by 0x210AC753: fi_getinfo@@FABRIC_1.2 (in > /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) > ==236074== by 0x1B819AB7: MPID_nem_ofi_init (ofi_init.c:245) > ==236074== by 0x1B7B5C55: ??? 
(mpid_nem_init.c:231) > ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) > ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) > ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) > ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) > ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) > ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) > ==236074== by 0x40219D: main (one.c:335) > ==236074== > ==236074== Conditional jump or move depends on uninitialised value(s) > ==236074== at 0x2183B69A: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x218371EC: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x21837077: hfi_get_port_lid (in > /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x21836F9A: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x21834872: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x217F7F5D: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x217FDC29: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x217FCA91: psm2_ep_open (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x2112FB19: psmx2_trx_ctxt_alloc (in > /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) > ==236074== by 0x21138089: psmx2_ep_open (in > /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) > ==236074== by 0x1B81A6A1: fi_endpoint (fi_endpoint.h:155) > ==236074== by 0x1B81A6A1: MPID_nem_ofi_init (ofi_init.c:377) > ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) > ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) > ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) > ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) > ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) > ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) > ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) > ==236074== by 0x40219D: main (one.c:335) > ==236074== > ==236074== Conditional jump or move depends on uninitialised value(s) > ==236074== at 0x2183B7B8: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x218371EC: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x21837077: hfi_get_port_lid (in > /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x21836F9A: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x21834872: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x217F7F5D: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x217FDC29: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x217FCA91: psm2_ep_open (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x2112FB19: psmx2_trx_ctxt_alloc (in > /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) > ==236074== by 0x21138089: psmx2_ep_open (in > /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) > ==236074== by 0x1B81A6A1: fi_endpoint (fi_endpoint.h:155) > ==236074== by 0x1B81A6A1: MPID_nem_ofi_init (ofi_init.c:377) > ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) > ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) > ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) > ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) > ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) > ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) > ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) > ==236074== by 0x40219D: main (one.c:335) > ==236074== > ==236074== Conditional jump or move depends on uninitialised value(s) > ==236074== at 0x2183B69A: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x218371EC: ??? 
(in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x21837077: hfi_get_port_lid (in > /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x217F88C8: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x217FDC29: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x217FCA91: psm2_ep_open (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x2112FB19: psmx2_trx_ctxt_alloc (in > /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) > ==236074== by 0x21138089: psmx2_ep_open (in > /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) > ==236074== by 0x1B81A6A1: fi_endpoint (fi_endpoint.h:155) > ==236074== by 0x1B81A6A1: MPID_nem_ofi_init (ofi_init.c:377) > ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) > ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) > ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) > ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) > ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) > ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) > ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) > ==236074== by 0x40219D: main (one.c:335) > ==236074== > ==236074== Conditional jump or move depends on uninitialised value(s) > ==236074== at 0x2183B7B8: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x218371EC: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x21837077: hfi_get_port_lid (in > /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x217F88C8: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x217FDC29: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x217FCA91: psm2_ep_open (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x2112FB19: psmx2_trx_ctxt_alloc (in > /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) > ==236074== by 0x21138089: psmx2_ep_open (in > /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) > ==236074== by 0x1B81A6A1: fi_endpoint (fi_endpoint.h:155) > ==236074== by 0x1B81A6A1: MPID_nem_ofi_init (ofi_init.c:377) > ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) > ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) > ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) > ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) > ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) > ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) > ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) > ==236074== by 0x40219D: main (one.c:335) > ==236074== > ==236074== Conditional jump or move depends on uninitialised value(s) > ==236074== at 0x2183B69A: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x218371EC: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x21837077: hfi_get_port_lid (in > /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x217F916B: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x217FDC29: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x217FCA91: psm2_ep_open (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x2112FB19: psmx2_trx_ctxt_alloc (in > /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) > ==236074== by 0x21138089: psmx2_ep_open (in > /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) > ==236074== by 0x1B81A6A1: fi_endpoint (fi_endpoint.h:155) > ==236074== by 0x1B81A6A1: MPID_nem_ofi_init (ofi_init.c:377) > ==236074== by 0x1B7B5C55: ??? 
(mpid_nem_init.c:231) > ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) > ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) > ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) > ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) > ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) > ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) > ==236074== by 0x40219D: main (one.c:335) > ==236074== > ==236074== Conditional jump or move depends on uninitialised value(s) > ==236074== at 0x2183B7B8: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x218371EC: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x21837077: hfi_get_port_lid (in > /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x217F916B: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x217FDC29: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x217FCA91: psm2_ep_open (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x2112FB19: psmx2_trx_ctxt_alloc (in > /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) > ==236074== by 0x21138089: psmx2_ep_open (in > /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) > ==236074== by 0x1B81A6A1: fi_endpoint (fi_endpoint.h:155) > ==236074== by 0x1B81A6A1: MPID_nem_ofi_init (ofi_init.c:377) > ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) > ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) > ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) > ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) > ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) > ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) > ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) > ==236074== by 0x40219D: main (one.c:335) > ==236074== > ==236074== Conditional jump or move depends on uninitialised value(s) > ==236074== at 0x1B1DA260: __I_MPI___intel_sse2_strncmp (in > /opt/intel/compilers_and_libraries_2018.2.199/linux/mpi/intel64/lib/libmpifort.so.12.0) > ==236074== by 0x1B8CFBA1: ??? (simple_pmi.c:2376) > ==236074== by 0x1B8CBDAD: PMIi_InitIfSingleton (simple_pmi.c:2883) > ==236074== by 0x1B8CBDAD: iPMI_KVS_Get (simple_pmi.c:751) > ==236074== by 0x1B7CCC1E: ??? (mpidi_pg.c:949) > ==236074== by 0x1B817EAA: MPID_nem_ofi_post_init (ofi_init.c:1736) > ==236074== by 0x1B7B3575: MPID_nem_init_post (mpid_nem_init.c:1421) > ==236074== by 0x1B5806E3: MPIDI_CH3_Init (ch3_init.c:146) > ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) > ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) > ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) > ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) > ==236074== by 0x40219D: main (one.c:335) > ==236074== > ==236074== Conditional jump or move depends on uninitialised value(s) > ==236074== at 0x1B1DA383: __I_MPI___intel_sse2_strncmp (in > /opt/intel/compilers_and_libraries_2018.2.199/linux/mpi/intel64/lib/libmpifort.so.12.0) > ==236074== by 0x1B8CFBA1: ??? (simple_pmi.c:2376) > ==236074== by 0x1B8CBDAD: PMIi_InitIfSingleton (simple_pmi.c:2883) > ==236074== by 0x1B8CBDAD: iPMI_KVS_Get (simple_pmi.c:751) > ==236074== by 0x1B7CCC1E: ??? 
(mpidi_pg.c:949) > ==236074== by 0x1B817EAA: MPID_nem_ofi_post_init (ofi_init.c:1736) > ==236074== by 0x1B7B3575: MPID_nem_init_post (mpid_nem_init.c:1421) > ==236074== by 0x1B5806E3: MPIDI_CH3_Init (ch3_init.c:146) > ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) > ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) > ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) > ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) > ==236074== by 0x40219D: main (one.c:335) > ==236074== > ==236074== Conditional jump or move depends on uninitialised value(s) > ==236074== at 0x1E48032E: __intel_sse4_strcpy (in > /opt/intel/compilers_and_libraries_2018.2.199/linux/compiler/lib/intel64_lin/libirc.so) > ==236074== by 0x51FD8BE: PetscStrcpy (str.c:354) > ==236074== by 0x51FD7A3: PetscStrallocpy (str.c:188) > ==236074== by 0x52A39CE: PetscEventRegLogRegister (eventlog.c:313) > ==236074== by 0x527D89A: PetscLogEventRegister (plog.c:693) > ==236074== by 0x6A56A20: PCBDDCInitializePackage (bddc.c:3115) > ==236074== by 0x6E1A515: PCInitializePackage (dlregisksp.c:59) > ==236074== by 0x6DB1A86: PCCreate (precon.c:382) > ==236074== by 0x6E05167: KSPGetPC (itfunc.c:1837) > ==236074== by 0x6E0FC5C: KSPSetDM (iterativ.c:1150) > ==236074== by 0x6FDD27B: SNESSetDM (snes.c:5402) > ==236074== by 0x70B85F7: TSGetSNES (ts.c:2914) > ==236074== by 0x70BE430: TSSetDM (ts.c:4949) > ==236074== by 0x402496: main (one.c:378) > ==236074== > ==236074== Conditional jump or move depends on uninitialised value(s) > ==236074== at 0x1E4782BA: __intel_ssse3_strncpy (in > /opt/intel/compilers_and_libraries_2018.2.199/linux/compiler/lib/intel64_lin/libirc.so) > ==236074== by 0x51FFD24: PetscStrncpy (str.c:392) > ==236074== by 0x51FEB03: PetscStrreplace (str.c:1142) > ==236074== by 0x52C9958: PetscViewerFileSetName (filev.c:659) > ==236074== by 0x52B743B: PetscViewerVTKOpen (vtkv.c:279) > ==236074== by 0x70C76E6: TSMonitorSolutionVTK (ts.c:5580) > ==236074== by 0x40313C: FormFunction (one.c:120) > ==236074== by 0x7066531: TSComputeIFunction_DMDA (dmdats.c:82) > ==236074== by 0x70BA5EF: TSComputeIFunction (ts.c:857) > ==236074== by 0x711E2DC: SNESTSFormFunction_BDF (bdf.c:368) > ==236074== by 0x70C6E46: SNESTSFormFunction (ts.c:5014) > ==236074== by 0x6FDC8A6: SNESComputeFunction (snes.c:2383) > ==236074== by 0x7023556: SNESSolve_NEWTONTR (tr.c:297) > ==236074== by 0x6FD160F: SNESSolve (snes.c:4569) > ==236074== by 0x711917E: PetscMemcpy (bdf.c:223) > ==236074== by 0x711917E: PetscCitationsRegister (petscsys.h:2689) > ==236074== by 0x711917E: TSStep_BDF.A (bdf.c:265) > ==236074== by 0x70C363A: TSStep (ts.c:3757) > ==236074== by 0x70C1999: TSSolve (ts.c:4154) > ==236074== by 0x402594: main (one.c:391) > ==236074== > ==236074== Conditional jump or move depends on uninitialised value(s) > ==236074== at 0x1E4782BA: __intel_ssse3_strncpy (in > /opt/intel/compilers_and_libraries_2018.2.199/linux/compiler/lib/intel64_lin/libirc.so) > ==236074== by 0x51FFD24: PetscStrncpy (str.c:392) > ==236074== by 0x51FEB03: PetscStrreplace (str.c:1142) > ==236074== by 0x5224E4B: PetscFOpen (mpiuopen.c:52) > ==236074== by 0x63A074B: DMDAVTKWriteAll_VTS.A (grvtk.c:72) > ==236074== by 0x639A589: DMDAVTKWriteAll (grvtk.c:545) > ==236074== by 0x52B66F3: PetscViewerFlush_VTK (vtkv.c:100) > ==236074== by 0x52CFAAE: PetscViewerFlush (flush.c:26) > ==236074== by 0x52CEA95: PetscViewerDestroy (view.c:113) > ==236074== by 0x70C7717: TSMonitorSolutionVTK (ts.c:5582) > ==236074== by 0x40313C: FormFunction (one.c:120) > ==236074== by 
0x7066531: TSComputeIFunction_DMDA (dmdats.c:82) > ==236074== by 0x70BA5EF: TSComputeIFunction (ts.c:857) > ==236074== by 0x711E2DC: SNESTSFormFunction_BDF (bdf.c:368) > ==236074== by 0x70C6E46: SNESTSFormFunction (ts.c:5014) > ==236074== by 0x6FDC8A6: SNESComputeFunction (snes.c:2383) > ==236074== by 0x7023556: SNESSolve_NEWTONTR (tr.c:297) > ==236074== by 0x6FD160F: SNESSolve (snes.c:4569) > ==236074== by 0x711917E: PetscMemcpy (bdf.c:223) > ==236074== by 0x711917E: PetscCitationsRegister (petscsys.h:2689) > ==236074== by 0x711917E: TSStep_BDF.A (bdf.c:265) > ==236074== by 0x70C363A: TSStep (ts.c:3757) > ==236074== > ==236074== Conditional jump or move depends on uninitialised value(s) > ==236074== at 0x5F10977: MatFDColoringSetUpBlocked_AIJ_Private > (fdaij.c:146) > ==236074== by 0x5F10977: MatFDColoringSetUp_SeqXAIJ.A (fdaij.c:284) > ==236074== by 0x585892A: MatFDColoringSetUp (fdmatrix.c:242) > ==236074== by 0x6FE5037: SNESComputeJacobianDefaultColor (snesj2.c:79) > ==236074== by 0x6FC8E4E: SNESComputeJacobian (snes.c:2717) > ==236074== by 0x70236E7: SNESSolve_NEWTONTR (tr.c:324) > ==236074== by 0x6FD160F: SNESSolve (snes.c:4569) > ==236074== by 0x711917E: PetscMemcpy (bdf.c:223) > ==236074== by 0x711917E: PetscCitationsRegister (petscsys.h:2689) > ==236074== by 0x711917E: TSStep_BDF.A (bdf.c:265) > ==236074== by 0x70C363A: TSStep (ts.c:3757) > ==236074== by 0x70C1999: TSSolve (ts.c:4154) > ==236074== by 0x402594: main (one.c:391) > ==236074== > ==236074== Invalid write of size 4 > ==236074== at 0x5F10983: MatFDColoringSetUpBlocked_AIJ_Private > (fdaij.c:150) > ==236074== by 0x5F10983: MatFDColoringSetUp_SeqXAIJ.A (fdaij.c:284) > ==236074== by 0x585892A: MatFDColoringSetUp (fdmatrix.c:242) > ==236074== by 0x6FE5037: SNESComputeJacobianDefaultColor (snesj2.c:79) > ==236074== by 0x6FC8E4E: SNESComputeJacobian (snes.c:2717) > ==236074== by 0x70236E7: SNESSolve_NEWTONTR (tr.c:324) > ==236074== by 0x6FD160F: SNESSolve (snes.c:4569) > ==236074== by 0x711917E: PetscMemcpy (bdf.c:223) > ==236074== by 0x711917E: PetscCitationsRegister (petscsys.h:2689) > ==236074== by 0x711917E: TSStep_BDF.A (bdf.c:265) > ==236074== by 0x70C363A: TSStep (ts.c:3757) > ==236074== by 0x70C1999: TSSolve (ts.c:4154) > ==236074== by 0x402594: main (one.c:391) > ==236074== Address 0x3a94fa80 is 0 bytes after a block of size 73,960,000 > alloc'd > ==236074== at 0x4C2C480: memalign (vg_replace_malloc.c:909) > ==236074== by 0x522FFE2: PetscMallocAlign (mal.c:52) > ==236074== by 0x52305F9: PetscMallocA (mal.c:418) > ==236074== by 0x5F10778: MatFDColoringSetUpBlocked_AIJ_Private > (fdaij.c:125) > ==236074== by 0x5F10778: MatFDColoringSetUp_SeqXAIJ.A (fdaij.c:284) > ==236074== by 0x585892A: MatFDColoringSetUp (fdmatrix.c:242) > ==236074== by 0x6FE5037: SNESComputeJacobianDefaultColor (snesj2.c:79) > ==236074== by 0x6FC8E4E: SNESComputeJacobian (snes.c:2717) > ==236074== by 0x70236E7: SNESSolve_NEWTONTR (tr.c:324) > ==236074== by 0x6FD160F: SNESSolve (snes.c:4569) > ==236074== by 0x711917E: PetscMemcpy (bdf.c:223) > ==236074== by 0x711917E: PetscCitationsRegister (petscsys.h:2689) > ==236074== by 0x711917E: TSStep_BDF.A (bdf.c:265) > ==236074== by 0x70C363A: TSStep (ts.c:3757) > ==236074== by 0x70C1999: TSSolve (ts.c:4154) > ==236074== by 0x402594: main (one.c:391) > ==236074== > ==236074== Invalid write of size 8 > ==236074== at 0x5F10991: MatFDColoringSetUpBlocked_AIJ_Private > (fdaij.c:151) > ==236074== by 0x5F10991: MatFDColoringSetUp_SeqXAIJ.A (fdaij.c:284) > ==236074== by 0x585892A: MatFDColoringSetUp (fdmatrix.c:242) 
> ==236074== by 0x6FE5037: SNESComputeJacobianDefaultColor (snesj2.c:79) > ==236074== by 0x6FC8E4E: SNESComputeJacobian (snes.c:2717) > ==236074== by 0x70236E7: SNESSolve_NEWTONTR (tr.c:324) > ==236074== by 0x6FD160F: SNESSolve (snes.c:4569) > ==236074== by 0x711917E: PetscMemcpy (bdf.c:223) > ==236074== by 0x711917E: PetscCitationsRegister (petscsys.h:2689) > ==236074== by 0x711917E: TSStep_BDF.A (bdf.c:265) > ==236074== by 0x70C363A: TSStep (ts.c:3757) > ==236074== by 0x70C1999: TSSolve (ts.c:4154) > ==236074== by 0x402594: main (one.c:391) > ==236074== Address 0x3a94fa88 is 8 bytes after a block of size 73,960,000 > alloc'd > ==236074== at 0x4C2C480: memalign (vg_replace_malloc.c:909) > ==236074== by 0x522FFE2: PetscMallocAlign (mal.c:52) > ==236074== by 0x52305F9: PetscMallocA (mal.c:418) > ==236074== by 0x5F10778: MatFDColoringSetUpBlocked_AIJ_Private > (fdaij.c:125) > ==236074== by 0x5F10778: MatFDColoringSetUp_SeqXAIJ.A (fdaij.c:284) > ==236074== by 0x585892A: MatFDColoringSetUp (fdmatrix.c:242) > ==236074== by 0x6FE5037: SNESComputeJacobianDefaultColor (snesj2.c:79) > ==236074== by 0x6FC8E4E: SNESComputeJacobian (snes.c:2717) > ==236074== by 0x70236E7: SNESSolve_NEWTONTR (tr.c:324) > ==236074== by 0x6FD160F: SNESSolve (snes.c:4569) > ==236074== by 0x711917E: PetscMemcpy (bdf.c:223) > ==236074== by 0x711917E: PetscCitationsRegister (petscsys.h:2689) > ==236074== by 0x711917E: TSStep_BDF.A (bdf.c:265) > ==236074== by 0x70C363A: TSStep (ts.c:3757) > ==236074== by 0x70C1999: TSSolve (ts.c:4154) > ==236074== by 0x402594: main (one.c:391) > ==236074== > > > Sent from Mail for > Windows > > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From edoardo.alinovi at gmail.com Fri Oct 21 09:52:04 2022 From: edoardo.alinovi at gmail.com (Edoardo alinovi) Date: Fri, 21 Oct 2022 15:52:04 +0100 Subject: [petsc-users] Does flagging a matrix as symmetric improving performances? Message-ID: Hi PETSc friends, As per object, do you think that flagging a matrix as symmetric might improve setup times of the preconditioner? Thank you as always. -------------- next part -------------- An HTML attachment was scrubbed... URL: From bsmith at petsc.dev Fri Oct 21 09:54:49 2022 From: bsmith at petsc.dev (Barry Smith) Date: Fri, 21 Oct 2022 10:54:49 -0400 Subject: [petsc-users] [External] Periodic boundary condition In-Reply-To: References: Message-ID: The problem with the output below is it is not giving a clear indication where the crash occurred. #1 User provided function() line 0 in unknown file Run with the exact same options but also -start_in_debugger noxterm It should then crash in the debugger and you can type bt to see the backtrace of where it crashed, send that output. Barry Background: MatFDColoringSetUpBlocked_AIJ_Private() allocates the space that is used when evaluating the function multiple times to get the Jacobian entries. If the FormFunction writes into incorrect locations, then it will corrupt this memory that was allocated in MatFDColoringSetUpBlocked_AIJ_Private() . It does not mean necessarily that there is anything wrong in MatFDColoringSetUpBlocked_AIJ_Private() > On Oct 21, 2022, at 12:32 AM, Sepideh Kavousi wrote: > > Barry, > I ran the code with -malloc_debug and added CHKMEMQ for all the lines inside formfunction. 
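For reference, a bare sketch (not the poster's one.c; the Field layout, AppCtx and hx are placeholder assumptions) of how a DMDA residual routine with a periodic x-direction and stencil width 2 can be written and instrumented with CHKMEMQ:

#include <petscdmda.h>
#include <petscts.h>

typedef struct { PetscScalar U, vx, vy, p, T; } Field;  /* placeholder 5-dof layout */
typedef struct { PetscReal hx, hy; } AppCtx;            /* placeholder user context */

/* Callback form used with DMDATSSetIFunctionLocal().  The DMDA is assumed to
   be created with a periodic x-boundary and stencil width 2, e.g.
     DMDACreate2d(PETSC_COMM_WORLD, DM_BOUNDARY_PERIODIC, DM_BOUNDARY_NONE,
                  DMDA_STENCIL_BOX, Nx, Ny, PETSC_DECIDE, PETSC_DECIDE,
                  5, 2, NULL, NULL, &da);
   so x[j][i-2]..x[j][i+2] are valid ghosted reads with no special-casing of
   i=0 or i=Nx-1; the non-periodic y-boundary rows still need their own
   treatment and are omitted here. */
static PetscErrorCode FormIFunctionLocal(DMDALocalInfo *info, PetscReal t,
                                         Field **x, Field **xdot, Field **f, void *ptr)
{
  AppCtx  *user = (AppCtx*)ptr;
  PetscInt i, j;

  PetscFunctionBeginUser;
  for (j = info->ys; j < info->ys + info->ym; j++) {
    for (i = info->xs; i < info->xs + info->xm; i++) {
      /* reads may reach into the ghost region; writes go only to the owned f[j][i] */
      f[j][i].U = xdot[j][i].U
                - (x[j][i+2].U - 2.0*x[j][i].U + x[j][i-2].U)/(4.0*user->hx*user->hx);
      /* ... remaining components ... */
      CHKMEMQ;  /* with -malloc_debug, runs PetscMallocValidate() and flags heap
                   corruption close to the statement that caused it */
    }
  }
  PetscFunctionReturn(0);
}
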
Following is the detail of error. > Best, > Sepideh > > [0]PETSC ERROR: ------------------------------------------------------------------------ > [0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range > [0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger > [0]PETSC ERROR: or see https://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind > [0]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors > [0]PETSC ERROR: configure using --with-debugging=yes, recompile, link, and run > [0]PETSC ERROR: to get more information on the crash. > [0]PETSC ERROR: PetscMallocValidate: error detected at PetscError() line 401 in /home1/apps/intel18/impi18_0/petsc/3.14/src/sys/error/err.c > [0]PETSC ERROR: Memory [id=0(73960000)] at address 0x2b5aed6ab050 is corrupted (probably write past end of array) > [0]PETSC ERROR: Memory originally allocated in MatFDColoringSetUpBlocked_AIJ_Private() line 125 in /home1/apps/intel18/impi18_0/petsc/3.14/src/mat/impls/aij/seq/fdaij.c > [0]PETSC ERROR: --------------------- Error Message -------------------------------------------------------------- > [0]PETSC ERROR: Signal received > [0]PETSC ERROR: See https://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting. > [0]PETSC ERROR: Petsc Release Version 3.14.2, Dec 03, 2020 > [0]PETSC ERROR: ./one.out on a skylake named c415-063.stampede2.tacc.utexas.edu by tg863649 Thu Oct 20 23:30:05 2022 > [0]PETSC ERROR: Configure options --with-x=0 -with-pic --with-make-np=12 --download-petsc4py=1 --with-python-exec=/opt/apps/intel18/python2/2.7.16/bin/python2 --with-packages-build-dir=/tmp/petsc-3.14/skylake --with-mpi=1 --with-mpi-dir=/opt/intel/compilers_and_libraries_2018.2.199/linux/mpi/intel64 --with-scalar-type=real --with-shared-libraries=1 --with-precision=double --with-chaco=1 --download-chaco --with-hypre=1 --download-hypre --with-parmetis=1 --download-parmetis --with-metis=1 --download-metis --with-plapack=1 --download-plapack --with-spai=1 --download-spai --with-sundials=1 --download-sundials --with-elemental=1 --download-elemental --with-cxx-dialect=C++11 --with-parmetis=1 --download-parmetis --with-metis=1 --download-metis --with-fftw=1 --with-fftw-dir=/opt/apps/intel18/impi18_0/fftw3/3.3.8 --with-hdf5=1 --with-hdf5-dir=/opt/apps/intel18/impi18_0/phdf5/1.10.4/x86_64 --download-hpddm --download-slepc --with-mumps=1 --download-mumps --with-parmetis=1 --download-parmetis --with-metis=1 --download-metis --with-scalapack=1 --download-scalapack --with-blacs=1 --download-blacs --with-spooles=1 --download-spooles --with-suitesparse=1 --download-suitesparse --with-superlu_dist=1 --download-superlu_dist --with-superlu=1 --download-superlu --with-parmetis=1 --download-parmetis --with-metis=1 --download-metis --with-zoltan=1 --download-zoltan=1 --download-ptscotch=1 --with-debugging=no --LIBS= --with-blaslapack-dir=/opt/intel/compilers_and_libraries_2018.2.199/linux/mkl COPTFLAGS="-xCORE-AVX2 -axMIC-AVX512,CORE-AVX512 -O2 -g" FOPTFLAGS="-xCORE-AVX2 -axMIC-AVX512,CORE-AVX512 -O2 -g" CXXOPTFLAGS="-xCORE-AVX2 -axMIC-AVX512,CORE-AVX512 -O2 -g" > [0]PETSC ERROR: #1 User provided function() line 0 in unknown file > [0]PETSC ERROR: Checking the memory for corruption. 
> [0]PETSC ERROR: PetscMallocValidate: error detected at PetscSignalHandlerDefault() line 170 in /home1/apps/intel18/impi18_0/petsc/3.14/src/sys/error/signal.c > [0]PETSC ERROR: Memory [id=0(73960000)] at address 0x2b5aed6ab050 is corrupted (probably write past end of array) > [0]PETSC ERROR: Memory originally allocated in MatFDColoringSetUpBlocked_AIJ_Private() line 125 in /home1/apps/intel18/impi18_0/petsc/3.14/src/mat/impls/aij/seq/fdaij.c > application called MPI_Abort(MPI_COMM_WORLD, 50176059) - process 0 > [unset]: readline failed > > > > > Sent from Mail for Windows > > From: Barry Smith > Sent: Thursday, October 20, 2022 10:27 PM > To: Sepideh Kavousi > Cc: petsc-users at mcs.anl.gov > Subject: [External] Re: [petsc-users] Periodic boundary condition > > > Some of the valgrind information does not appear to make sense > > PetscMemcpy() is not calling SNESSolve() so I suspect there must be some serious corruption of something to this impossible stack trace > > ==236074== by 0x6FD160F: SNESSolve (snes.c:4569) > ==236074== by 0x711917E: PetscMemcpy (bdf.c:223) > > From > > ==236074== Conditional jump or move depends on uninitialised value(s) > ==236074== at 0x5F10977: MatFDColoringSetUpBlocked_AIJ_Private (fdaij.c:146) > ==236074== by 0x5F10977: MatFDColoringSetUp_SeqXAIJ.A (fdaij.c:284) > ==236074== by 0x585892A: MatFDColoringSetUp (fdmatrix.c:242) > ==236074== by 0x6FE5037: SNESComputeJacobianDefaultColor (snesj2.c:79) > ==236074== by 0x6FC8E4E: SNESComputeJacobian (snes.c:2717) > ==236074== by 0x70236E7: SNESSolve_NEWTONTR (tr.c:324) > ==236074== by 0x6FD160F: SNESSolve (snes.c:4569) > ==236074== by 0x711917E: PetscMemcpy (bdf.c:223) > ==236074== by 0x711917E: PetscCitationsRegister (petscsys.h:2689) > ==236074== by 0x711917E: TSStep_BDF.A (bdf.c:265) > ==236074== by 0x70C363A: TSStep (ts.c:3757) > ==236074== by 0x70C1999: TSSolve (ts.c:4154) > ==236074== by 0x402594: main (one.c:391) > > I suggest you run with -malloc_debug instead of valgrind and see if any errors are reported. If so you can add the macro CHKMEMQ; inside your function evaluation where you write to memory to see if anything is writing to the wrong location. For example wherever you assign aF such as > > aF[j][i].vx=(x3+x4+x5+x6+x7+x8+x9-x1-x2)*user->hx; > > this can help you determine the exact line number where you are writing to the wrong location and determine what might be the cause. > > > > On Oct 20, 2022, at 6:45 PM, Sepideh Kavousi > wrote: > > Hello, > I want to solve my 5 PDEs based on finite difference method using periodic BC in x-direction and non-periodic in y-direction but I run into error (Segmentation Violation, probably memory access out of range). > For this, I discretize my equation in FormFunction function. My PDE discretization in (i,j) node needs data on (i+1,j), (i+2,j), (i-1,j), (i-2,j), (i,j+1), (i,j+2), (i,j-1), (i,j-2) points. > In my previous codes that the x-direction was non-periodic (no flux) boundary condition, I: > i) implemented the no flux BC for i=0 and i=Nx-1, > ii) set i+2= Nx-1 in discretizing (Nx-2,j) and i+2= 0 in discretizing (1,j) > iii) discretized my equation for i=1..Nx-2. > I am not sure how I should do the periodic BC. From the following discussions (https://lists.mcs.anl.gov/pipermail/petsc-users/2012-May/013476.html andhttps://lists.mcs.anl.gov/pipermail/petsc-users/2016-May/029273.html ), I guess I should not do step (i) (stated above) for the x-boundaries and just do step (iii) for i=0..Nx-1. 
If I just focus on solving 2 of the PDEs which does need data on (i+2,j), (i-2,j), (i,j+2), (i,j-2) points for discretizing equation in (i,j) node, I still run into error: > Running with Valgrind (just 1 processor) gave the following file. I did not find any information which gives me hint on the error source. > Can you please help me to find the error? > Best, > Sepideh > > ==236074== Conditional jump or move depends on uninitialised value(s) > ==236074== at 0x4C29E39: malloc (vg_replace_malloc.c:309) > ==236074== by 0x1B79E59B: MPID_Init (mpid_init.c:1649) > ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) > ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) > ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) > ==236074== by 0x40219D: main (one.c:335) > ==236074== > ==236074== Conditional jump or move depends on uninitialised value(s) > ==236074== at 0x2183B805: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x218323C1: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x218341C7: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x400F9C2: _dl_init (in /usr/lib64/ld-2.17.so) > ==236074== by 0x401459D: dl_open_worker (in /usr/lib64/ld-2.17.so) > ==236074== by 0x400F7D3: _dl_catch_error (in /usr/lib64/ld-2.17.so) > ==236074== by 0x4013B8A: _dl_open (in /usr/lib64/ld-2.17.so) > ==236074== by 0x1AEA4FAA: dlopen_doit (in /usr/lib64/libdl-2.17.so) > ==236074== by 0x400F7D3: _dl_catch_error (in /usr/lib64/ld-2.17.so) > ==236074== by 0x1AEA55AC: _dlerror_run (in /usr/lib64/libdl-2.17.so) > ==236074== by 0x1AEA5040: dlopen@@GLIBC_2.2.5 (in /usr/lib64/libdl-2.17.so) > ==236074== by 0x1B8198DC: MPID_nem_ofi_init (ofi_init.c:158) > ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) > ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) > ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) > ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) > ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) > ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) > ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) > ==236074== by 0x40219D: main (one.c:335) > ==236074== > ==236074== Conditional jump or move depends on uninitialised value(s) > ==236074== at 0x2183B810: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x218323C1: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x218341C7: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x400F9C2: _dl_init (in /usr/lib64/ld-2.17.so) > ==236074== by 0x401459D: dl_open_worker (in /usr/lib64/ld-2.17.so) > ==236074== by 0x400F7D3: _dl_catch_error (in /usr/lib64/ld-2.17.so) > ==236074== by 0x4013B8A: _dl_open (in /usr/lib64/ld-2.17.so) > ==236074== by 0x1AEA4FAA: dlopen_doit (in /usr/lib64/libdl-2.17.so) > ==236074== by 0x400F7D3: _dl_catch_error (in /usr/lib64/ld-2.17.so) > ==236074== by 0x1AEA55AC: _dlerror_run (in /usr/lib64/libdl-2.17.so) > ==236074== by 0x1AEA5040: dlopen@@GLIBC_2.2.5 (in /usr/lib64/libdl-2.17.so) > ==236074== by 0x1B8198DC: MPID_nem_ofi_init (ofi_init.c:158) > ==236074== by 0x1B7B5C55: ??? 
(mpid_nem_init.c:231) > ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) > ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) > ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) > ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) > ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) > ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) > ==236074== by 0x40219D: main (one.c:335) > ==236074== > ==236074== Conditional jump or move depends on uninitialised value(s) > ==236074== at 0x218323C8: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x218341C7: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x400F9C2: _dl_init (in /usr/lib64/ld-2.17.so) > ==236074== by 0x401459D: dl_open_worker (in /usr/lib64/ld-2.17.so) > ==236074== by 0x400F7D3: _dl_catch_error (in /usr/lib64/ld-2.17.so) > ==236074== by 0x4013B8A: _dl_open (in /usr/lib64/ld-2.17.so) > ==236074== by 0x1AEA4FAA: dlopen_doit (in /usr/lib64/libdl-2.17.so) > ==236074== by 0x400F7D3: _dl_catch_error (in /usr/lib64/ld-2.17.so) > ==236074== by 0x1AEA55AC: _dlerror_run (in /usr/lib64/libdl-2.17.so) > ==236074== by 0x1AEA5040: dlopen@@GLIBC_2.2.5 (in /usr/lib64/libdl-2.17.so) > ==236074== by 0x1B8198DC: MPID_nem_ofi_init (ofi_init.c:158) > ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) > ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) > ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) > ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) > ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) > ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) > ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) > ==236074== by 0x40219D: main (one.c:335) > ==236074== > ==236074== Use of uninitialised value of size 8 > ==236074== at 0x218323CF: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x218341C7: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x400F9C2: _dl_init (in /usr/lib64/ld-2.17.so) > ==236074== by 0x401459D: dl_open_worker (in /usr/lib64/ld-2.17.so) > ==236074== by 0x400F7D3: _dl_catch_error (in /usr/lib64/ld-2.17.so) > ==236074== by 0x4013B8A: _dl_open (in /usr/lib64/ld-2.17.so) > ==236074== by 0x1AEA4FAA: dlopen_doit (in /usr/lib64/libdl-2.17.so) > ==236074== by 0x400F7D3: _dl_catch_error (in /usr/lib64/ld-2.17.so) > ==236074== by 0x1AEA55AC: _dlerror_run (in /usr/lib64/libdl-2.17.so) > ==236074== by 0x1AEA5040: dlopen@@GLIBC_2.2.5 (in /usr/lib64/libdl-2.17.so) > ==236074== by 0x1B8198DC: MPID_nem_ofi_init (ofi_init.c:158) > ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) > ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) > ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) > ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) > ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) > ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) > ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) > ==236074== by 0x40219D: main (one.c:335) > ==236074== > ==236074== Use of uninitialised value of size 8 > ==236074== at 0x218323E5: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x218341C7: ??? 
(in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x400F9C2: _dl_init (in /usr/lib64/ld-2.17.so) > ==236074== by 0x401459D: dl_open_worker (in /usr/lib64/ld-2.17.so) > ==236074== by 0x400F7D3: _dl_catch_error (in /usr/lib64/ld-2.17.so) > ==236074== by 0x4013B8A: _dl_open (in /usr/lib64/ld-2.17.so) > ==236074== by 0x1AEA4FAA: dlopen_doit (in /usr/lib64/libdl-2.17.so) > ==236074== by 0x400F7D3: _dl_catch_error (in /usr/lib64/ld-2.17.so) > ==236074== by 0x1AEA55AC: _dlerror_run (in /usr/lib64/libdl-2.17.so) > ==236074== by 0x1AEA5040: dlopen@@GLIBC_2.2.5 (in /usr/lib64/libdl-2.17.so) > ==236074== by 0x1B8198DC: MPID_nem_ofi_init (ofi_init.c:158) > ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) > ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) > ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) > ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) > ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) > ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) > ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) > ==236074== by 0x40219D: main (one.c:335) > ==236074== > ==236074== Conditional jump or move depends on uninitialised value(s) > ==236074== at 0x2183B805: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x218323C1: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x218343EC: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x2180E4F3: psm2_init (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x2112D7E6: psmx2_getinfo (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) > ==236074== by 0x210AC753: fi_getinfo@@FABRIC_1.2 (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) > ==236074== by 0x1B819AB7: MPID_nem_ofi_init (ofi_init.c:245) > ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) > ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) > ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) > ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) > ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) > ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) > ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) > ==236074== by 0x40219D: main (one.c:335) > ==236074== > ==236074== Conditional jump or move depends on uninitialised value(s) > ==236074== at 0x2183B810: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x218323C1: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x218343EC: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x2180E4F3: psm2_init (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x2112D7E6: psmx2_getinfo (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) > ==236074== by 0x210AC753: fi_getinfo@@FABRIC_1.2 (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) > ==236074== by 0x1B819AB7: MPID_nem_ofi_init (ofi_init.c:245) > ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) > ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) > ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) > ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) > ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) > ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) > ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) > ==236074== by 0x40219D: main (one.c:335) > ==236074== > ==236074== Conditional jump or move depends on uninitialised value(s) > ==236074== at 0x218323C8: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x218343EC: ??? 
(in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x2180E4F3: psm2_init (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x2112D7E6: psmx2_getinfo (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) > ==236074== by 0x210AC753: fi_getinfo@@FABRIC_1.2 (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) > ==236074== by 0x1B819AB7: MPID_nem_ofi_init (ofi_init.c:245) > ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) > ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) > ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) > ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) > ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) > ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) > ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) > ==236074== by 0x40219D: main (one.c:335) > ==236074== > ==236074== Use of uninitialised value of size 8 > ==236074== at 0x218323CF: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x218343EC: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x2180E4F3: psm2_init (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x2112D7E6: psmx2_getinfo (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) > ==236074== by 0x210AC753: fi_getinfo@@FABRIC_1.2 (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) > ==236074== by 0x1B819AB7: MPID_nem_ofi_init (ofi_init.c:245) > ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) > ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) > ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) > ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) > ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) > ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) > ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) > ==236074== by 0x40219D: main (one.c:335) > ==236074== > ==236074== Use of uninitialised value of size 8 > ==236074== at 0x218323E5: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x218343EC: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x2180E4F3: psm2_init (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x2112D7E6: psmx2_getinfo (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) > ==236074== by 0x210AC753: fi_getinfo@@FABRIC_1.2 (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) > ==236074== by 0x1B819AB7: MPID_nem_ofi_init (ofi_init.c:245) > ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) > ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) > ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) > ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) > ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) > ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) > ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) > ==236074== by 0x40219D: main (one.c:335) > ==236074== > ==236074== Conditional jump or move depends on uninitialised value(s) > ==236074== at 0x2183B69A: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x218371EC: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x21837077: hfi_get_port_lid (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x21836F9A: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x21834872: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x217F7F5D: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x217FDC29: ??? 
(in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x217FCA91: psm2_ep_open (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x2112FB19: psmx2_trx_ctxt_alloc (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) > ==236074== by 0x21138089: psmx2_ep_open (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) > ==236074== by 0x1B81A6A1: fi_endpoint (fi_endpoint.h:155) > ==236074== by 0x1B81A6A1: MPID_nem_ofi_init (ofi_init.c:377) > ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) > ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) > ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) > ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) > ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) > ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) > ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) > ==236074== by 0x40219D: main (one.c:335) > ==236074== > ==236074== Conditional jump or move depends on uninitialised value(s) > ==236074== at 0x2183B7B8: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x218371EC: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x21837077: hfi_get_port_lid (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x21836F9A: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x21834872: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x217F7F5D: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x217FDC29: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x217FCA91: psm2_ep_open (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x2112FB19: psmx2_trx_ctxt_alloc (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) > ==236074== by 0x21138089: psmx2_ep_open (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) > ==236074== by 0x1B81A6A1: fi_endpoint (fi_endpoint.h:155) > ==236074== by 0x1B81A6A1: MPID_nem_ofi_init (ofi_init.c:377) > ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) > ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) > ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) > ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) > ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) > ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) > ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) > ==236074== by 0x40219D: main (one.c:335) > ==236074== > ==236074== Conditional jump or move depends on uninitialised value(s) > ==236074== at 0x2183B69A: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x218371EC: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x21837077: hfi_get_port_lid (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x217F88C8: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x217FDC29: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x217FCA91: psm2_ep_open (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x2112FB19: psmx2_trx_ctxt_alloc (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) > ==236074== by 0x21138089: psmx2_ep_open (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) > ==236074== by 0x1B81A6A1: fi_endpoint (fi_endpoint.h:155) > ==236074== by 0x1B81A6A1: MPID_nem_ofi_init (ofi_init.c:377) > ==236074== by 0x1B7B5C55: ??? 
(mpid_nem_init.c:231) > ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) > ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) > ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) > ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) > ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) > ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) > ==236074== by 0x40219D: main (one.c:335) > ==236074== > ==236074== Conditional jump or move depends on uninitialised value(s) > ==236074== at 0x2183B7B8: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x218371EC: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x21837077: hfi_get_port_lid (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x217F88C8: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x217FDC29: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x217FCA91: psm2_ep_open (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x2112FB19: psmx2_trx_ctxt_alloc (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) > ==236074== by 0x21138089: psmx2_ep_open (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) > ==236074== by 0x1B81A6A1: fi_endpoint (fi_endpoint.h:155) > ==236074== by 0x1B81A6A1: MPID_nem_ofi_init (ofi_init.c:377) > ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) > ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) > ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) > ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) > ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) > ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) > ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) > ==236074== by 0x40219D: main (one.c:335) > ==236074== > ==236074== Conditional jump or move depends on uninitialised value(s) > ==236074== at 0x2183B69A: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x218371EC: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x21837077: hfi_get_port_lid (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x217F916B: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x217FDC29: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x217FCA91: psm2_ep_open (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x2112FB19: psmx2_trx_ctxt_alloc (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) > ==236074== by 0x21138089: psmx2_ep_open (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) > ==236074== by 0x1B81A6A1: fi_endpoint (fi_endpoint.h:155) > ==236074== by 0x1B81A6A1: MPID_nem_ofi_init (ofi_init.c:377) > ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) > ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) > ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) > ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) > ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) > ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) > ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) > ==236074== by 0x40219D: main (one.c:335) > ==236074== > ==236074== Conditional jump or move depends on uninitialised value(s) > ==236074== at 0x2183B7B8: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x218371EC: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x21837077: hfi_get_port_lid (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x217F916B: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x217FDC29: ??? 
(in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x217FCA91: psm2_ep_open (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x2112FB19: psmx2_trx_ctxt_alloc (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) > ==236074== by 0x21138089: psmx2_ep_open (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) > ==236074== by 0x1B81A6A1: fi_endpoint (fi_endpoint.h:155) > ==236074== by 0x1B81A6A1: MPID_nem_ofi_init (ofi_init.c:377) > ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) > ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) > ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) > ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) > ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) > ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) > ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) > ==236074== by 0x40219D: main (one.c:335) > ==236074== > ==236074== Conditional jump or move depends on uninitialised value(s) > ==236074== at 0x1B1DA260: __I_MPI___intel_sse2_strncmp (in /opt/intel/compilers_and_libraries_2018.2.199/linux/mpi/intel64/lib/libmpifort.so.12.0) > ==236074== by 0x1B8CFBA1: ??? (simple_pmi.c:2376) > ==236074== by 0x1B8CBDAD: PMIi_InitIfSingleton (simple_pmi.c:2883) > ==236074== by 0x1B8CBDAD: iPMI_KVS_Get (simple_pmi.c:751) > ==236074== by 0x1B7CCC1E: ??? (mpidi_pg.c:949) > ==236074== by 0x1B817EAA: MPID_nem_ofi_post_init (ofi_init.c:1736) > ==236074== by 0x1B7B3575: MPID_nem_init_post (mpid_nem_init.c:1421) > ==236074== by 0x1B5806E3: MPIDI_CH3_Init (ch3_init.c:146) > ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) > ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) > ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) > ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) > ==236074== by 0x40219D: main (one.c:335) > ==236074== > ==236074== Conditional jump or move depends on uninitialised value(s) > ==236074== at 0x1B1DA383: __I_MPI___intel_sse2_strncmp (in /opt/intel/compilers_and_libraries_2018.2.199/linux/mpi/intel64/lib/libmpifort.so.12.0) > ==236074== by 0x1B8CFBA1: ??? (simple_pmi.c:2376) > ==236074== by 0x1B8CBDAD: PMIi_InitIfSingleton (simple_pmi.c:2883) > ==236074== by 0x1B8CBDAD: iPMI_KVS_Get (simple_pmi.c:751) > ==236074== by 0x1B7CCC1E: ??? 
(mpidi_pg.c:949) > ==236074== by 0x1B817EAA: MPID_nem_ofi_post_init (ofi_init.c:1736) > ==236074== by 0x1B7B3575: MPID_nem_init_post (mpid_nem_init.c:1421) > ==236074== by 0x1B5806E3: MPIDI_CH3_Init (ch3_init.c:146) > ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) > ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) > ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) > ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) > ==236074== by 0x40219D: main (one.c:335) > ==236074== > ==236074== Conditional jump or move depends on uninitialised value(s) > ==236074== at 0x1E48032E: __intel_sse4_strcpy (in /opt/intel/compilers_and_libraries_2018.2.199/linux/compiler/lib/intel64_lin/libirc.so) > ==236074== by 0x51FD8BE: PetscStrcpy (str.c:354) > ==236074== by 0x51FD7A3: PetscStrallocpy (str.c:188) > ==236074== by 0x52A39CE: PetscEventRegLogRegister (eventlog.c:313) > ==236074== by 0x527D89A: PetscLogEventRegister (plog.c:693) > ==236074== by 0x6A56A20: PCBDDCInitializePackage (bddc.c:3115) > ==236074== by 0x6E1A515: PCInitializePackage (dlregisksp.c:59) > ==236074== by 0x6DB1A86: PCCreate (precon.c:382) > ==236074== by 0x6E05167: KSPGetPC (itfunc.c:1837) > ==236074== by 0x6E0FC5C: KSPSetDM (iterativ.c:1150) > ==236074== by 0x6FDD27B: SNESSetDM (snes.c:5402) > ==236074== by 0x70B85F7: TSGetSNES (ts.c:2914) > ==236074== by 0x70BE430: TSSetDM (ts.c:4949) > ==236074== by 0x402496: main (one.c:378) > ==236074== > ==236074== Conditional jump or move depends on uninitialised value(s) > ==236074== at 0x1E4782BA: __intel_ssse3_strncpy (in /opt/intel/compilers_and_libraries_2018.2.199/linux/compiler/lib/intel64_lin/libirc.so) > ==236074== by 0x51FFD24: PetscStrncpy (str.c:392) > ==236074== by 0x51FEB03: PetscStrreplace (str.c:1142) > ==236074== by 0x52C9958: PetscViewerFileSetName (filev.c:659) > ==236074== by 0x52B743B: PetscViewerVTKOpen (vtkv.c:279) > ==236074== by 0x70C76E6: TSMonitorSolutionVTK (ts.c:5580) > ==236074== by 0x40313C: FormFunction (one.c:120) > ==236074== by 0x7066531: TSComputeIFunction_DMDA (dmdats.c:82) > ==236074== by 0x70BA5EF: TSComputeIFunction (ts.c:857) > ==236074== by 0x711E2DC: SNESTSFormFunction_BDF (bdf.c:368) > ==236074== by 0x70C6E46: SNESTSFormFunction (ts.c:5014) > ==236074== by 0x6FDC8A6: SNESComputeFunction (snes.c:2383) > ==236074== by 0x7023556: SNESSolve_NEWTONTR (tr.c:297) > ==236074== by 0x6FD160F: SNESSolve (snes.c:4569) > ==236074== by 0x711917E: PetscMemcpy (bdf.c:223) > ==236074== by 0x711917E: PetscCitationsRegister (petscsys.h:2689) > ==236074== by 0x711917E: TSStep_BDF.A (bdf.c:265) > ==236074== by 0x70C363A: TSStep (ts.c:3757) > ==236074== by 0x70C1999: TSSolve (ts.c:4154) > ==236074== by 0x402594: main (one.c:391) > ==236074== > ==236074== Conditional jump or move depends on uninitialised value(s) > ==236074== at 0x1E4782BA: __intel_ssse3_strncpy (in /opt/intel/compilers_and_libraries_2018.2.199/linux/compiler/lib/intel64_lin/libirc.so) > ==236074== by 0x51FFD24: PetscStrncpy (str.c:392) > ==236074== by 0x51FEB03: PetscStrreplace (str.c:1142) > ==236074== by 0x5224E4B: PetscFOpen (mpiuopen.c:52) > ==236074== by 0x63A074B: DMDAVTKWriteAll_VTS.A (grvtk.c:72) > ==236074== by 0x639A589: DMDAVTKWriteAll (grvtk.c:545) > ==236074== by 0x52B66F3: PetscViewerFlush_VTK (vtkv.c:100) > ==236074== by 0x52CFAAE: PetscViewerFlush (flush.c:26) > ==236074== by 0x52CEA95: PetscViewerDestroy (view.c:113) > ==236074== by 0x70C7717: TSMonitorSolutionVTK (ts.c:5582) > ==236074== by 0x40313C: FormFunction (one.c:120) > ==236074== by 0x7066531: 
TSComputeIFunction_DMDA (dmdats.c:82) > ==236074== by 0x70BA5EF: TSComputeIFunction (ts.c:857) > ==236074== by 0x711E2DC: SNESTSFormFunction_BDF (bdf.c:368) > ==236074== by 0x70C6E46: SNESTSFormFunction (ts.c:5014) > ==236074== by 0x6FDC8A6: SNESComputeFunction (snes.c:2383) > ==236074== by 0x7023556: SNESSolve_NEWTONTR (tr.c:297) > ==236074== by 0x6FD160F: SNESSolve (snes.c:4569) > ==236074== by 0x711917E: PetscMemcpy (bdf.c:223) > ==236074== by 0x711917E: PetscCitationsRegister (petscsys.h:2689) > ==236074== by 0x711917E: TSStep_BDF.A (bdf.c:265) > ==236074== by 0x70C363A: TSStep (ts.c:3757) > ==236074== > ==236074== Conditional jump or move depends on uninitialised value(s) > ==236074== at 0x5F10977: MatFDColoringSetUpBlocked_AIJ_Private (fdaij.c:146) > ==236074== by 0x5F10977: MatFDColoringSetUp_SeqXAIJ.A (fdaij.c:284) > ==236074== by 0x585892A: MatFDColoringSetUp (fdmatrix.c:242) > ==236074== by 0x6FE5037: SNESComputeJacobianDefaultColor (snesj2.c:79) > ==236074== by 0x6FC8E4E: SNESComputeJacobian (snes.c:2717) > ==236074== by 0x70236E7: SNESSolve_NEWTONTR (tr.c:324) > ==236074== by 0x6FD160F: SNESSolve (snes.c:4569) > ==236074== by 0x711917E: PetscMemcpy (bdf.c:223) > ==236074== by 0x711917E: PetscCitationsRegister (petscsys.h:2689) > ==236074== by 0x711917E: TSStep_BDF.A (bdf.c:265) > ==236074== by 0x70C363A: TSStep (ts.c:3757) > ==236074== by 0x70C1999: TSSolve (ts.c:4154) > ==236074== by 0x402594: main (one.c:391) > ==236074== > ==236074== Invalid write of size 4 > ==236074== at 0x5F10983: MatFDColoringSetUpBlocked_AIJ_Private (fdaij.c:150) > ==236074== by 0x5F10983: MatFDColoringSetUp_SeqXAIJ.A (fdaij.c:284) > ==236074== by 0x585892A: MatFDColoringSetUp (fdmatrix.c:242) > ==236074== by 0x6FE5037: SNESComputeJacobianDefaultColor (snesj2.c:79) > ==236074== by 0x6FC8E4E: SNESComputeJacobian (snes.c:2717) > ==236074== by 0x70236E7: SNESSolve_NEWTONTR (tr.c:324) > ==236074== by 0x6FD160F: SNESSolve (snes.c:4569) > ==236074== by 0x711917E: PetscMemcpy (bdf.c:223) > ==236074== by 0x711917E: PetscCitationsRegister (petscsys.h:2689) > ==236074== by 0x711917E: TSStep_BDF.A (bdf.c:265) > ==236074== by 0x70C363A: TSStep (ts.c:3757) > ==236074== by 0x70C1999: TSSolve (ts.c:4154) > ==236074== by 0x402594: main (one.c:391) > ==236074== Address 0x3a94fa80 is 0 bytes after a block of size 73,960,000 alloc'd > ==236074== at 0x4C2C480: memalign (vg_replace_malloc.c:909) > ==236074== by 0x522FFE2: PetscMallocAlign (mal.c:52) > ==236074== by 0x52305F9: PetscMallocA (mal.c:418) > ==236074== by 0x5F10778: MatFDColoringSetUpBlocked_AIJ_Private (fdaij.c:125) > ==236074== by 0x5F10778: MatFDColoringSetUp_SeqXAIJ.A (fdaij.c:284) > ==236074== by 0x585892A: MatFDColoringSetUp (fdmatrix.c:242) > ==236074== by 0x6FE5037: SNESComputeJacobianDefaultColor (snesj2.c:79) > ==236074== by 0x6FC8E4E: SNESComputeJacobian (snes.c:2717) > ==236074== by 0x70236E7: SNESSolve_NEWTONTR (tr.c:324) > ==236074== by 0x6FD160F: SNESSolve (snes.c:4569) > ==236074== by 0x711917E: PetscMemcpy (bdf.c:223) > ==236074== by 0x711917E: PetscCitationsRegister (petscsys.h:2689) > ==236074== by 0x711917E: TSStep_BDF.A (bdf.c:265) > ==236074== by 0x70C363A: TSStep (ts.c:3757) > ==236074== by 0x70C1999: TSSolve (ts.c:4154) > ==236074== by 0x402594: main (one.c:391) > ==236074== > ==236074== Invalid write of size 8 > ==236074== at 0x5F10991: MatFDColoringSetUpBlocked_AIJ_Private (fdaij.c:151) > ==236074== by 0x5F10991: MatFDColoringSetUp_SeqXAIJ.A (fdaij.c:284) > ==236074== by 0x585892A: MatFDColoringSetUp (fdmatrix.c:242) > ==236074== by 
0x6FE5037: SNESComputeJacobianDefaultColor (snesj2.c:79) > ==236074== by 0x6FC8E4E: SNESComputeJacobian (snes.c:2717) > ==236074== by 0x70236E7: SNESSolve_NEWTONTR (tr.c:324) > ==236074== by 0x6FD160F: SNESSolve (snes.c:4569) > ==236074== by 0x711917E: PetscMemcpy (bdf.c:223) > ==236074== by 0x711917E: PetscCitationsRegister (petscsys.h:2689) > ==236074== by 0x711917E: TSStep_BDF.A (bdf.c:265) > ==236074== by 0x70C363A: TSStep (ts.c:3757) > ==236074== by 0x70C1999: TSSolve (ts.c:4154) > ==236074== by 0x402594: main (one.c:391) > ==236074== Address 0x3a94fa88 is 8 bytes after a block of size 73,960,000 alloc'd > ==236074== at 0x4C2C480: memalign (vg_replace_malloc.c:909) > ==236074== by 0x522FFE2: PetscMallocAlign (mal.c:52) > ==236074== by 0x52305F9: PetscMallocA (mal.c:418) > ==236074== by 0x5F10778: MatFDColoringSetUpBlocked_AIJ_Private (fdaij.c:125) > ==236074== by 0x5F10778: MatFDColoringSetUp_SeqXAIJ.A (fdaij.c:284) > ==236074== by 0x585892A: MatFDColoringSetUp (fdmatrix.c:242) > ==236074== by 0x6FE5037: SNESComputeJacobianDefaultColor (snesj2.c:79) > ==236074== by 0x6FC8E4E: SNESComputeJacobian (snes.c:2717) > ==236074== by 0x70236E7: SNESSolve_NEWTONTR (tr.c:324) > ==236074== by 0x6FD160F: SNESSolve (snes.c:4569) > ==236074== by 0x711917E: PetscMemcpy (bdf.c:223) > ==236074== by 0x711917E: PetscCitationsRegister (petscsys.h:2689) > ==236074== by 0x711917E: TSStep_BDF.A (bdf.c:265) > ==236074== by 0x70C363A: TSStep (ts.c:3757) > ==236074== by 0x70C1999: TSSolve (ts.c:4154) > ==236074== by 0x402594: main (one.c:391) > ==236074== > > > Sent from Mail for Windows -------------- next part -------------- An HTML attachment was scrubbed... URL: From bsmith at petsc.dev Fri Oct 21 10:00:18 2022 From: bsmith at petsc.dev (Barry Smith) Date: Fri, 21 Oct 2022 11:00:18 -0400 Subject: [petsc-users] Does flagging a matrix as symmetric improving performances? In-Reply-To: References: Message-ID: <129499FF-E670-4C2F-A95E-56022A1B493B@petsc.dev> For most solvers, just setting this flag will not, by itself, improve the setup time. An exception to this is PCGAMG in the latest release, where it will improve the solution time. For some preconditioner situations you can use MATSBAIJ to store the matrix; this can help noticeably because only half of the matrix is stored and computed with. You can select KSP methods that specifically work only for symmetric operators (or symmetric positive definite operators) to get faster solve times, but you need to do this explicitly, it is not done automatically. If you know the matrix is symmetric there is likely never a reason to not set the flag. Note also the flag MAT_SYMMETRY_ETERNAL that can be used in conjunction with MAT_SYMMETRIC if you know the matrix will remain symmetric despite changes made to its numerical values. Barry > On Oct 21, 2022, at 10:52 AM, Edoardo alinovi wrote: > > Hi PETSc friends, > > As per object, do you think that flagging a matrix as symmetric might improve setup times of the preconditioner? > > Thank you as always. From edoardo.alinovi at gmail.com Fri Oct 21 10:36:16 2022 From: edoardo.alinovi at gmail.com (Edoardo alinovi) Date: Fri, 21 Oct 2022 16:36:16 +0100 Subject: [petsc-users] Does flagging a matrix as symmetric improving performances? In-Reply-To: <129499FF-E670-4C2F-A95E-56022A1B493B@petsc.dev> References: <129499FF-E670-4C2F-A95E-56022A1B493B@petsc.dev> Message-ID: Hi Barry, Thank you for the hints. Actually, I am using MPIAIJ, does this mean I need to change the matrix format?
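For reference, setting the flags Barry describes is just a couple of calls on the assembled matrix; a minimal sketch (the Mat A here is a placeholder for the application's own assembled matrix, not code from this thread):

Mat A;   /* the application's assembled matrix (e.g. MPIAIJ) */

/* Tell PETSc the matrix is symmetric so solvers and preconditioners that can
   exploit symmetry are allowed to do so.                                     */
PetscCall(MatSetOption(A, MAT_SYMMETRIC, PETSC_TRUE));

/* Optionally promise that symmetry survives later changes to the numerical
   values, so the property never has to be re-established.                    */
PetscCall(MatSetOption(A, MAT_SYMMETRY_ETERNAL, PETSC_TRUE));

A symmetric-only Krylov method still has to be requested explicitly, e.g. with -ksp_type cg; the flag by itself does not change the solver.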
Most of the time I am using CG + boomerAMG, do you think that using a sym matrix can be any good for the performance? Cheers -------------- next part -------------- An HTML attachment was scrubbed... URL: From bsmith at petsc.dev Fri Oct 21 11:13:18 2022 From: bsmith at petsc.dev (Barry Smith) Date: Fri, 21 Oct 2022 12:13:18 -0400 Subject: [petsc-users] Does flagging a matrix as symmetric improving performances? In-Reply-To: References: <129499FF-E670-4C2F-A95E-56022A1B493B@petsc.dev> Message-ID: Set the flag, it cannot harm. I cannot see that hypre boomerAMG has options to indicate matrix symmetry to take advantage of it. Stefano or Mark would know better than I. No, do not change MPIAIJ it offers the most preconditioners. Barry > On Oct 21, 2022, at 11:36 AM, Edoardo alinovi wrote: > > Hi Berry, > > Thank you for the hints. Actually, I am using MPIAIJ, does this mean I need to change the matrix format? > > Most of the time I am using CG + boomerAMG, do you think that using a sym matrix can be any good for the performance? > > Cheers > > From pierre at joliv.et Fri Oct 21 13:25:38 2022 From: pierre at joliv.et (Pierre Jolivet) Date: Fri, 21 Oct 2022 20:25:38 +0200 Subject: [petsc-users] Trouble with ISEmbed In-Reply-To: References: Message-ID: > On 21 Oct 2022, at 10:46 AM, TARDIEU Nicolas wrote: > > Dear Pierre, > > To complete my last post, in fact, the initial code (playing with the LGMap) was correct. It was my test case that was wrong. > Once fixed according to your suggestion, everything turns out to be OK. > I am nevertheless wondering if this IS embeding according to the global numbering should not be a native PTESc's feature ? I?m not sure about what you are asking for here. I believe the most efficient way to deal with your problem is to use a MatNest instead of a MatAIJ. Then, there are several ways to extract submatrices of the global MatNest without having to deal with the embedding. Thanks, Pierre > Thank you again, > Nicolas > -- > Nicolas Tardieu > Ing PhD Computational Mechanics > EDF - R&D Dpt ERMES > PARIS-SACLAY, FRANCE > De : TARDIEU Nicolas > Envoy? : jeudi 20 octobre 2022 11:45 > ? : pierre at joliv.et > Cc : petsc-users at mcs.anl.gov > Objet : RE: [petsc-users] Trouble with ISEmbed > > Dear Pierre, > > You fixed the problem! Thank you warmly for your precious help. > > Best regards, > Nicolas > -- > Nicolas Tardieu > Ing PhD Computational Mechanics > EDF - R&D Dpt ERMES > PARIS-SACLAY, FRANCE > De : pierre at joliv.et > Envoy? : mercredi 19 octobre 2022 22:22 > ? : TARDIEU Nicolas > Cc : petsc-users at mcs.anl.gov > Objet : Re: [petsc-users] Trouble with ISEmbed > > Sorry, I?m not very proficient in petsc4py, and there are a bunch of interfaces missing, e.g., ISShift(), so it may not be optimal, but I hope you?ll understand. > First, you?ll need to regenerate the .bin by uncommenting the proper part of the code. > That is because you were initially generating a 20x20 matrix, with 4 fields per unknown. > That?s 5 unknowns, and so, with two processes, 10 rows per process is not consistent as 10/4 is not an integer ? I don?t know how to force, in petsc4py, the local size to 12 on process #0 and 8 on process #1. > The modified code generates a 16x16 matrices so it remains consistent. > If you then run the first part of the program, you?ll get both B_uu and B_pp from B instead of A, with one, two, or four processes. 
> Again, that should work for arbitrary number of processes, you just need to be careful that your local dimensions are consistent with the number of fields. > > Thanks, > Pierre > > > >> On 19 Oct 2022, at 5:01 PM, Pierre Jolivet > wrote: >> >> >> >>> On 19 Oct 2022, at 4:32 PM, TARDIEU Nicolas > wrote: >>> >>> Dear Pierre, >>> >>> Thank you very much for your answer. I have the same explanation as you for the code I sent. >>> But what I would like to do is the following : I have the full matrix A with fields u, p and t (which are interlaced in the real application). I want to extract B=A(u+p, u+p). *Then* I would like to extract the (u, u) block from B - let us call it B_uu. >>> In fact, B_uu=A_uu but I really need to do the extraction from B. >>> And I am missing something since I have to play with different numberings when switching the IS from A to B. >>> >>> Is it clear enough ???? >> >> That?s cristal clear. >> If the fields are interlaced, that?s actually easier to do, because you preserve the distribution, and there is less data movement. >> I?ll try to fix your code in the case where the fields are interlaced if now one gives you another answer in the meantime. >> >> Thanks, >> Pierre >> >>> Regards, >>> Nicolas >>> -- >>> Nicolas Tardieu >>> Ing PhD Computational Mechanics >>> EDF - R&D Dpt ERMES >>> PARIS-SACLAY, FRANCE >>> De : pierre at joliv.et > >>> Envoy? : mercredi 19 octobre 2022 14:51 >>> ? : TARDIEU Nicolas > >>> Cc : petsc-users at mcs.anl.gov > >>> Objet : Re: [petsc-users] Trouble with ISEmbed >>> >>> On two processes, you have a different distribution for u and u+p. >>> IS Object: 2 MPI processes >>> type: general >>> [0] Number of indices in set 5 >>> [0] 0 0 >>> [0] 1 1 >>> [0] 2 2 >>> [0] 3 3 >>> [0] 4 4 >>> [1] Number of indices in set 5 >>> [1] 0 5 >>> [1] 1 6 >>> [1] 2 7 >>> [1] 3 8 >>> [1] 4 9 >>> IS Object: 2 MPI processes >>> type: general >>> [0] Number of indices in set 8 >>> [0] 0 0 >>> [0] 1 1 >>> [0] 2 2 >>> [0] 3 3 >>> [0] 4 4 >>> [0] 5 5 >>> [0] 6 6 >>> [0] 7 7 >>> [1] Number of indices in set 7 >>> [1] 0 8 >>> [1] 1 9 >>> [1] 2 10 >>> [1] 3 11 >>> [1] 4 12 >>> [1] 5 13 >>> [1] 6 14 >>> ISEmbed() only works on local indices, so when you embed u into u+p, on the second process, you miss the row/column indices 5, 6, 7 of B = A(u+p, u+p). >>> Thus, you end up with a matrix of dimension size(u) - 3 = 10 - 3 = 7, with just the row/column indices 8 and 9 being selected by the second process. >>> What is it that you want to do exactly? Play with ISEmbed(), or get A(u, u) without using A but B instead? >>> >>> Thanks, >>> Pierre >>> >>> > On 19 Oct 2022, at 12:00 PM, TARDIEU Nicolas via petsc-users > wrote: >>> > >>> > Dear PETSc Team, >>> > >>> > I am trying to use IS embeding in parallel. >>> > In order to (try to) understand how it works, I have built a simple example, attached to this email. >>> > >>> > I consider a 20X20 matrix. The dof (u, p, t) in global numbering are the following : >>> > u: 0..9 p: 10..14 t: 15..19 >>> > >>> > I have defined 4 IS to describe the dof u, p, t and the agglomeration of u and p, called up. >>> > I first extract the submatrix matrix(up,up), then I would like to extract from it the (u,u) block. >>> > >>> > The example runs OK in sequential but I do not obtain the (u,u) block on 2 processes. >>> > >>> > I have a mistake in the build of the sub-IS but I cannot find it for days. 
>>> >
>>> > Best regards,
>>> > Nicolas
>>> > --
>>> > Nicolas Tardieu
>>> > Ing PhD Computational Mechanics
>>> > EDF - R&D Dpt ERMES
>>> > PARIS-SACLAY, FRANCE
> > > Ce message et toutes les pi?ces jointes (ci-apr?s le 'Message') sont ?tablis ? l'intention exclusive des destinataires et les informations qui y figurent sont strictement confidentielles. Toute utilisation de ce Message non conforme ? sa destination, toute diffusion ou toute publication totale ou partielle, est interdite sauf autorisation expresse. > Si vous n'?tes pas le destinataire de ce Message, il vous est interdit de le copier, de le faire suivre, de le divulguer ou d'en utiliser tout ou partie. Si vous avez re?u ce Message par erreur, merci de le supprimer de votre syst?me, ainsi que toutes ses copies, et de n'en garder aucune trace sur quelque support que ce soit. Nous vous remercions ?galement d'en avertir imm?diatement l'exp?diteur par retour du message. > Il est impossible de garantir que les communications par messagerie ?lectronique arrivent en temps utile, sont s?curis?es ou d?nu?es de toute erreur ou virus. > ____________________________________________________ > This message and any attachments (the 'Message') are intended solely for the addressees. The information contained in this Message is confidential. Any use of information contained in this Message not in accord with its purpose, any dissemination or disclosure, either whole or partial, is prohibited except formal approval. > If you are not the addressee, you may not copy, forward, disclose or use any part of it. If you have received this message in error, please delete it and all copies from your system and notify the sender immediately by return message. > E-mail communication cannot be guaranteed to be timely secure, error or virus-free. -------------- next part -------------- An HTML attachment was scrubbed... URL: From zhaog6 at lsec.cc.ac.cn Sat Oct 22 23:48:20 2022 From: zhaog6 at lsec.cc.ac.cn (=?UTF-8?B?6LW15Yia?=) Date: Sun, 23 Oct 2022 12:48:20 +0800 (GMT+08:00) Subject: [petsc-users] Calling SuperLU_MT in PETSc Message-ID: <4a49ce75.1c41a.184032b9ad8.Coremail.zhaog6@lsec.cc.ac.cn> Dear PETSc team, I want to call the multithreading sparse direct solver SuperLU_MT in PETSc, Could I download by "--download-superlu-mt"? Or what is a good way to support calling SuperLU_MT interface in PETSc? Thank you. Best Regards, Gang From zhaog6 at lsec.cc.ac.cn Sun Oct 23 01:58:13 2022 From: zhaog6 at lsec.cc.ac.cn (=?UTF-8?B?6LW15Yia?=) Date: Sun, 23 Oct 2022 14:58:13 +0800 (GMT+08:00) Subject: [petsc-users] An issue of extraction of factorization matrices in sparse direct solver Message-ID: <7cdb54af.1c628.18403a286ea.Coremail.zhaog6@lsec.cc.ac.cn> Dear developers, I have another question. How can I get the L and U matrices and store them in a file when I call SuperLU through PETSc? Thanks. Best Regards, Gang From knepley at gmail.com Sun Oct 23 05:34:47 2022 From: knepley at gmail.com (Matthew Knepley) Date: Sun, 23 Oct 2022 06:34:47 -0400 Subject: [petsc-users] Calling SuperLU_MT in PETSc In-Reply-To: <4a49ce75.1c41a.184032b9ad8.Coremail.zhaog6@lsec.cc.ac.cn> References: <4a49ce75.1c41a.184032b9ad8.Coremail.zhaog6@lsec.cc.ac.cn> Message-ID: On Sun, Oct 23, 2022 at 12:48 AM ?? wrote: > Dear PETSc team, > > I want to call the multithreading sparse direct solver SuperLU_MT in > PETSc, Could I download by "--download-superlu-mt"? Or what is a good way > to support calling SuperLU_MT interface in PETSc? Thank you. Is it a separate package, or only configure arguments to SuperLU? 
Thanks, Matt > > Best Regards, > Gang -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Sun Oct 23 05:38:27 2022 From: knepley at gmail.com (Matthew Knepley) Date: Sun, 23 Oct 2022 06:38:27 -0400 Subject: [petsc-users] An issue of extraction of factorization matrices in sparse direct solver In-Reply-To: <7cdb54af.1c628.18403a286ea.Coremail.zhaog6@lsec.cc.ac.cn> References: <7cdb54af.1c628.18403a286ea.Coremail.zhaog6@lsec.cc.ac.cn> Message-ID: On Sun, Oct 23, 2022 at 2:58 AM ?? wrote: > Dear developers, > > I have another question. How can I get the L and U matrices and store them > in a file when I call SuperLU through PETSc? Thanks. SuperLU stores these matrices in its own format. If you want to do I/O with them, you would probably have to extract them from the Petsc Mat and call SuperLU I/O functions, if they exist. Thanks, Matt > Best Regards, > Gang -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From pierre at joliv.et Sun Oct 23 05:55:12 2022 From: pierre at joliv.et (Pierre Jolivet) Date: Sun, 23 Oct 2022 12:55:12 +0200 Subject: [petsc-users] Calling SuperLU_MT in PETSc In-Reply-To: References: <4a49ce75.1c41a.184032b9ad8.Coremail.zhaog6@lsec.cc.ac.cn> Message-ID: > On 23 Oct 2022, at 12:34 PM, Matthew Knepley wrote: > > On Sun, Oct 23, 2022 at 12:48 AM ?? > wrote: > Dear PETSc team, > > I want to call the multithreading sparse direct solver SuperLU_MT in PETSc, Could I download by "--download-superlu-mt"? Or what is a good way to support calling SuperLU_MT interface in PETSc? Thank you. > > Is it a separate package, Yes, one would need to add it: https://github.com/xiaoyeli/superlu_mt Gang, it would be a good exercise to make your first contribution to PETSc if you feel like it. You should be able to copy/paste most of what is in config/BuildSystem/config/packages/SuperLU.py into a new file config/BuildSystem/config/packages/SuperLU_MT.py Thanks, Pierre > or only configure arguments to SuperLU? > > Thanks, > > Matt > > > Best Regards, > Gang > > > -- > What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. > -- Norbert Wiener > > https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From zhaog6 at lsec.cc.ac.cn Sun Oct 23 09:31:20 2022 From: zhaog6 at lsec.cc.ac.cn (=?UTF-8?B?6LW15Yia?=) Date: Sun, 23 Oct 2022 22:31:20 +0800 (GMT+08:00) Subject: [petsc-users] Calling SuperLU_MT in PETSc In-Reply-To: References: <4a49ce75.1c41a.184032b9ad8.Coremail.zhaog6@lsec.cc.ac.cn> Message-ID: <3002f83d.1f30.18405415c1f.Coremail.zhaog6@lsec.cc.ac.cn> Thank you, Pierre, I'll try to add it. Best Regards, Gang -----????----- ???:"Pierre Jolivet" ????:2022-10-23 18:55:12 (???) ???: "Matthew Knepley" ??: "??" , petsc-users at mcs.anl.gov ??: Re: [petsc-users] Calling SuperLU_MT in PETSc On 23 Oct 2022, at 12:34 PM, Matthew Knepley wrote: On Sun, Oct 23, 2022 at 12:48 AM ?? 
wrote: Dear PETSc team, I want to call the multithreading sparse direct solver SuperLU_MT in PETSc, Could I download by "--download-superlu-mt"? Or what is a good way to support calling SuperLU_MT interface in PETSc? Thank you. Is it a separate package, Yes, one would need to add it: https://github.com/xiaoyeli/superlu_mt Gang, it would be a good exercise to make your first contribution to PETSc if you feel like it. You should be able to copy/paste most of what is in config/BuildSystem/config/packages/SuperLU.py into a new file config/BuildSystem/config/packages/SuperLU_MT.py Thanks, Pierre or only configure arguments to SuperLU? Thanks, Matt Best Regards, Gang -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From zhaog6 at lsec.cc.ac.cn Sun Oct 23 09:31:40 2022 From: zhaog6 at lsec.cc.ac.cn (=?UTF-8?B?6LW15Yia?=) Date: Sun, 23 Oct 2022 22:31:40 +0800 (GMT+08:00) Subject: [petsc-users] Calling SuperLU_MT in PETSc In-Reply-To: References: <4a49ce75.1c41a.184032b9ad8.Coremail.zhaog6@lsec.cc.ac.cn> Message-ID: <12996b03.1f3c.1840541a919.Coremail.zhaog6@lsec.cc.ac.cn> Yes, it's a separate multithreading-based solver package. I noticed that PETSc support SuperLU (sequential package) and SuperLU_DIST through the option "--download-superlu" and "--download-superlu_dist", but SuperLU_MT doesn't seem to exist. I'm gonna try to add it according to Pierre's suggestions and push into PETSc. Thanks, Matthew. Best Regards, Gang -----????----- ???:"Matthew Knepley" ????:2022-10-23 18:34:47 (???) ???: "??" ??: petsc-users at mcs.anl.gov ??: Re: [petsc-users] Calling SuperLU_MT in PETSc On Sun, Oct 23, 2022 at 12:48 AM ?? wrote: Dear PETSc team, I want to call the multithreading sparse direct solver SuperLU_MT in PETSc, Could I download by "--download-superlu-mt"? Or what is a good way to support calling SuperLU_MT interface in PETSc? Thank you. Is it a separate package, or only configure arguments to SuperLU? Thanks, Matt Best Regards, Gang -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From edoardo.alinovi at gmail.com Mon Oct 24 10:37:32 2022 From: edoardo.alinovi at gmail.com (Edoardo alinovi) Date: Mon, 24 Oct 2022 16:37:32 +0100 Subject: [petsc-users] Clarification on MatMPIBAIJSetPreallocation (d_nnz and o_nnz) Message-ID: Hello Barry, I am doing some preliminary work to add a coupled solver in my code. I am reading some documentation on MPIBAIJ as the matrix will be composed of blocks of size 3x3 in 2D and 4x4 in 3D for each cell in the domain. I can't quite understand how to set d_nnz and o_nnz. What is their size? Should I provide a number of non-zero entries for each block or should I do it line by line as in MATAIJ? Would you be able to provide me with a silly example? Thank you! -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From chenju at utexas.edu Mon Oct 24 09:19:34 2022 From: chenju at utexas.edu (Jau-Uei Chen) Date: Mon, 24 Oct 2022 09:19:34 -0500 Subject: [petsc-users] A question about solving a saddle point system with a direct solver Message-ID: To whom it may concern, I am writing to ask about using PETSc with a direct solver to solve a linear system where a single zero-value eigenvalue exists. Currently, I am working on developing a finite-element solver for a linearized incompressible MHD equation. The code is based on an open-source library called MFEM which has its own wrapper for PETSc and is used in my code. From analysis, I already know that the linear system (Ax=b) to be solved is a saddle point system. By using the flags "solver_pc_type svd" and "solver_pc_svd_monitor", I indeed observe it. Here is an example of an output: SVD: condition number 3.271390119581e+18, 1 of 66 singular values are (nearly) zero SVD: smallest singular values: 3.236925932523e-17 3.108788619412e-04 3.840514506502e-04 4.599292003910e-04 4.909419974671e-04 SVD: largest singular values : 4.007319935079e+00 4.027759008411e+00 4.817755760754e+00 4.176127583956e+01 1.058924751347e+02 However, What surprises me is that the numerical solutions are still relatively accurate by comparing to the exact ones (i.e. manufactured solutions) when I perform convergence tests even if I am using a direct solver (i.e. -solver_ksp_type preonly -solver_pc_type lu -solver_pc_factor_mat_solver_type mumps). My question is: Why the direct solver won't break down in this context? I understand that it won't be an issue for iterative solvers such as GMRES [1][2] but not really sure why it won't cause trouble in direct solvers. Any comments or suggestions are greatly appreciated. Best Regards, Jau-Uei Chen Reference: [1] Benzi, Michele, et al. ?Numerical Solution of Saddle Point Problems.? Acta Numerica, vol. 14, May 2005, pp. 1?137. DOI.org (Crossref), https://doi.org/10.1017/S0962492904000212. [2] Elman, Howard C., et al. Finite Elements and Fast Iterative Solvers: With Applications in Incompressible Fluid Dynamics. Second edition, Oxford University Press, 2014. -------------- next part -------------- An HTML attachment was scrubbed... URL: From jed at jedbrown.org Mon Oct 24 10:41:54 2022 From: jed at jedbrown.org (Jed Brown) Date: Mon, 24 Oct 2022 09:41:54 -0600 Subject: [petsc-users] Clarification on MatMPIBAIJSetPreallocation (d_nnz and o_nnz) In-Reply-To: References: Message-ID: <87a65ljkjh.fsf@jedbrown.org> I recommend calling this one preallocation function, which will preallocate scalar and block formats. It takes one value per block row, counting in blocks. https://petsc.org/release/docs/manualpages/Mat/MatXAIJSetPreallocation/ Edoardo alinovi writes: > Hello Barry, > > I am doing some preliminary work to add a coupled solver in my code. I am > reading some documentation on MPIBAIJ as the matrix will be composed of > blocks of size 3x3 in 2D and 4x4 in 3D for each cell in the domain. > > I can't quite understand how to set d_nnz and o_nnz. What is their size? > Should I provide a number of non-zero entries for each block or should I do > it line by line as in MATAIJ? > > Would you be able to provide me with a silly example? > > Thank you! 
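As a concrete illustration of the suggestion above, here is a sketch of what that call could look like for a cell-centred coupled solver such as the one described; ncells_loc, nneigh_owned[] and nneigh_ghost[] are placeholder names for quantities the application is assumed to know already, not identifiers from the original message:

#include <petscmat.h>

/* Preallocate a blocked matrix with one bs x bs block per cell-cell coupling.
   nneigh_owned[c] / nneigh_ghost[c] hold, for each locally owned cell c, the
   number of neighbouring cells stored on this rank / on other ranks.          */
static PetscErrorCode PreallocateCoupled(Mat A, PetscInt bs, PetscInt ncells_loc,
                                         const PetscInt nneigh_owned[],
                                         const PetscInt nneigh_ghost[])
{
  PetscInt *dnnz, *onnz;

  PetscFunctionBeginUser;
  PetscCall(PetscMalloc2(ncells_loc, &dnnz, ncells_loc, &onnz));
  for (PetscInt c = 0; c < ncells_loc; c++) {
    dnnz[c] = 1 + nneigh_owned[c]; /* diagonal block + neighbours on this rank */
    onnz[c] = nneigh_ghost[c];     /* neighbours owned by other ranks          */
  }
  /* One value per block row, counted in blocks (bs = 3 in 2D, 4 in 3D here).  */
  PetscCall(MatXAIJSetPreallocation(A, bs, dnnz, onnz, NULL, NULL));
  PetscCall(PetscFree2(dnnz, onnz));
  PetscFunctionReturn(0);
}

Because the same call covers AIJ, BAIJ and SBAIJ, the matrix type can later be switched (for example with -mat_type) without touching the preallocation code.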
From jed at jedbrown.org Mon Oct 24 10:47:04 2022 From: jed at jedbrown.org (Jed Brown) Date: Mon, 24 Oct 2022 09:47:04 -0600 Subject: [petsc-users] A question about solving a saddle point system with a direct solver In-Reply-To: References: Message-ID: <875yg9jkav.fsf@jedbrown.org> You can get lucky with null spaces even with factorization preconditioners, especially if the right hand side is orthogonal to the null space. But it's fragile and you shouldn't rely on that being true as you change the problem. You can either remove the null space in your problem formulation (maybe) or use iterative solvers/fieldsplit preconditioning (which can use a direct solver on nonsingular blocks). Jau-Uei Chen writes: > To whom it may concern, > > I am writing to ask about using PETSc with a direct solver to solve a > linear system where a single zero-value eigenvalue exists. > > Currently, I am working on developing a finite-element solver for a > linearized incompressible MHD equation. The code is based on an open-source > library called MFEM which has its own wrapper for PETSc and is used in my > code. From analysis, I already know that the linear system (Ax=b) to be > solved is a saddle point system. By using the flags "solver_pc_type svd" > and "solver_pc_svd_monitor", I indeed observe it. Here is an example of an > output: > > SVD: condition number 3.271390119581e+18, 1 of 66 singular values are > (nearly) zero > SVD: smallest singular values: 3.236925932523e-17 3.108788619412e-04 > 3.840514506502e-04 4.599292003910e-04 4.909419974671e-04 > SVD: largest singular values : 4.007319935079e+00 4.027759008411e+00 > 4.817755760754e+00 4.176127583956e+01 1.058924751347e+02 > > > However, What surprises me is that the numerical solutions are still > relatively accurate by comparing to the exact ones (i.e. manufactured > solutions) when I perform convergence tests even if I am using a direct > solver (i.e. -solver_ksp_type preonly -solver_pc_type lu > -solver_pc_factor_mat_solver_type > mumps). My question is: Why the direct solver won't break down in this > context? I understand that it won't be an issue for iterative solvers such > as GMRES [1][2] but not really sure why it won't cause trouble in direct > solvers. > > Any comments or suggestions are greatly appreciated. > > Best Regards, > Jau-Uei Chen > > Reference: > [1] Benzi, Michele, et al. ?Numerical Solution of Saddle Point Problems.? > Acta Numerica, vol. 14, May 2005, pp. 1?137. DOI.org (Crossref), > https://doi.org/10.1017/S0962492904000212. > [2] Elman, Howard C., et al. Finite Elements and Fast Iterative Solvers: > With Applications in Incompressible Fluid Dynamics. Second edition, Oxford > University Press, 2014. From edoardo.alinovi at gmail.com Mon Oct 24 10:56:25 2022 From: edoardo.alinovi at gmail.com (Edoardo alinovi) Date: Mon, 24 Oct 2022 16:56:25 +0100 Subject: [petsc-users] Clarification on MatMPIBAIJSetPreallocation (d_nnz and o_nnz) In-Reply-To: <87a65ljkjh.fsf@jedbrown.org> References: <87a65ljkjh.fsf@jedbrown.org> Message-ID: Thank you Jed for the hint. So just to understand it with an example. Say I have this matrix here, which has 4 3x3 blocks 1 2 0 | 0 5 0 | 0 2 3 | 0 0 1 | <---- Proc 1 0 0 1 | 0 2 2 | --------|--------| 1 2 0 | 0 5 0 | 0 2 0 | 0 0 1 | <---- Proc 2 0 0 1 | 0 0 2 | -------|---------| This can be represented as a collection of submatrices like: A B C D A and D are the diagonal blocks, while B and C are the off-diagonal ones. How should I set d_nnz and o_nnz in this case? 
-------------- next part -------------- An HTML attachment was scrubbed... URL: From jed at jedbrown.org Mon Oct 24 11:02:33 2022 From: jed at jedbrown.org (Jed Brown) Date: Mon, 24 Oct 2022 10:02:33 -0600 Subject: [petsc-users] Clarification on MatMPIBAIJSetPreallocation (d_nnz and o_nnz) In-Reply-To: References: <87a65ljkjh.fsf@jedbrown.org> Message-ID: <8735bdjjl2.fsf@jedbrown.org> This looks like one block row per process? (BAIJ formats store explicit zeros that appear within nonzero blocks.) You'd use d_nnz[] = {1}, o_nnz[] = {1} on each process. If each of the dummy numbers there was replaced by a nonzero block (so the diagram would be sketching nonzero 3x3 blocks of an 18x18 matrix), then you'd have bs=3 with: rank 0: d_nnz[] = {2,2,1}, o_nnz[] = {1,1,2}; rank 1: d_nnz[] = {1,1,1}, o_nnz[] = {2,1,1}; Edoardo alinovi writes: > Thank you Jed for the hint. > > So just to understand it with an example. Say I have this matrix here, > which has 4 3x3 blocks > > 1 2 0 | 0 5 0 | > 0 2 3 | 0 0 1 | <---- Proc 1 > 0 0 1 | 0 2 2 | > --------|--------| > 1 2 0 | 0 5 0 | > 0 2 0 | 0 0 1 | <---- Proc 2 > 0 0 1 | 0 0 2 | > -------|---------| > > This can be represented as a collection of submatrices like: > A B > C D > > A and D are the diagonal blocks, while B and C are the off-diagonal ones. > How should I set d_nnz and o_nnz in this case? From chenju at utexas.edu Mon Oct 24 15:37:01 2022 From: chenju at utexas.edu (Jau-Uei Chen) Date: Mon, 24 Oct 2022 15:37:01 -0500 Subject: [petsc-users] A question about solving a saddle point system with a direct solver In-Reply-To: <875yg9jkav.fsf@jedbrown.org> References: <875yg9jkav.fsf@jedbrown.org> Message-ID: I see. Thanks so much for the comment. On Mon, Oct 24, 2022 at 10:47 AM Jed Brown wrote: > You can get lucky with null spaces even with factorization > preconditioners, especially if the right hand side is orthogonal to the > null space. But it's fragile and you shouldn't rely on that being true as > you change the problem. You can either remove the null space in your > problem formulation (maybe) or use iterative solvers/fieldsplit > preconditioning (which can use a direct solver on nonsingular blocks). > > Jau-Uei Chen writes: > > > To whom it may concern, > > > > I am writing to ask about using PETSc with a direct solver to solve a > > linear system where a single zero-value eigenvalue exists. > > > > Currently, I am working on developing a finite-element solver for a > > linearized incompressible MHD equation. The code is based on an > open-source > > library called MFEM which has its own wrapper for PETSc and is used in my > > code. From analysis, I already know that the linear system (Ax=b) to be > > solved is a saddle point system. By using the flags "solver_pc_type svd" > > and "solver_pc_svd_monitor", I indeed observe it. Here is an example of > an > > output: > > > > SVD: condition number 3.271390119581e+18, 1 of 66 singular values are > > (nearly) zero > > SVD: smallest singular values: 3.236925932523e-17 3.108788619412e-04 > > 3.840514506502e-04 4.599292003910e-04 4.909419974671e-04 > > SVD: largest singular values : 4.007319935079e+00 4.027759008411e+00 > > 4.817755760754e+00 4.176127583956e+01 1.058924751347e+02 > > > > > > However, What surprises me is that the numerical solutions are still > > relatively accurate by comparing to the exact ones (i.e. manufactured > > solutions) when I perform convergence tests even if I am using a direct > > solver (i.e. 
-solver_ksp_type preonly -solver_pc_type lu > > -solver_pc_factor_mat_solver_type > > mumps). My question is: Why the direct solver won't break down in this > > context? I understand that it won't be an issue for iterative solvers > such > > as GMRES [1][2] but not really sure why it won't cause trouble in direct > > solvers. > > > > Any comments or suggestions are greatly appreciated. > > > > Best Regards, > > Jau-Uei Chen > > > > Reference: > > [1] Benzi, Michele, et al. ?Numerical Solution of Saddle Point Problems.? > > Acta Numerica, vol. 14, May 2005, pp. 1?137. DOI.org (Crossref), > > https://doi.org/10.1017/S0962492904000212. > > [2] Elman, Howard C., et al. Finite Elements and Fast Iterative Solvers: > > With Applications in Incompressible Fluid Dynamics. Second edition, > Oxford > > University Press, 2014. > -------------- next part -------------- An HTML attachment was scrubbed... URL: From xsli at lbl.gov Mon Oct 24 16:47:15 2022 From: xsli at lbl.gov (Xiaoye S. Li) Date: Mon, 24 Oct 2022 14:47:15 -0700 Subject: [petsc-users] An issue of extraction of factorization matrices in sparse direct solver In-Reply-To: References: <7cdb54af.1c628.18403a286ea.Coremail.zhaog6@lsec.cc.ac.cn> Message-ID: There are some utility routines for printing L\U in superlu_dist: SRC/dutil_dist.c 1. You can output the L factor to a file with the triplet format by using https://github.com/xiaoyeli/superlu_dist/blob/324d65fced6ce8abf0eb900223cba0207d538db7/SRC/dutil_dist.c#L675 but use line 755 instead of line 753. 2. You can convert the L factor to CSR or triplet using https://github.com/xiaoyeli/superlu_dist/blob/324d65fced6ce8abf0eb900223cba0207d538db7/SRC/dutil_dist.c#L815 https://github.com/xiaoyeli/superlu_dist/blob/324d65fced6ce8abf0eb900223cba0207d538db7/SRC/dutil_dist.c#L1075 but need to make sure you only use 1 MPI to call superlu_dist 3. You can modify https://github.com/xiaoyeli/superlu_dist/blob/324d65fced6ce8abf0eb900223cba0207d538db7/SRC/dutil_dist.c#L1229 to generate CSR/triplet for the U factor as well. Sherry Li On Sun, Oct 23, 2022 at 3:38 AM Matthew Knepley wrote: > On Sun, Oct 23, 2022 at 2:58 AM ?? wrote: > >> Dear developers, >> >> I have another question. How can I get the L and U matrices and store >> them in a file when I call SuperLU through PETSc? Thanks. > > > SuperLU stores these matrices in its own format. If you want to do I/O > with them, you would probably have to > extract them from the Petsc Mat and call SuperLU I/O functions, if they > exist. > > Thanks, > > Matt > > >> Best Regards, >> Gang > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > > https://www.cse.buffalo.edu/~knepley/ > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From kavousi at mines.edu Tue Oct 25 17:37:35 2022 From: kavousi at mines.edu (Sepideh Kavousi) Date: Tue, 25 Oct 2022 22:37:35 +0000 Subject: [petsc-users] [External] Periodic boundary condition In-Reply-To: References: Message-ID: Hello Barry, When I ran with , the error is about PetscInitialize line (Line 333). When I write bt multiple times, it just continues referring to this line. 
#0 0x00002b701cfed9fd in nanosleep () from /lib64/libc.so.6
#1 0x00002b701cfed894 in sleep () from /lib64/libc.so.6
#2 0x00002b70035fb4ae in PetscSleep (s=1) at /home1/apps/intel18/impi18_0/petsc/3.14/src/sys/utils/psleep.c:46
#3 0x00002b700364b8bb in PetscAttachDebugger () at /home1/apps/intel18/impi18_0/petsc/3.14/src/sys/error/adebug.c:405
#4 0x00002b700366cfcd in PetscOptionsCheckInitial_Private (help=0x7ffec24c7940 "\t") at /home1/apps/intel18/impi18_0/petsc/3.14/src/sys/objects/init.c:608
#5 0x00002b7003674cd6 in PetscInitialize (argc=0x7ffec24c7940, args=0x7ffec24c7940, file=0x0, help=0xffffffffffffffff
) at /home1/apps/intel18/impi18_0/petsc/3.14/src/sys/objects/pinit.c:1025 #6 0x00000000004021ce in main (argc=24, argv=0x7ffec24d14e8) at /scratch/07065/tg863649/convection/test-a9-3-options_small_MAC_pressure_old/one.c:333 Best, Sepideh Sent from Mail for Windows From: Barry Smith Sent: Friday, October 21, 2022 10:54 AM To: Sepideh Kavousi Cc: petsc-users at mcs.anl.gov Subject: Re: [External] [petsc-users] Periodic boundary condition The problem with the output below is it is not giving a clear indication where the crash occurred. #1 User provided function() line 0 in unknown file Run with the exact same options but also -start_in_debugger noxterm It should then crash in the debugger and you can type bt to see the backtrace of where it crashed, send that output. Barry Background: MatFDColoringSetUpBlocked_AIJ_Private() allocates the space that is used when evaluating the function multiple times to get the Jacobian entries. If the FormFunction writes into incorrect locations, then it will corrupt this memory that was allocated in MatFDColoringSetUpBlocked_AIJ_Private() . It does not mean necessarily that there is anything wrong in MatFDColoringSetUpBlocked_AIJ_Private() On Oct 21, 2022, at 12:32 AM, Sepideh Kavousi > wrote: Barry, I ran the code with -malloc_debug and added CHKMEMQ for all the lines inside formfunction. Following is the detail of error. Best, Sepideh [0]PETSC ERROR: ------------------------------------------------------------------------ [0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range [0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger [0]PETSC ERROR: or see https://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind [0]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors [0]PETSC ERROR: configure using --with-debugging=yes, recompile, link, and run [0]PETSC ERROR: to get more information on the crash. [0]PETSC ERROR: PetscMallocValidate: error detected at PetscError() line 401 in /home1/apps/intel18/impi18_0/petsc/3.14/src/sys/error/err.c [0]PETSC ERROR: Memory [id=0(73960000)] at address 0x2b5aed6ab050 is corrupted (probably write past end of array) [0]PETSC ERROR: Memory originally allocated in MatFDColoringSetUpBlocked_AIJ_Private() line 125 in /home1/apps/intel18/impi18_0/petsc/3.14/src/mat/impls/aij/seq/fdaij.c [0]PETSC ERROR: --------------------- Error Message -------------------------------------------------------------- [0]PETSC ERROR: Signal received [0]PETSC ERROR: See https://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting. 
[0]PETSC ERROR: Petsc Release Version 3.14.2, Dec 03, 2020 [0]PETSC ERROR: ./one.out on a skylake named c415-063.stampede2.tacc.utexas.edu by tg863649 Thu Oct 20 23:30:05 2022 [0]PETSC ERROR: Configure options --with-x=0 -with-pic --with-make-np=12 --download-petsc4py=1 --with-python-exec=/opt/apps/intel18/python2/2.7.16/bin/python2 --with-packages-build-dir=/tmp/petsc-3.14/skylake --with-mpi=1 --with-mpi-dir=/opt/intel/compilers_and_libraries_2018.2.199/linux/mpi/intel64 --with-scalar-type=real --with-shared-libraries=1 --with-precision=double --with-chaco=1 --download-chaco --with-hypre=1 --download-hypre --with-parmetis=1 --download-parmetis --with-metis=1 --download-metis --with-plapack=1 --download-plapack --with-spai=1 --download-spai --with-sundials=1 --download-sundials --with-elemental=1 --download-elemental --with-cxx-dialect=C++11 --with-parmetis=1 --download-parmetis --with-metis=1 --download-metis --with-fftw=1 --with-fftw-dir=/opt/apps/intel18/impi18_0/fftw3/3.3.8 --with-hdf5=1 --with-hdf5-dir=/opt/apps/intel18/impi18_0/phdf5/1.10.4/x86_64 --download-hpddm --download-slepc --with-mumps=1 --download-mumps --with-parmetis=1 --download-parmetis --with-metis=1 --download-metis --with-scalapack=1 --download-scalapack --with-blacs=1 --download-blacs --with-spooles=1 --download-spooles --with-suitesparse=1 --download-suitesparse --with-superlu_dist=1 --download-superlu_dist --with-superlu=1 --download-superlu --with-parmetis=1 --download-parmetis --with-metis=1 --download-metis --with-zoltan=1 --download-zoltan=1 --download-ptscotch=1 --with-debugging=no --LIBS= --with-blaslapack-dir=/opt/intel/compilers_and_libraries_2018.2.199/linux/mkl COPTFLAGS="-xCORE-AVX2 -axMIC-AVX512,CORE-AVX512 -O2 -g" FOPTFLAGS="-xCORE-AVX2 -axMIC-AVX512,CORE-AVX512 -O2 -g" CXXOPTFLAGS="-xCORE-AVX2 -axMIC-AVX512,CORE-AVX512 -O2 -g" [0]PETSC ERROR: #1 User provided function() line 0 in unknown file [0]PETSC ERROR: Checking the memory for corruption. 
[0]PETSC ERROR: PetscMallocValidate: error detected at PetscSignalHandlerDefault() line 170 in /home1/apps/intel18/impi18_0/petsc/3.14/src/sys/error/signal.c [0]PETSC ERROR: Memory [id=0(73960000)] at address 0x2b5aed6ab050 is corrupted (probably write past end of array) [0]PETSC ERROR: Memory originally allocated in MatFDColoringSetUpBlocked_AIJ_Private() line 125 in /home1/apps/intel18/impi18_0/petsc/3.14/src/mat/impls/aij/seq/fdaij.c application called MPI_Abort(MPI_COMM_WORLD, 50176059) - process 0 [unset]: readline failed Sent from Mail for Windows From: Barry Smith Sent: Thursday, October 20, 2022 10:27 PM To: Sepideh Kavousi Cc: petsc-users at mcs.anl.gov Subject: [External] Re: [petsc-users] Periodic boundary condition Some of the valgrind information does not appear to make sense PetscMemcpy() is not calling SNESSolve() so I suspect there must be some serious corruption of something to this impossible stack trace ==236074== by 0x6FD160F: SNESSolve (snes.c:4569) ==236074== by 0x711917E: PetscMemcpy (bdf.c:223) From ==236074== Conditional jump or move depends on uninitialised value(s) ==236074== at 0x5F10977: MatFDColoringSetUpBlocked_AIJ_Private (fdaij.c:146) ==236074== by 0x5F10977: MatFDColoringSetUp_SeqXAIJ.A (fdaij.c:284) ==236074== by 0x585892A: MatFDColoringSetUp (fdmatrix.c:242) ==236074== by 0x6FE5037: SNESComputeJacobianDefaultColor (snesj2.c:79) ==236074== by 0x6FC8E4E: SNESComputeJacobian (snes.c:2717) ==236074== by 0x70236E7: SNESSolve_NEWTONTR (tr.c:324) ==236074== by 0x6FD160F: SNESSolve (snes.c:4569) ==236074== by 0x711917E: PetscMemcpy (bdf.c:223) ==236074== by 0x711917E: PetscCitationsRegister (petscsys.h:2689) ==236074== by 0x711917E: TSStep_BDF.A (bdf.c:265) ==236074== by 0x70C363A: TSStep (ts.c:3757) ==236074== by 0x70C1999: TSSolve (ts.c:4154) ==236074== by 0x402594: main (one.c:391) I suggest you run with -malloc_debug instead of valgrind and see if any errors are reported. If so you can add the macro CHKMEMQ; inside your function evaluation where you write to memory to see if anything is writing to the wrong location. For example wherever you assign aF such as aF[j][i].vx=(x3+x4+x5+x6+x7+x8+x9-x1-x2)*user->hx; this can help you determine the exact line number where you are writing to the wrong location and determine what might be the cause. On Oct 20, 2022, at 6:45 PM, Sepideh Kavousi > wrote: Hello, I want to solve my 5 PDEs based on finite difference method using periodic BC in x-direction and non-periodic in y-direction but I run into error (Segmentation Violation, probably memory access out of range). For this, I discretize my equation in FormFunction function. My PDE discretization in (i,j) node needs data on (i+1,j), (i+2,j), (i-1,j), (i-2,j), (i,j+1), (i,j+2), (i,j-1), (i,j-2) points. In my previous codes that the x-direction was non-periodic (no flux) boundary condition, I: i) implemented the no flux BC for i=0 and i=Nx-1, ii) set i+2= Nx-1 in discretizing (Nx-2,j) and i+2= 0 in discretizing (1,j) iii) discretized my equation for i=1..Nx-2. I am not sure how I should do the periodic BC. From the following discussions (https://lists.mcs.anl.gov/pipermail/petsc-users/2012-May/013476.html andhttps://lists.mcs.anl.gov/pipermail/petsc-users/2016-May/029273.html), I guess I should not do step (i) (stated above) for the x-boundaries and just do step (iii) for i=0..Nx-1. 
If I just focus on solving 2 of the PDEs which does need data on (i+2,j), (i-2,j), (i,j+2), (i,j-2) points for discretizing equation in (i,j) node, I still run into error: Running with Valgrind (just 1 processor) gave the following file. I did not find any information which gives me hint on the error source. Can you please help me to find the error? Best, Sepideh ==236074== Conditional jump or move depends on uninitialised value(s) ==236074== at 0x4C29E39: malloc (vg_replace_malloc.c:309) ==236074== by 0x1B79E59B: MPID_Init (mpid_init.c:1649) ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) ==236074== by 0x40219D: main (one.c:335) ==236074== ==236074== Conditional jump or move depends on uninitialised value(s) ==236074== at 0x2183B805: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218323C1: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218341C7: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x400F9C2: _dl_init (in /usr/lib64/ld-2.17.so) ==236074== by 0x401459D: dl_open_worker (in /usr/lib64/ld-2.17.so) ==236074== by 0x400F7D3: _dl_catch_error (in /usr/lib64/ld-2.17.so) ==236074== by 0x4013B8A: _dl_open (in /usr/lib64/ld-2.17.so) ==236074== by 0x1AEA4FAA: dlopen_doit (in /usr/lib64/libdl-2.17.so) ==236074== by 0x400F7D3: _dl_catch_error (in /usr/lib64/ld-2.17.so) ==236074== by 0x1AEA55AC: _dlerror_run (in /usr/lib64/libdl-2.17.so) ==236074== by 0x1AEA5040: dlopen@@GLIBC_2.2.5 (in /usr/lib64/libdl-2.17.so) ==236074== by 0x1B8198DC: MPID_nem_ofi_init (ofi_init.c:158) ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) ==236074== by 0x40219D: main (one.c:335) ==236074== ==236074== Conditional jump or move depends on uninitialised value(s) ==236074== at 0x2183B810: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218323C1: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218341C7: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x400F9C2: _dl_init (in /usr/lib64/ld-2.17.so) ==236074== by 0x401459D: dl_open_worker (in /usr/lib64/ld-2.17.so) ==236074== by 0x400F7D3: _dl_catch_error (in /usr/lib64/ld-2.17.so) ==236074== by 0x4013B8A: _dl_open (in /usr/lib64/ld-2.17.so) ==236074== by 0x1AEA4FAA: dlopen_doit (in /usr/lib64/libdl-2.17.so) ==236074== by 0x400F7D3: _dl_catch_error (in /usr/lib64/ld-2.17.so) ==236074== by 0x1AEA55AC: _dlerror_run (in /usr/lib64/libdl-2.17.so) ==236074== by 0x1AEA5040: dlopen@@GLIBC_2.2.5 (in /usr/lib64/libdl-2.17.so) ==236074== by 0x1B8198DC: MPID_nem_ofi_init (ofi_init.c:158) ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) ==236074== by 0x40219D: main (one.c:335) ==236074== ==236074== Conditional jump or move depends on uninitialised value(s) ==236074== at 0x218323C8: ??? 
(in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218341C7: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x400F9C2: _dl_init (in /usr/lib64/ld-2.17.so) ==236074== by 0x401459D: dl_open_worker (in /usr/lib64/ld-2.17.so) ==236074== by 0x400F7D3: _dl_catch_error (in /usr/lib64/ld-2.17.so) ==236074== by 0x4013B8A: _dl_open (in /usr/lib64/ld-2.17.so) ==236074== by 0x1AEA4FAA: dlopen_doit (in /usr/lib64/libdl-2.17.so) ==236074== by 0x400F7D3: _dl_catch_error (in /usr/lib64/ld-2.17.so) ==236074== by 0x1AEA55AC: _dlerror_run (in /usr/lib64/libdl-2.17.so) ==236074== by 0x1AEA5040: dlopen@@GLIBC_2.2.5 (in /usr/lib64/libdl-2.17.so) ==236074== by 0x1B8198DC: MPID_nem_ofi_init (ofi_init.c:158) ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) ==236074== by 0x40219D: main (one.c:335) ==236074== ==236074== Use of uninitialised value of size 8 ==236074== at 0x218323CF: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218341C7: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x400F9C2: _dl_init (in /usr/lib64/ld-2.17.so) ==236074== by 0x401459D: dl_open_worker (in /usr/lib64/ld-2.17.so) ==236074== by 0x400F7D3: _dl_catch_error (in /usr/lib64/ld-2.17.so) ==236074== by 0x4013B8A: _dl_open (in /usr/lib64/ld-2.17.so) ==236074== by 0x1AEA4FAA: dlopen_doit (in /usr/lib64/libdl-2.17.so) ==236074== by 0x400F7D3: _dl_catch_error (in /usr/lib64/ld-2.17.so) ==236074== by 0x1AEA55AC: _dlerror_run (in /usr/lib64/libdl-2.17.so) ==236074== by 0x1AEA5040: dlopen@@GLIBC_2.2.5 (in /usr/lib64/libdl-2.17.so) ==236074== by 0x1B8198DC: MPID_nem_ofi_init (ofi_init.c:158) ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) ==236074== by 0x40219D: main (one.c:335) ==236074== ==236074== Use of uninitialised value of size 8 ==236074== at 0x218323E5: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218341C7: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x400F9C2: _dl_init (in /usr/lib64/ld-2.17.so) ==236074== by 0x401459D: dl_open_worker (in /usr/lib64/ld-2.17.so) ==236074== by 0x400F7D3: _dl_catch_error (in /usr/lib64/ld-2.17.so) ==236074== by 0x4013B8A: _dl_open (in /usr/lib64/ld-2.17.so) ==236074== by 0x1AEA4FAA: dlopen_doit (in /usr/lib64/libdl-2.17.so) ==236074== by 0x400F7D3: _dl_catch_error (in /usr/lib64/ld-2.17.so) ==236074== by 0x1AEA55AC: _dlerror_run (in /usr/lib64/libdl-2.17.so) ==236074== by 0x1AEA5040: dlopen@@GLIBC_2.2.5 (in /usr/lib64/libdl-2.17.so) ==236074== by 0x1B8198DC: MPID_nem_ofi_init (ofi_init.c:158) ==236074== by 0x1B7B5C55: ??? 
(mpid_nem_init.c:231) ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) ==236074== by 0x40219D: main (one.c:335) ==236074== ==236074== Conditional jump or move depends on uninitialised value(s) ==236074== at 0x2183B805: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218323C1: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218343EC: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x2180E4F3: psm2_init (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x2112D7E6: psmx2_getinfo (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x210AC753: fi_getinfo@@FABRIC_1.2 (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x1B819AB7: MPID_nem_ofi_init (ofi_init.c:245) ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) ==236074== by 0x40219D: main (one.c:335) ==236074== ==236074== Conditional jump or move depends on uninitialised value(s) ==236074== at 0x2183B810: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218323C1: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218343EC: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x2180E4F3: psm2_init (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x2112D7E6: psmx2_getinfo (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x210AC753: fi_getinfo@@FABRIC_1.2 (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x1B819AB7: MPID_nem_ofi_init (ofi_init.c:245) ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) ==236074== by 0x40219D: main (one.c:335) ==236074== ==236074== Conditional jump or move depends on uninitialised value(s) ==236074== at 0x218323C8: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218343EC: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x2180E4F3: psm2_init (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x2112D7E6: psmx2_getinfo (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x210AC753: fi_getinfo@@FABRIC_1.2 (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x1B819AB7: MPID_nem_ofi_init (ofi_init.c:245) ==236074== by 0x1B7B5C55: ??? 
(mpid_nem_init.c:231) ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) ==236074== by 0x40219D: main (one.c:335) ==236074== ==236074== Use of uninitialised value of size 8 ==236074== at 0x218323CF: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218343EC: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x2180E4F3: psm2_init (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x2112D7E6: psmx2_getinfo (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x210AC753: fi_getinfo@@FABRIC_1.2 (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x1B819AB7: MPID_nem_ofi_init (ofi_init.c:245) ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) ==236074== by 0x40219D: main (one.c:335) ==236074== ==236074== Use of uninitialised value of size 8 ==236074== at 0x218323E5: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218343EC: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x2180E4F3: psm2_init (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x2112D7E6: psmx2_getinfo (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x210AC753: fi_getinfo@@FABRIC_1.2 (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x1B819AB7: MPID_nem_ofi_init (ofi_init.c:245) ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) ==236074== by 0x40219D: main (one.c:335) ==236074== ==236074== Conditional jump or move depends on uninitialised value(s) ==236074== at 0x2183B69A: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218371EC: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x21837077: hfi_get_port_lid (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x21836F9A: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x21834872: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x217F7F5D: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x217FDC29: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x217FCA91: psm2_ep_open (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x2112FB19: psmx2_trx_ctxt_alloc (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x21138089: psmx2_ep_open (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x1B81A6A1: fi_endpoint (fi_endpoint.h:155) ==236074== by 0x1B81A6A1: MPID_nem_ofi_init (ofi_init.c:377) ==236074== by 0x1B7B5C55: ??? 
(mpid_nem_init.c:231) ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) ==236074== by 0x40219D: main (one.c:335) ==236074== ==236074== Conditional jump or move depends on uninitialised value(s) ==236074== at 0x2183B7B8: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218371EC: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x21837077: hfi_get_port_lid (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x21836F9A: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x21834872: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x217F7F5D: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x217FDC29: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x217FCA91: psm2_ep_open (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x2112FB19: psmx2_trx_ctxt_alloc (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x21138089: psmx2_ep_open (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x1B81A6A1: fi_endpoint (fi_endpoint.h:155) ==236074== by 0x1B81A6A1: MPID_nem_ofi_init (ofi_init.c:377) ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) ==236074== by 0x40219D: main (one.c:335) ==236074== ==236074== Conditional jump or move depends on uninitialised value(s) ==236074== at 0x2183B69A: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218371EC: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x21837077: hfi_get_port_lid (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x217F88C8: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x217FDC29: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x217FCA91: psm2_ep_open (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x2112FB19: psmx2_trx_ctxt_alloc (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x21138089: psmx2_ep_open (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x1B81A6A1: fi_endpoint (fi_endpoint.h:155) ==236074== by 0x1B81A6A1: MPID_nem_ofi_init (ofi_init.c:377) ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) ==236074== by 0x40219D: main (one.c:335) ==236074== ==236074== Conditional jump or move depends on uninitialised value(s) ==236074== at 0x2183B7B8: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218371EC: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x21837077: hfi_get_port_lid (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x217F88C8: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x217FDC29: ??? 
(in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x217FCA91: psm2_ep_open (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x2112FB19: psmx2_trx_ctxt_alloc (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x21138089: psmx2_ep_open (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x1B81A6A1: fi_endpoint (fi_endpoint.h:155) ==236074== by 0x1B81A6A1: MPID_nem_ofi_init (ofi_init.c:377) ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) ==236074== by 0x40219D: main (one.c:335) ==236074== ==236074== Conditional jump or move depends on uninitialised value(s) ==236074== at 0x2183B69A: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218371EC: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x21837077: hfi_get_port_lid (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x217F916B: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x217FDC29: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x217FCA91: psm2_ep_open (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x2112FB19: psmx2_trx_ctxt_alloc (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x21138089: psmx2_ep_open (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x1B81A6A1: fi_endpoint (fi_endpoint.h:155) ==236074== by 0x1B81A6A1: MPID_nem_ofi_init (ofi_init.c:377) ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) ==236074== by 0x40219D: main (one.c:335) ==236074== ==236074== Conditional jump or move depends on uninitialised value(s) ==236074== at 0x2183B7B8: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218371EC: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x21837077: hfi_get_port_lid (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x217F916B: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x217FDC29: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x217FCA91: psm2_ep_open (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x2112FB19: psmx2_trx_ctxt_alloc (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x21138089: psmx2_ep_open (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x1B81A6A1: fi_endpoint (fi_endpoint.h:155) ==236074== by 0x1B81A6A1: MPID_nem_ofi_init (ofi_init.c:377) ==236074== by 0x1B7B5C55: ??? 
(mpid_nem_init.c:231) ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) ==236074== by 0x40219D: main (one.c:335) ==236074== ==236074== Conditional jump or move depends on uninitialised value(s) ==236074== at 0x1B1DA260: __I_MPI___intel_sse2_strncmp (in /opt/intel/compilers_and_libraries_2018.2.199/linux/mpi/intel64/lib/libmpifort.so.12.0) ==236074== by 0x1B8CFBA1: ??? (simple_pmi.c:2376) ==236074== by 0x1B8CBDAD: PMIi_InitIfSingleton (simple_pmi.c:2883) ==236074== by 0x1B8CBDAD: iPMI_KVS_Get (simple_pmi.c:751) ==236074== by 0x1B7CCC1E: ??? (mpidi_pg.c:949) ==236074== by 0x1B817EAA: MPID_nem_ofi_post_init (ofi_init.c:1736) ==236074== by 0x1B7B3575: MPID_nem_init_post (mpid_nem_init.c:1421) ==236074== by 0x1B5806E3: MPIDI_CH3_Init (ch3_init.c:146) ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) ==236074== by 0x40219D: main (one.c:335) ==236074== ==236074== Conditional jump or move depends on uninitialised value(s) ==236074== at 0x1B1DA383: __I_MPI___intel_sse2_strncmp (in /opt/intel/compilers_and_libraries_2018.2.199/linux/mpi/intel64/lib/libmpifort.so.12.0) ==236074== by 0x1B8CFBA1: ??? (simple_pmi.c:2376) ==236074== by 0x1B8CBDAD: PMIi_InitIfSingleton (simple_pmi.c:2883) ==236074== by 0x1B8CBDAD: iPMI_KVS_Get (simple_pmi.c:751) ==236074== by 0x1B7CCC1E: ??? (mpidi_pg.c:949) ==236074== by 0x1B817EAA: MPID_nem_ofi_post_init (ofi_init.c:1736) ==236074== by 0x1B7B3575: MPID_nem_init_post (mpid_nem_init.c:1421) ==236074== by 0x1B5806E3: MPIDI_CH3_Init (ch3_init.c:146) ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) ==236074== by 0x40219D: main (one.c:335) ==236074== ==236074== Conditional jump or move depends on uninitialised value(s) ==236074== at 0x1E48032E: __intel_sse4_strcpy (in /opt/intel/compilers_and_libraries_2018.2.199/linux/compiler/lib/intel64_lin/libirc.so) ==236074== by 0x51FD8BE: PetscStrcpy (str.c:354) ==236074== by 0x51FD7A3: PetscStrallocpy (str.c:188) ==236074== by 0x52A39CE: PetscEventRegLogRegister (eventlog.c:313) ==236074== by 0x527D89A: PetscLogEventRegister (plog.c:693) ==236074== by 0x6A56A20: PCBDDCInitializePackage (bddc.c:3115) ==236074== by 0x6E1A515: PCInitializePackage (dlregisksp.c:59) ==236074== by 0x6DB1A86: PCCreate (precon.c:382) ==236074== by 0x6E05167: KSPGetPC (itfunc.c:1837) ==236074== by 0x6E0FC5C: KSPSetDM (iterativ.c:1150) ==236074== by 0x6FDD27B: SNESSetDM (snes.c:5402) ==236074== by 0x70B85F7: TSGetSNES (ts.c:2914) ==236074== by 0x70BE430: TSSetDM (ts.c:4949) ==236074== by 0x402496: main (one.c:378) ==236074== ==236074== Conditional jump or move depends on uninitialised value(s) ==236074== at 0x1E4782BA: __intel_ssse3_strncpy (in /opt/intel/compilers_and_libraries_2018.2.199/linux/compiler/lib/intel64_lin/libirc.so) ==236074== by 0x51FFD24: PetscStrncpy (str.c:392) ==236074== by 0x51FEB03: PetscStrreplace (str.c:1142) ==236074== by 0x52C9958: PetscViewerFileSetName (filev.c:659) 
==236074== by 0x52B743B: PetscViewerVTKOpen (vtkv.c:279) ==236074== by 0x70C76E6: TSMonitorSolutionVTK (ts.c:5580) ==236074== by 0x40313C: FormFunction (one.c:120) ==236074== by 0x7066531: TSComputeIFunction_DMDA (dmdats.c:82) ==236074== by 0x70BA5EF: TSComputeIFunction (ts.c:857) ==236074== by 0x711E2DC: SNESTSFormFunction_BDF (bdf.c:368) ==236074== by 0x70C6E46: SNESTSFormFunction (ts.c:5014) ==236074== by 0x6FDC8A6: SNESComputeFunction (snes.c:2383) ==236074== by 0x7023556: SNESSolve_NEWTONTR (tr.c:297) ==236074== by 0x6FD160F: SNESSolve (snes.c:4569) ==236074== by 0x711917E: PetscMemcpy (bdf.c:223) ==236074== by 0x711917E: PetscCitationsRegister (petscsys.h:2689) ==236074== by 0x711917E: TSStep_BDF.A (bdf.c:265) ==236074== by 0x70C363A: TSStep (ts.c:3757) ==236074== by 0x70C1999: TSSolve (ts.c:4154) ==236074== by 0x402594: main (one.c:391) ==236074== ==236074== Conditional jump or move depends on uninitialised value(s) ==236074== at 0x1E4782BA: __intel_ssse3_strncpy (in /opt/intel/compilers_and_libraries_2018.2.199/linux/compiler/lib/intel64_lin/libirc.so) ==236074== by 0x51FFD24: PetscStrncpy (str.c:392) ==236074== by 0x51FEB03: PetscStrreplace (str.c:1142) ==236074== by 0x5224E4B: PetscFOpen (mpiuopen.c:52) ==236074== by 0x63A074B: DMDAVTKWriteAll_VTS.A (grvtk.c:72) ==236074== by 0x639A589: DMDAVTKWriteAll (grvtk.c:545) ==236074== by 0x52B66F3: PetscViewerFlush_VTK (vtkv.c:100) ==236074== by 0x52CFAAE: PetscViewerFlush (flush.c:26) ==236074== by 0x52CEA95: PetscViewerDestroy (view.c:113) ==236074== by 0x70C7717: TSMonitorSolutionVTK (ts.c:5582) ==236074== by 0x40313C: FormFunction (one.c:120) ==236074== by 0x7066531: TSComputeIFunction_DMDA (dmdats.c:82) ==236074== by 0x70BA5EF: TSComputeIFunction (ts.c:857) ==236074== by 0x711E2DC: SNESTSFormFunction_BDF (bdf.c:368) ==236074== by 0x70C6E46: SNESTSFormFunction (ts.c:5014) ==236074== by 0x6FDC8A6: SNESComputeFunction (snes.c:2383) ==236074== by 0x7023556: SNESSolve_NEWTONTR (tr.c:297) ==236074== by 0x6FD160F: SNESSolve (snes.c:4569) ==236074== by 0x711917E: PetscMemcpy (bdf.c:223) ==236074== by 0x711917E: PetscCitationsRegister (petscsys.h:2689) ==236074== by 0x711917E: TSStep_BDF.A (bdf.c:265) ==236074== by 0x70C363A: TSStep (ts.c:3757) ==236074== ==236074== Conditional jump or move depends on uninitialised value(s) ==236074== at 0x5F10977: MatFDColoringSetUpBlocked_AIJ_Private (fdaij.c:146) ==236074== by 0x5F10977: MatFDColoringSetUp_SeqXAIJ.A (fdaij.c:284) ==236074== by 0x585892A: MatFDColoringSetUp (fdmatrix.c:242) ==236074== by 0x6FE5037: SNESComputeJacobianDefaultColor (snesj2.c:79) ==236074== by 0x6FC8E4E: SNESComputeJacobian (snes.c:2717) ==236074== by 0x70236E7: SNESSolve_NEWTONTR (tr.c:324) ==236074== by 0x6FD160F: SNESSolve (snes.c:4569) ==236074== by 0x711917E: PetscMemcpy (bdf.c:223) ==236074== by 0x711917E: PetscCitationsRegister (petscsys.h:2689) ==236074== by 0x711917E: TSStep_BDF.A (bdf.c:265) ==236074== by 0x70C363A: TSStep (ts.c:3757) ==236074== by 0x70C1999: TSSolve (ts.c:4154) ==236074== by 0x402594: main (one.c:391) ==236074== ==236074== Invalid write of size 4 ==236074== at 0x5F10983: MatFDColoringSetUpBlocked_AIJ_Private (fdaij.c:150) ==236074== by 0x5F10983: MatFDColoringSetUp_SeqXAIJ.A (fdaij.c:284) ==236074== by 0x585892A: MatFDColoringSetUp (fdmatrix.c:242) ==236074== by 0x6FE5037: SNESComputeJacobianDefaultColor (snesj2.c:79) ==236074== by 0x6FC8E4E: SNESComputeJacobian (snes.c:2717) ==236074== by 0x70236E7: SNESSolve_NEWTONTR (tr.c:324) ==236074== by 0x6FD160F: SNESSolve (snes.c:4569) ==236074== by 
0x711917E: PetscMemcpy (bdf.c:223) ==236074== by 0x711917E: PetscCitationsRegister (petscsys.h:2689) ==236074== by 0x711917E: TSStep_BDF.A (bdf.c:265) ==236074== by 0x70C363A: TSStep (ts.c:3757) ==236074== by 0x70C1999: TSSolve (ts.c:4154) ==236074== by 0x402594: main (one.c:391) ==236074== Address 0x3a94fa80 is 0 bytes after a block of size 73,960,000 alloc'd ==236074== at 0x4C2C480: memalign (vg_replace_malloc.c:909) ==236074== by 0x522FFE2: PetscMallocAlign (mal.c:52) ==236074== by 0x52305F9: PetscMallocA (mal.c:418) ==236074== by 0x5F10778: MatFDColoringSetUpBlocked_AIJ_Private (fdaij.c:125) ==236074== by 0x5F10778: MatFDColoringSetUp_SeqXAIJ.A (fdaij.c:284) ==236074== by 0x585892A: MatFDColoringSetUp (fdmatrix.c:242) ==236074== by 0x6FE5037: SNESComputeJacobianDefaultColor (snesj2.c:79) ==236074== by 0x6FC8E4E: SNESComputeJacobian (snes.c:2717) ==236074== by 0x70236E7: SNESSolve_NEWTONTR (tr.c:324) ==236074== by 0x6FD160F: SNESSolve (snes.c:4569) ==236074== by 0x711917E: PetscMemcpy (bdf.c:223) ==236074== by 0x711917E: PetscCitationsRegister (petscsys.h:2689) ==236074== by 0x711917E: TSStep_BDF.A (bdf.c:265) ==236074== by 0x70C363A: TSStep (ts.c:3757) ==236074== by 0x70C1999: TSSolve (ts.c:4154) ==236074== by 0x402594: main (one.c:391) ==236074== ==236074== Invalid write of size 8 ==236074== at 0x5F10991: MatFDColoringSetUpBlocked_AIJ_Private (fdaij.c:151) ==236074== by 0x5F10991: MatFDColoringSetUp_SeqXAIJ.A (fdaij.c:284) ==236074== by 0x585892A: MatFDColoringSetUp (fdmatrix.c:242) ==236074== by 0x6FE5037: SNESComputeJacobianDefaultColor (snesj2.c:79) ==236074== by 0x6FC8E4E: SNESComputeJacobian (snes.c:2717) ==236074== by 0x70236E7: SNESSolve_NEWTONTR (tr.c:324) ==236074== by 0x6FD160F: SNESSolve (snes.c:4569) ==236074== by 0x711917E: PetscMemcpy (bdf.c:223) ==236074== by 0x711917E: PetscCitationsRegister (petscsys.h:2689) ==236074== by 0x711917E: TSStep_BDF.A (bdf.c:265) ==236074== by 0x70C363A: TSStep (ts.c:3757) ==236074== by 0x70C1999: TSSolve (ts.c:4154) ==236074== by 0x402594: main (one.c:391) ==236074== Address 0x3a94fa88 is 8 bytes after a block of size 73,960,000 alloc'd ==236074== at 0x4C2C480: memalign (vg_replace_malloc.c:909) ==236074== by 0x522FFE2: PetscMallocAlign (mal.c:52) ==236074== by 0x52305F9: PetscMallocA (mal.c:418) ==236074== by 0x5F10778: MatFDColoringSetUpBlocked_AIJ_Private (fdaij.c:125) ==236074== by 0x5F10778: MatFDColoringSetUp_SeqXAIJ.A (fdaij.c:284) ==236074== by 0x585892A: MatFDColoringSetUp (fdmatrix.c:242) ==236074== by 0x6FE5037: SNESComputeJacobianDefaultColor (snesj2.c:79) ==236074== by 0x6FC8E4E: SNESComputeJacobian (snes.c:2717) ==236074== by 0x70236E7: SNESSolve_NEWTONTR (tr.c:324) ==236074== by 0x6FD160F: SNESSolve (snes.c:4569) ==236074== by 0x711917E: PetscMemcpy (bdf.c:223) ==236074== by 0x711917E: PetscCitationsRegister (petscsys.h:2689) ==236074== by 0x711917E: TSStep_BDF.A (bdf.c:265) ==236074== by 0x70C363A: TSStep (ts.c:3757) ==236074== by 0x70C1999: TSSolve (ts.c:4154) ==236074== by 0x402594: main (one.c:391) ==236074== Sent from Mail for Windows -------------- next part -------------- An HTML attachment was scrubbed... 
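One thing that may simplify the periodic case being debugged in this thread: the DMDA itself can carry the x-periodicity, so FormFunction never has to special-case i=0 or i=Nx-1. A minimal sketch of the DMDA creation for the setup described above (5 unknowns per node, stencil reaching i+/-2 and j+/-2, periodic in x only); the grid sizes here are illustrative and the y boundary type is an assumption, not something read from the actual code:

  DM       da;
  PetscInt Nx = 128, Ny = 128;            /* illustrative sizes, not from the real run                    */

  DMDACreate2d(PETSC_COMM_WORLD,
               DM_BOUNDARY_PERIODIC,      /* x: ghost points wrap around automatically                    */
               DM_BOUNDARY_NONE,          /* y: non-periodic; BC handled explicitly in FormFunction       */
               DMDA_STENCIL_STAR,         /* axis-aligned neighbors only; use DMDA_STENCIL_BOX if corner points are needed */
               Nx, Ny,
               PETSC_DECIDE, PETSC_DECIDE,
               5,                         /* dof: the 5 coupled fields                                     */
               2,                         /* stencil width 2, since i-2..i+2 and j-2..j+2 are used         */
               NULL, NULL, &da);
  DMSetFromOptions(da);
  DMSetUp(da);

With DM_BOUNDARY_PERIODIC in x and stencil width 2, DMGlobalToLocalBegin/End fill the ghost region with the wrapped values, so the loop in FormFunction can run uniformly over the locally owned i range, which is the approach suggested in the two mailing-list threads linked earlier in this thread.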
URL: From bsmith at petsc.dev Tue Oct 25 18:24:29 2022 From: bsmith at petsc.dev (Barry Smith) Date: Tue, 25 Oct 2022 19:24:29 -0400 Subject: [petsc-users] [External] Periodic boundary condition In-Reply-To: References: Message-ID: Sorry I was not clear, at this point you need to type c for continue and then when it crashes in the debugger type bt Barry > On Oct 25, 2022, at 6:37 PM, Sepideh Kavousi wrote: > > Hello Barry, > When I ran with , the error is about PetscInitialize line (Line 333). When I write bt multiple times, it just continues referring to this line. > > #0 0x00002b701cfed9fd in nanosleep () from /lib64/libc.so.6 > #1 0x00002b701cfed894 in sleep () from /lib64/libc.so.6 > #2 0x00002b70035fb4ae in PetscSleep (s=1) at /home1/apps/intel18/impi18_0/petsc/3.14/src/sys/utils/psleep.c:46 > #3 0x00002b700364b8bb in PetscAttachDebugger () at /home1/apps/intel18/impi18_0/petsc/3.14/src/sys/error/adebug.c:405 > #4 0x00002b700366cfcd in PetscOptionsCheckInitial_Private (help=0x7ffec24c7940 "\t") at /home1/apps/intel18/impi18_0/petsc/3.14/src/sys/objects/init.c:608 > #5 0x00002b7003674cd6 in PetscInitialize (argc=0x7ffec24c7940, args=0x7ffec24c7940, file=0x0, help=0xffffffffffffffff
) > at /home1/apps/intel18/impi18_0/petsc/3.14/src/sys/objects/pinit.c:1025 > #6 0x00000000004021ce in main (argc=24, argv=0x7ffec24d14e8) at /scratch/07065/tg863649/convection/test-a9-3-options_small_MAC_pressure_old/one.c:333 > > Best, > Sepideh > Sent from Mail for Windows > > From: Barry Smith > Sent: Friday, October 21, 2022 10:54 AM > To: Sepideh Kavousi > Cc: petsc-users at mcs.anl.gov > Subject: Re: [External] [petsc-users] Periodic boundary condition > > > The problem with the output below is it is not giving a clear indication where the crash occurred. > > #1 User provided function() line 0 in unknown file > > > Run with the exact same options but also -start_in_debugger noxterm It should then crash in the debugger and you can type bt to see the backtrace of where it crashed, send that output. > > Barry > > Background: MatFDColoringSetUpBlocked_AIJ_Private() allocates the space that is used when evaluating the function multiple times to get the Jacobian entries. If the FormFunction writes into incorrect locations, then it will corrupt this memory that was allocated in MatFDColoringSetUpBlocked_AIJ_Private() . It does not mean necessarily that there is anything wrong in MatFDColoringSetUpBlocked_AIJ_Private() > > > On Oct 21, 2022, at 12:32 AM, Sepideh Kavousi > wrote: > > Barry, > I ran the code with -malloc_debug and added CHKMEMQ for all the lines inside formfunction. Following is the detail of error. > Best, > Sepideh > > [0]PETSC ERROR: ------------------------------------------------------------------------ > [0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range > [0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger > [0]PETSC ERROR: or see https://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind > [0]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors > [0]PETSC ERROR: configure using --with-debugging=yes, recompile, link, and run > [0]PETSC ERROR: to get more information on the crash. > [0]PETSC ERROR: PetscMallocValidate: error detected at PetscError() line 401 in /home1/apps/intel18/impi18_0/petsc/3.14/src/sys/error/err.c > [0]PETSC ERROR: Memory [id=0(73960000)] at address 0x2b5aed6ab050 is corrupted (probably write past end of array) > [0]PETSC ERROR: Memory originally allocated in MatFDColoringSetUpBlocked_AIJ_Private() line 125 in /home1/apps/intel18/impi18_0/petsc/3.14/src/mat/impls/aij/seq/fdaij.c > [0]PETSC ERROR: --------------------- Error Message -------------------------------------------------------------- > [0]PETSC ERROR: Signal received > [0]PETSC ERROR: See https://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting. 
> [0]PETSC ERROR: Petsc Release Version 3.14.2, Dec 03, 2020 > [0]PETSC ERROR: ./one.out on a skylake named c415-063.stampede2.tacc.utexas.edu by tg863649 Thu Oct 20 23:30:05 2022 > [0]PETSC ERROR: Configure options --with-x=0 -with-pic --with-make-np=12 --download-petsc4py=1 --with-python-exec=/opt/apps/intel18/python2/2.7.16/bin/python2 --with-packages-build-dir=/tmp/petsc-3.14/skylake --with-mpi=1 --with-mpi-dir=/opt/intel/compilers_and_libraries_2018.2.199/linux/mpi/intel64 --with-scalar-type=real --with-shared-libraries=1 --with-precision=double --with-chaco=1 --download-chaco --with-hypre=1 --download-hypre --with-parmetis=1 --download-parmetis --with-metis=1 --download-metis --with-plapack=1 --download-plapack --with-spai=1 --download-spai --with-sundials=1 --download-sundials --with-elemental=1 --download-elemental --with-cxx-dialect=C++11 --with-parmetis=1 --download-parmetis --with-metis=1 --download-metis --with-fftw=1 --with-fftw-dir=/opt/apps/intel18/impi18_0/fftw3/3.3.8 --with-hdf5=1 --with-hdf5-dir=/opt/apps/intel18/impi18_0/phdf5/1.10.4/x86_64 --download-hpddm --download-slepc --with-mumps=1 --download-mumps --with-parmetis=1 --download-parmetis --with-metis=1 --download-metis --with-scalapack=1 --download-scalapack --with-blacs=1 --download-blacs --with-spooles=1 --download-spooles --with-suitesparse=1 --download-suitesparse --with-superlu_dist=1 --download-superlu_dist --with-superlu=1 --download-superlu --with-parmetis=1 --download-parmetis --with-metis=1 --download-metis --with-zoltan=1 --download-zoltan=1 --download-ptscotch=1 --with-debugging=no --LIBS= --with-blaslapack-dir=/opt/intel/compilers_and_libraries_2018.2.199/linux/mkl COPTFLAGS="-xCORE-AVX2 -axMIC-AVX512,CORE-AVX512 -O2 -g" FOPTFLAGS="-xCORE-AVX2 -axMIC-AVX512,CORE-AVX512 -O2 -g" CXXOPTFLAGS="-xCORE-AVX2 -axMIC-AVX512,CORE-AVX512 -O2 -g" > [0]PETSC ERROR: #1 User provided function() line 0 in unknown file > [0]PETSC ERROR: Checking the memory for corruption. 
> [0]PETSC ERROR: PetscMallocValidate: error detected at PetscSignalHandlerDefault() line 170 in /home1/apps/intel18/impi18_0/petsc/3.14/src/sys/error/signal.c > [0]PETSC ERROR: Memory [id=0(73960000)] at address 0x2b5aed6ab050 is corrupted (probably write past end of array) > [0]PETSC ERROR: Memory originally allocated in MatFDColoringSetUpBlocked_AIJ_Private() line 125 in /home1/apps/intel18/impi18_0/petsc/3.14/src/mat/impls/aij/seq/fdaij.c > application called MPI_Abort(MPI_COMM_WORLD, 50176059) - process 0 > [unset]: readline failed > > > > > Sent from Mail for Windows > > From: Barry Smith > Sent: Thursday, October 20, 2022 10:27 PM > To: Sepideh Kavousi > Cc: petsc-users at mcs.anl.gov > Subject: [External] Re: [petsc-users] Periodic boundary condition > > > Some of the valgrind information does not appear to make sense > > PetscMemcpy() is not calling SNESSolve() so I suspect there must be some serious corruption of something to this impossible stack trace > > ==236074== by 0x6FD160F: SNESSolve (snes.c:4569) > ==236074== by 0x711917E: PetscMemcpy (bdf.c:223) > > From > > ==236074== Conditional jump or move depends on uninitialised value(s) > ==236074== at 0x5F10977: MatFDColoringSetUpBlocked_AIJ_Private (fdaij.c:146) > ==236074== by 0x5F10977: MatFDColoringSetUp_SeqXAIJ.A (fdaij.c:284) > ==236074== by 0x585892A: MatFDColoringSetUp (fdmatrix.c:242) > ==236074== by 0x6FE5037: SNESComputeJacobianDefaultColor (snesj2.c:79) > ==236074== by 0x6FC8E4E: SNESComputeJacobian (snes.c:2717) > ==236074== by 0x70236E7: SNESSolve_NEWTONTR (tr.c:324) > ==236074== by 0x6FD160F: SNESSolve (snes.c:4569) > ==236074== by 0x711917E: PetscMemcpy (bdf.c:223) > ==236074== by 0x711917E: PetscCitationsRegister (petscsys.h:2689) > ==236074== by 0x711917E: TSStep_BDF.A (bdf.c:265) > ==236074== by 0x70C363A: TSStep (ts.c:3757) > ==236074== by 0x70C1999: TSSolve (ts.c:4154) > ==236074== by 0x402594: main (one.c:391) > > I suggest you run with -malloc_debug instead of valgrind and see if any errors are reported. If so you can add the macro CHKMEMQ; inside your function evaluation where you write to memory to see if anything is writing to the wrong location. For example wherever you assign aF such as > > aF[j][i].vx=(x3+x4+x5+x6+x7+x8+x9-x1-x2)*user->hx; > > this can help you determine the exact line number where you are writing to the wrong location and determine what might be the cause. > > > > > On Oct 20, 2022, at 6:45 PM, Sepideh Kavousi > wrote: > > Hello, > I want to solve my 5 PDEs based on finite difference method using periodic BC in x-direction and non-periodic in y-direction but I run into error (Segmentation Violation, probably memory access out of range). > For this, I discretize my equation in FormFunction function. My PDE discretization in (i,j) node needs data on (i+1,j), (i+2,j), (i-1,j), (i-2,j), (i,j+1), (i,j+2), (i,j-1), (i,j-2) points. > In my previous codes that the x-direction was non-periodic (no flux) boundary condition, I: > i) implemented the no flux BC for i=0 and i=Nx-1, > ii) set i+2= Nx-1 in discretizing (Nx-2,j) and i+2= 0 in discretizing (1,j) > iii) discretized my equation for i=1..Nx-2. > I am not sure how I should do the periodic BC. From the following discussions (https://lists.mcs.anl.gov/pipermail/petsc-users/2012-May/013476.html andhttps://lists.mcs.anl.gov/pipermail/petsc-users/2016-May/029273.html ), I guess I should not do step (i) (stated above) for the x-boundaries and just do step (iii) for i=0..Nx-1. 
If I just focus on solving 2 of the PDEs which does need data on (i+2,j), (i-2,j), (i,j+2), (i,j-2) points for discretizing equation in (i,j) node, I still run into error: > Running with Valgrind (just 1 processor) gave the following file. I did not find any information which gives me hint on the error source. > Can you please help me to find the error? > Best, > Sepideh > > ==236074== Conditional jump or move depends on uninitialised value(s) > ==236074== at 0x4C29E39: malloc (vg_replace_malloc.c:309) > ==236074== by 0x1B79E59B: MPID_Init (mpid_init.c:1649) > ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) > ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) > ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) > ==236074== by 0x40219D: main (one.c:335) > ==236074== > ==236074== Conditional jump or move depends on uninitialised value(s) > ==236074== at 0x2183B805: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x218323C1: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x218341C7: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x400F9C2: _dl_init (in /usr/lib64/ld-2.17.so) > ==236074== by 0x401459D: dl_open_worker (in /usr/lib64/ld-2.17.so) > ==236074== by 0x400F7D3: _dl_catch_error (in /usr/lib64/ld-2.17.so) > ==236074== by 0x4013B8A: _dl_open (in /usr/lib64/ld-2.17.so) > ==236074== by 0x1AEA4FAA: dlopen_doit (in /usr/lib64/libdl-2.17.so) > ==236074== by 0x400F7D3: _dl_catch_error (in /usr/lib64/ld-2.17.so) > ==236074== by 0x1AEA55AC: _dlerror_run (in /usr/lib64/libdl-2.17.so) > ==236074== by 0x1AEA5040: dlopen@@GLIBC_2.2.5 (in /usr/lib64/libdl-2.17.so) > ==236074== by 0x1B8198DC: MPID_nem_ofi_init (ofi_init.c:158) > ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) > ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) > ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) > ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) > ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) > ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) > ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) > ==236074== by 0x40219D: main (one.c:335) > ==236074== > ==236074== Conditional jump or move depends on uninitialised value(s) > ==236074== at 0x2183B810: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x218323C1: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x218341C7: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x400F9C2: _dl_init (in /usr/lib64/ld-2.17.so) > ==236074== by 0x401459D: dl_open_worker (in /usr/lib64/ld-2.17.so) > ==236074== by 0x400F7D3: _dl_catch_error (in /usr/lib64/ld-2.17.so) > ==236074== by 0x4013B8A: _dl_open (in /usr/lib64/ld-2.17.so) > ==236074== by 0x1AEA4FAA: dlopen_doit (in /usr/lib64/libdl-2.17.so) > ==236074== by 0x400F7D3: _dl_catch_error (in /usr/lib64/ld-2.17.so) > ==236074== by 0x1AEA55AC: _dlerror_run (in /usr/lib64/libdl-2.17.so) > ==236074== by 0x1AEA5040: dlopen@@GLIBC_2.2.5 (in /usr/lib64/libdl-2.17.so) > ==236074== by 0x1B8198DC: MPID_nem_ofi_init (ofi_init.c:158) > ==236074== by 0x1B7B5C55: ??? 
(mpid_nem_init.c:231) > ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) > ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) > ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) > ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) > ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) > ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) > ==236074== by 0x40219D: main (one.c:335) > ==236074== > ==236074== Conditional jump or move depends on uninitialised value(s) > ==236074== at 0x218323C8: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x218341C7: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x400F9C2: _dl_init (in /usr/lib64/ld-2.17.so) > ==236074== by 0x401459D: dl_open_worker (in /usr/lib64/ld-2.17.so) > ==236074== by 0x400F7D3: _dl_catch_error (in /usr/lib64/ld-2.17.so) > ==236074== by 0x4013B8A: _dl_open (in /usr/lib64/ld-2.17.so) > ==236074== by 0x1AEA4FAA: dlopen_doit (in /usr/lib64/libdl-2.17.so) > ==236074== by 0x400F7D3: _dl_catch_error (in /usr/lib64/ld-2.17.so) > ==236074== by 0x1AEA55AC: _dlerror_run (in /usr/lib64/libdl-2.17.so) > ==236074== by 0x1AEA5040: dlopen@@GLIBC_2.2.5 (in /usr/lib64/libdl-2.17.so) > ==236074== by 0x1B8198DC: MPID_nem_ofi_init (ofi_init.c:158) > ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) > ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) > ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) > ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) > ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) > ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) > ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) > ==236074== by 0x40219D: main (one.c:335) > ==236074== > ==236074== Use of uninitialised value of size 8 > ==236074== at 0x218323CF: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x218341C7: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x400F9C2: _dl_init (in /usr/lib64/ld-2.17.so) > ==236074== by 0x401459D: dl_open_worker (in /usr/lib64/ld-2.17.so) > ==236074== by 0x400F7D3: _dl_catch_error (in /usr/lib64/ld-2.17.so) > ==236074== by 0x4013B8A: _dl_open (in /usr/lib64/ld-2.17.so) > ==236074== by 0x1AEA4FAA: dlopen_doit (in /usr/lib64/libdl-2.17.so) > ==236074== by 0x400F7D3: _dl_catch_error (in /usr/lib64/ld-2.17.so) > ==236074== by 0x1AEA55AC: _dlerror_run (in /usr/lib64/libdl-2.17.so) > ==236074== by 0x1AEA5040: dlopen@@GLIBC_2.2.5 (in /usr/lib64/libdl-2.17.so) > ==236074== by 0x1B8198DC: MPID_nem_ofi_init (ofi_init.c:158) > ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) > ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) > ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) > ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) > ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) > ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) > ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) > ==236074== by 0x40219D: main (one.c:335) > ==236074== > ==236074== Use of uninitialised value of size 8 > ==236074== at 0x218323E5: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x218341C7: ??? 
(in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x400F9C2: _dl_init (in /usr/lib64/ld-2.17.so) > ==236074== by 0x401459D: dl_open_worker (in /usr/lib64/ld-2.17.so) > ==236074== by 0x400F7D3: _dl_catch_error (in /usr/lib64/ld-2.17.so) > ==236074== by 0x4013B8A: _dl_open (in /usr/lib64/ld-2.17.so) > ==236074== by 0x1AEA4FAA: dlopen_doit (in /usr/lib64/libdl-2.17.so) > ==236074== by 0x400F7D3: _dl_catch_error (in /usr/lib64/ld-2.17.so) > ==236074== by 0x1AEA55AC: _dlerror_run (in /usr/lib64/libdl-2.17.so) > ==236074== by 0x1AEA5040: dlopen@@GLIBC_2.2.5 (in /usr/lib64/libdl-2.17.so) > ==236074== by 0x1B8198DC: MPID_nem_ofi_init (ofi_init.c:158) > ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) > ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) > ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) > ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) > ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) > ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) > ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) > ==236074== by 0x40219D: main (one.c:335) > ==236074== > ==236074== Conditional jump or move depends on uninitialised value(s) > ==236074== at 0x2183B805: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x218323C1: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x218343EC: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x2180E4F3: psm2_init (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x2112D7E6: psmx2_getinfo (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) > ==236074== by 0x210AC753: fi_getinfo@@FABRIC_1.2 (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) > ==236074== by 0x1B819AB7: MPID_nem_ofi_init (ofi_init.c:245) > ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) > ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) > ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) > ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) > ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) > ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) > ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) > ==236074== by 0x40219D: main (one.c:335) > ==236074== > ==236074== Conditional jump or move depends on uninitialised value(s) > ==236074== at 0x2183B810: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x218323C1: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x218343EC: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x2180E4F3: psm2_init (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x2112D7E6: psmx2_getinfo (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) > ==236074== by 0x210AC753: fi_getinfo@@FABRIC_1.2 (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) > ==236074== by 0x1B819AB7: MPID_nem_ofi_init (ofi_init.c:245) > ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) > ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) > ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) > ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) > ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) > ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) > ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) > ==236074== by 0x40219D: main (one.c:335) > ==236074== > ==236074== Conditional jump or move depends on uninitialised value(s) > ==236074== at 0x218323C8: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x218343EC: ??? 
(in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x2180E4F3: psm2_init (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x2112D7E6: psmx2_getinfo (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) > ==236074== by 0x210AC753: fi_getinfo@@FABRIC_1.2 (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) > ==236074== by 0x1B819AB7: MPID_nem_ofi_init (ofi_init.c:245) > ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) > ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) > ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) > ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) > ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) > ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) > ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) > ==236074== by 0x40219D: main (one.c:335) > ==236074== > ==236074== Use of uninitialised value of size 8 > ==236074== at 0x218323CF: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x218343EC: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x2180E4F3: psm2_init (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x2112D7E6: psmx2_getinfo (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) > ==236074== by 0x210AC753: fi_getinfo@@FABRIC_1.2 (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) > ==236074== by 0x1B819AB7: MPID_nem_ofi_init (ofi_init.c:245) > ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) > ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) > ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) > ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) > ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) > ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) > ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) > ==236074== by 0x40219D: main (one.c:335) > ==236074== > ==236074== Use of uninitialised value of size 8 > ==236074== at 0x218323E5: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x218343EC: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x2180E4F3: psm2_init (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x2112D7E6: psmx2_getinfo (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) > ==236074== by 0x210AC753: fi_getinfo@@FABRIC_1.2 (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) > ==236074== by 0x1B819AB7: MPID_nem_ofi_init (ofi_init.c:245) > ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) > ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) > ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) > ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) > ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) > ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) > ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) > ==236074== by 0x40219D: main (one.c:335) > ==236074== > ==236074== Conditional jump or move depends on uninitialised value(s) > ==236074== at 0x2183B69A: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x218371EC: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x21837077: hfi_get_port_lid (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x21836F9A: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x21834872: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x217F7F5D: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x217FDC29: ??? 
(in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x217FCA91: psm2_ep_open (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x2112FB19: psmx2_trx_ctxt_alloc (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) > ==236074== by 0x21138089: psmx2_ep_open (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) > ==236074== by 0x1B81A6A1: fi_endpoint (fi_endpoint.h:155) > ==236074== by 0x1B81A6A1: MPID_nem_ofi_init (ofi_init.c:377) > ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) > ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) > ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) > ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) > ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) > ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) > ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) > ==236074== by 0x40219D: main (one.c:335) > ==236074== > ==236074== Conditional jump or move depends on uninitialised value(s) > ==236074== at 0x2183B7B8: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x218371EC: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x21837077: hfi_get_port_lid (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x21836F9A: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x21834872: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x217F7F5D: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x217FDC29: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x217FCA91: psm2_ep_open (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x2112FB19: psmx2_trx_ctxt_alloc (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) > ==236074== by 0x21138089: psmx2_ep_open (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) > ==236074== by 0x1B81A6A1: fi_endpoint (fi_endpoint.h:155) > ==236074== by 0x1B81A6A1: MPID_nem_ofi_init (ofi_init.c:377) > ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) > ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) > ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) > ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) > ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) > ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) > ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) > ==236074== by 0x40219D: main (one.c:335) > ==236074== > ==236074== Conditional jump or move depends on uninitialised value(s) > ==236074== at 0x2183B69A: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x218371EC: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x21837077: hfi_get_port_lid (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x217F88C8: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x217FDC29: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x217FCA91: psm2_ep_open (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x2112FB19: psmx2_trx_ctxt_alloc (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) > ==236074== by 0x21138089: psmx2_ep_open (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) > ==236074== by 0x1B81A6A1: fi_endpoint (fi_endpoint.h:155) > ==236074== by 0x1B81A6A1: MPID_nem_ofi_init (ofi_init.c:377) > ==236074== by 0x1B7B5C55: ??? 
(mpid_nem_init.c:231) > ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) > ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) > ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) > ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) > ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) > ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) > ==236074== by 0x40219D: main (one.c:335) > ==236074== > ==236074== Conditional jump or move depends on uninitialised value(s) > ==236074== at 0x2183B7B8: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x218371EC: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x21837077: hfi_get_port_lid (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x217F88C8: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x217FDC29: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x217FCA91: psm2_ep_open (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x2112FB19: psmx2_trx_ctxt_alloc (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) > ==236074== by 0x21138089: psmx2_ep_open (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) > ==236074== by 0x1B81A6A1: fi_endpoint (fi_endpoint.h:155) > ==236074== by 0x1B81A6A1: MPID_nem_ofi_init (ofi_init.c:377) > ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) > ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) > ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) > ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) > ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) > ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) > ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) > ==236074== by 0x40219D: main (one.c:335) > ==236074== > ==236074== Conditional jump or move depends on uninitialised value(s) > ==236074== at 0x2183B69A: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x218371EC: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x21837077: hfi_get_port_lid (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x217F916B: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x217FDC29: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x217FCA91: psm2_ep_open (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x2112FB19: psmx2_trx_ctxt_alloc (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) > ==236074== by 0x21138089: psmx2_ep_open (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) > ==236074== by 0x1B81A6A1: fi_endpoint (fi_endpoint.h:155) > ==236074== by 0x1B81A6A1: MPID_nem_ofi_init (ofi_init.c:377) > ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) > ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) > ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) > ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) > ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) > ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) > ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) > ==236074== by 0x40219D: main (one.c:335) > ==236074== > ==236074== Conditional jump or move depends on uninitialised value(s) > ==236074== at 0x2183B7B8: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x218371EC: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x21837077: hfi_get_port_lid (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x217F916B: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x217FDC29: ??? 
(in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x217FCA91: psm2_ep_open (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x2112FB19: psmx2_trx_ctxt_alloc (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) > ==236074== by 0x21138089: psmx2_ep_open (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) > ==236074== by 0x1B81A6A1: fi_endpoint (fi_endpoint.h:155) > ==236074== by 0x1B81A6A1: MPID_nem_ofi_init (ofi_init.c:377) > ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) > ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) > ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) > ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) > ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) > ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) > ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) > ==236074== by 0x40219D: main (one.c:335) > ==236074== > ==236074== Conditional jump or move depends on uninitialised value(s) > ==236074== at 0x1B1DA260: __I_MPI___intel_sse2_strncmp (in /opt/intel/compilers_and_libraries_2018.2.199/linux/mpi/intel64/lib/libmpifort.so.12.0) > ==236074== by 0x1B8CFBA1: ??? (simple_pmi.c:2376) > ==236074== by 0x1B8CBDAD: PMIi_InitIfSingleton (simple_pmi.c:2883) > ==236074== by 0x1B8CBDAD: iPMI_KVS_Get (simple_pmi.c:751) > ==236074== by 0x1B7CCC1E: ??? (mpidi_pg.c:949) > ==236074== by 0x1B817EAA: MPID_nem_ofi_post_init (ofi_init.c:1736) > ==236074== by 0x1B7B3575: MPID_nem_init_post (mpid_nem_init.c:1421) > ==236074== by 0x1B5806E3: MPIDI_CH3_Init (ch3_init.c:146) > ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) > ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) > ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) > ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) > ==236074== by 0x40219D: main (one.c:335) > ==236074== > ==236074== Conditional jump or move depends on uninitialised value(s) > ==236074== at 0x1B1DA383: __I_MPI___intel_sse2_strncmp (in /opt/intel/compilers_and_libraries_2018.2.199/linux/mpi/intel64/lib/libmpifort.so.12.0) > ==236074== by 0x1B8CFBA1: ??? (simple_pmi.c:2376) > ==236074== by 0x1B8CBDAD: PMIi_InitIfSingleton (simple_pmi.c:2883) > ==236074== by 0x1B8CBDAD: iPMI_KVS_Get (simple_pmi.c:751) > ==236074== by 0x1B7CCC1E: ??? 
(mpidi_pg.c:949) > ==236074== by 0x1B817EAA: MPID_nem_ofi_post_init (ofi_init.c:1736) > ==236074== by 0x1B7B3575: MPID_nem_init_post (mpid_nem_init.c:1421) > ==236074== by 0x1B5806E3: MPIDI_CH3_Init (ch3_init.c:146) > ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) > ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) > ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) > ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) > ==236074== by 0x40219D: main (one.c:335) > ==236074== > ==236074== Conditional jump or move depends on uninitialised value(s) > ==236074== at 0x1E48032E: __intel_sse4_strcpy (in /opt/intel/compilers_and_libraries_2018.2.199/linux/compiler/lib/intel64_lin/libirc.so) > ==236074== by 0x51FD8BE: PetscStrcpy (str.c:354) > ==236074== by 0x51FD7A3: PetscStrallocpy (str.c:188) > ==236074== by 0x52A39CE: PetscEventRegLogRegister (eventlog.c:313) > ==236074== by 0x527D89A: PetscLogEventRegister (plog.c:693) > ==236074== by 0x6A56A20: PCBDDCInitializePackage (bddc.c:3115) > ==236074== by 0x6E1A515: PCInitializePackage (dlregisksp.c:59) > ==236074== by 0x6DB1A86: PCCreate (precon.c:382) > ==236074== by 0x6E05167: KSPGetPC (itfunc.c:1837) > ==236074== by 0x6E0FC5C: KSPSetDM (iterativ.c:1150) > ==236074== by 0x6FDD27B: SNESSetDM (snes.c:5402) > ==236074== by 0x70B85F7: TSGetSNES (ts.c:2914) > ==236074== by 0x70BE430: TSSetDM (ts.c:4949) > ==236074== by 0x402496: main (one.c:378) > ==236074== > ==236074== Conditional jump or move depends on uninitialised value(s) > ==236074== at 0x1E4782BA: __intel_ssse3_strncpy (in /opt/intel/compilers_and_libraries_2018.2.199/linux/compiler/lib/intel64_lin/libirc.so) > ==236074== by 0x51FFD24: PetscStrncpy (str.c:392) > ==236074== by 0x51FEB03: PetscStrreplace (str.c:1142) > ==236074== by 0x52C9958: PetscViewerFileSetName (filev.c:659) > ==236074== by 0x52B743B: PetscViewerVTKOpen (vtkv.c:279) > ==236074== by 0x70C76E6: TSMonitorSolutionVTK (ts.c:5580) > ==236074== by 0x40313C: FormFunction (one.c:120) > ==236074== by 0x7066531: TSComputeIFunction_DMDA (dmdats.c:82) > ==236074== by 0x70BA5EF: TSComputeIFunction (ts.c:857) > ==236074== by 0x711E2DC: SNESTSFormFunction_BDF (bdf.c:368) > ==236074== by 0x70C6E46: SNESTSFormFunction (ts.c:5014) > ==236074== by 0x6FDC8A6: SNESComputeFunction (snes.c:2383) > ==236074== by 0x7023556: SNESSolve_NEWTONTR (tr.c:297) > ==236074== by 0x6FD160F: SNESSolve (snes.c:4569) > ==236074== by 0x711917E: PetscMemcpy (bdf.c:223) > ==236074== by 0x711917E: PetscCitationsRegister (petscsys.h:2689) > ==236074== by 0x711917E: TSStep_BDF.A (bdf.c:265) > ==236074== by 0x70C363A: TSStep (ts.c:3757) > ==236074== by 0x70C1999: TSSolve (ts.c:4154) > ==236074== by 0x402594: main (one.c:391) > ==236074== > ==236074== Conditional jump or move depends on uninitialised value(s) > ==236074== at 0x1E4782BA: __intel_ssse3_strncpy (in /opt/intel/compilers_and_libraries_2018.2.199/linux/compiler/lib/intel64_lin/libirc.so) > ==236074== by 0x51FFD24: PetscStrncpy (str.c:392) > ==236074== by 0x51FEB03: PetscStrreplace (str.c:1142) > ==236074== by 0x5224E4B: PetscFOpen (mpiuopen.c:52) > ==236074== by 0x63A074B: DMDAVTKWriteAll_VTS.A (grvtk.c:72) > ==236074== by 0x639A589: DMDAVTKWriteAll (grvtk.c:545) > ==236074== by 0x52B66F3: PetscViewerFlush_VTK (vtkv.c:100) > ==236074== by 0x52CFAAE: PetscViewerFlush (flush.c:26) > ==236074== by 0x52CEA95: PetscViewerDestroy (view.c:113) > ==236074== by 0x70C7717: TSMonitorSolutionVTK (ts.c:5582) > ==236074== by 0x40313C: FormFunction (one.c:120) > ==236074== by 0x7066531: 
TSComputeIFunction_DMDA (dmdats.c:82) > ==236074== by 0x70BA5EF: TSComputeIFunction (ts.c:857) > ==236074== by 0x711E2DC: SNESTSFormFunction_BDF (bdf.c:368) > ==236074== by 0x70C6E46: SNESTSFormFunction (ts.c:5014) > ==236074== by 0x6FDC8A6: SNESComputeFunction (snes.c:2383) > ==236074== by 0x7023556: SNESSolve_NEWTONTR (tr.c:297) > ==236074== by 0x6FD160F: SNESSolve (snes.c:4569) > ==236074== by 0x711917E: PetscMemcpy (bdf.c:223) > ==236074== by 0x711917E: PetscCitationsRegister (petscsys.h:2689) > ==236074== by 0x711917E: TSStep_BDF.A (bdf.c:265) > ==236074== by 0x70C363A: TSStep (ts.c:3757) > ==236074== > ==236074== Conditional jump or move depends on uninitialised value(s) > ==236074== at 0x5F10977: MatFDColoringSetUpBlocked_AIJ_Private (fdaij.c:146) > ==236074== by 0x5F10977: MatFDColoringSetUp_SeqXAIJ.A (fdaij.c:284) > ==236074== by 0x585892A: MatFDColoringSetUp (fdmatrix.c:242) > ==236074== by 0x6FE5037: SNESComputeJacobianDefaultColor (snesj2.c:79) > ==236074== by 0x6FC8E4E: SNESComputeJacobian (snes.c:2717) > ==236074== by 0x70236E7: SNESSolve_NEWTONTR (tr.c:324) > ==236074== by 0x6FD160F: SNESSolve (snes.c:4569) > ==236074== by 0x711917E: PetscMemcpy (bdf.c:223) > ==236074== by 0x711917E: PetscCitationsRegister (petscsys.h:2689) > ==236074== by 0x711917E: TSStep_BDF.A (bdf.c:265) > ==236074== by 0x70C363A: TSStep (ts.c:3757) > ==236074== by 0x70C1999: TSSolve (ts.c:4154) > ==236074== by 0x402594: main (one.c:391) > ==236074== > ==236074== Invalid write of size 4 > ==236074== at 0x5F10983: MatFDColoringSetUpBlocked_AIJ_Private (fdaij.c:150) > ==236074== by 0x5F10983: MatFDColoringSetUp_SeqXAIJ.A (fdaij.c:284) > ==236074== by 0x585892A: MatFDColoringSetUp (fdmatrix.c:242) > ==236074== by 0x6FE5037: SNESComputeJacobianDefaultColor (snesj2.c:79) > ==236074== by 0x6FC8E4E: SNESComputeJacobian (snes.c:2717) > ==236074== by 0x70236E7: SNESSolve_NEWTONTR (tr.c:324) > ==236074== by 0x6FD160F: SNESSolve (snes.c:4569) > ==236074== by 0x711917E: PetscMemcpy (bdf.c:223) > ==236074== by 0x711917E: PetscCitationsRegister (petscsys.h:2689) > ==236074== by 0x711917E: TSStep_BDF.A (bdf.c:265) > ==236074== by 0x70C363A: TSStep (ts.c:3757) > ==236074== by 0x70C1999: TSSolve (ts.c:4154) > ==236074== by 0x402594: main (one.c:391) > ==236074== Address 0x3a94fa80 is 0 bytes after a block of size 73,960,000 alloc'd > ==236074== at 0x4C2C480: memalign (vg_replace_malloc.c:909) > ==236074== by 0x522FFE2: PetscMallocAlign (mal.c:52) > ==236074== by 0x52305F9: PetscMallocA (mal.c:418) > ==236074== by 0x5F10778: MatFDColoringSetUpBlocked_AIJ_Private (fdaij.c:125) > ==236074== by 0x5F10778: MatFDColoringSetUp_SeqXAIJ.A (fdaij.c:284) > ==236074== by 0x585892A: MatFDColoringSetUp (fdmatrix.c:242) > ==236074== by 0x6FE5037: SNESComputeJacobianDefaultColor (snesj2.c:79) > ==236074== by 0x6FC8E4E: SNESComputeJacobian (snes.c:2717) > ==236074== by 0x70236E7: SNESSolve_NEWTONTR (tr.c:324) > ==236074== by 0x6FD160F: SNESSolve (snes.c:4569) > ==236074== by 0x711917E: PetscMemcpy (bdf.c:223) > ==236074== by 0x711917E: PetscCitationsRegister (petscsys.h:2689) > ==236074== by 0x711917E: TSStep_BDF.A (bdf.c:265) > ==236074== by 0x70C363A: TSStep (ts.c:3757) > ==236074== by 0x70C1999: TSSolve (ts.c:4154) > ==236074== by 0x402594: main (one.c:391) > ==236074== > ==236074== Invalid write of size 8 > ==236074== at 0x5F10991: MatFDColoringSetUpBlocked_AIJ_Private (fdaij.c:151) > ==236074== by 0x5F10991: MatFDColoringSetUp_SeqXAIJ.A (fdaij.c:284) > ==236074== by 0x585892A: MatFDColoringSetUp (fdmatrix.c:242) > ==236074== by 
0x6FE5037: SNESComputeJacobianDefaultColor (snesj2.c:79) > ==236074== by 0x6FC8E4E: SNESComputeJacobian (snes.c:2717) > ==236074== by 0x70236E7: SNESSolve_NEWTONTR (tr.c:324) > ==236074== by 0x6FD160F: SNESSolve (snes.c:4569) > ==236074== by 0x711917E: PetscMemcpy (bdf.c:223) > ==236074== by 0x711917E: PetscCitationsRegister (petscsys.h:2689) > ==236074== by 0x711917E: TSStep_BDF.A (bdf.c:265) > ==236074== by 0x70C363A: TSStep (ts.c:3757) > ==236074== by 0x70C1999: TSSolve (ts.c:4154) > ==236074== by 0x402594: main (one.c:391) > ==236074== Address 0x3a94fa88 is 8 bytes after a block of size 73,960,000 alloc'd > ==236074== at 0x4C2C480: memalign (vg_replace_malloc.c:909) > ==236074== by 0x522FFE2: PetscMallocAlign (mal.c:52) > ==236074== by 0x52305F9: PetscMallocA (mal.c:418) > ==236074== by 0x5F10778: MatFDColoringSetUpBlocked_AIJ_Private (fdaij.c:125) > ==236074== by 0x5F10778: MatFDColoringSetUp_SeqXAIJ.A (fdaij.c:284) > ==236074== by 0x585892A: MatFDColoringSetUp (fdmatrix.c:242) > ==236074== by 0x6FE5037: SNESComputeJacobianDefaultColor (snesj2.c:79) > ==236074== by 0x6FC8E4E: SNESComputeJacobian (snes.c:2717) > ==236074== by 0x70236E7: SNESSolve_NEWTONTR (tr.c:324) > ==236074== by 0x6FD160F: SNESSolve (snes.c:4569) > ==236074== by 0x711917E: PetscMemcpy (bdf.c:223) > ==236074== by 0x711917E: PetscCitationsRegister (petscsys.h:2689) > ==236074== by 0x711917E: TSStep_BDF.A (bdf.c:265) > ==236074== by 0x70C363A: TSStep (ts.c:3757) > ==236074== by 0x70C1999: TSSolve (ts.c:4154) > ==236074== by 0x402594: main (one.c:391) > ==236074== > > > Sent from Mail for Windows -------------- next part -------------- An HTML attachment was scrubbed... URL: From balay at mcs.anl.gov Wed Oct 26 13:26:40 2022 From: balay at mcs.anl.gov (Satish Balay) Date: Wed, 26 Oct 2022 13:26:40 -0500 (CDT) Subject: [petsc-users] petsc-3.18.1 now available Message-ID: <0f7d32ff-61bb-e988-5f58-cf626040078a@mcs.anl.gov> Dear PETSc users, The patch release petsc-3.18.1 is now available for download. https://petsc.org/release/install/download/ Satish From tlanyan at hotmail.com Wed Oct 26 21:59:42 2022 From: tlanyan at hotmail.com (=?gb2312?B?z/635SC6zg==?=) Date: Thu, 27 Oct 2022 02:59:42 +0000 Subject: [petsc-users] Questions about matrix permutation Message-ID: Dear developers, To avoid zero pivot error in IC/ILU preconditioners, I plan to permute matrix before solving the linear system. I referred example ex10(https://petsc.org/main/src/ksp/ksp/tutorials/ex10.c.html) and ex18(https://petsc.org/main/src/ksp/ksp/tutorials/ex18.c.html), and was confused that why vector b is permuted by column permutation index set: // in ex18.c 174: if (permute) { 175: Mat Aperm; 176: MatGetOrdering(A, ordering, &rowperm, &colperm); 177: MatPermute(A, rowperm, colperm, &Aperm); 178: VecPermute(b, colperm, PETSC_FALSE); 179: MatDestroy(&A); 180: A = Aperm; /* Replace original operator with permuted version */ 181: } As far as I understand, vector b should be only affected by permuting rows in matrix A, instead of column permutation. How to understand that permuting vector b with column permutation index set of matrix A? Another question is why MatReorderForNonzeroDiagonal function only works for type MATSEQAIJ? If it works for other matrix types, then this function will meet my requirement. Thanks, Xiaofeng -------------- next part -------------- An HTML attachment was scrubbed... 
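One observation on the colperm-versus-rowperm question above: for the symmetric orderings that MatGetOrdering typically produces (natural, RCM, nested dissection), the row and column permutations it returns are identical, so permuting b with colperm as in ex18 coincides with permuting it with rowperm. When the two differ, a quick way to settle which one belongs on the right-hand side is to solve a small system both with and without the permutation and compare the solutions. A sketch (variable names hypothetical, not taken from the examples; A, b and the solves themselves are assumed to exist already):

    IS  rowperm, colperm;
    Mat Aperm;
    Vec bperm;
    PetscCall(MatGetOrdering(A, MATORDERINGRCM, &rowperm, &colperm));
    PetscCall(MatPermute(A, rowperm, colperm, &Aperm));
    PetscCall(VecDuplicate(b, &bperm));
    PetscCall(VecCopy(b, bperm));
    PetscCall(VecPermute(bperm, rowperm, PETSC_FALSE)); /* try rowperm; switch to colperm if the check below fails */
    /* solve Aperm * xperm = bperm, then undo the permutation on the solution: */
    PetscCall(VecPermute(xperm, colperm, PETSC_TRUE));  /* ...and compare xperm with the solution of A * x = b */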
URL: From mfadams at lbl.gov Thu Oct 27 08:55:41 2022 From: mfadams at lbl.gov (Mark Adams) Date: Thu, 27 Oct 2022 09:55:41 -0400 Subject: [petsc-users] Questions about matrix permutation In-Reply-To: References: Message-ID: On Wed, Oct 26, 2022 at 11:00 PM ?? ? wrote: > Dear developers, > > To avoid zero pivot error in IC/ILU preconditioners, I plan to permute > matrix before solving the linear system. I referred example ex10( > https://petsc.org/main/src/ksp/ksp/tutorials/ex10.c.html) and ex18( > https://petsc.org/main/src/ksp/ksp/tutorials/ex18.c.html), and was > confused that why vector b is permuted by column permutation index set: > > // in ex18.c > > 174: if (permute) {175: Mat Aperm;176: MatGetOrdering (A, ordering, &rowperm, &colperm);177: MatPermute (A, rowperm, colperm, &Aperm);178: VecPermute (b, colperm, PETSC_FALSE );179: MatDestroy (&A);180: A = Aperm; /* Replace original operator with permuted version */181: } > > As far as I understand, vector b should be only affected by permuting > rows in matrix A, instead of column permutation. How to understand that > permuting vector b with column permutation index set of matrix A? > > I get confused by this also. I just test it. Try one and if thats wrong, its the other! As I recall a permutation vector has the place where the equation sould _go_ Another question is why MatReorderForNonzeroDiagonal function only works > for type MATSEQAIJ? If it works for other matrix types, then this function > will meet my requirement. > > It has not been implemented. You could do this, clone the MATSEQAIJ one for the matrix type that you want. See https://petsc.org/release/developers/contributing/ for instructions on contributing to PETSc. Thanks, Mark > > Thanks, > > Xiaofeng > -------------- next part -------------- An HTML attachment was scrubbed... URL: From yann.jobic at univ-amu.fr Thu Oct 27 10:32:44 2022 From: yann.jobic at univ-amu.fr (Yann Jobic) Date: Thu, 27 Oct 2022 17:32:44 +0200 Subject: [petsc-users] basis/basisDer/Jac in FE Message-ID: Hello, I'm trying to understand how to use the lowlevel Petsc Finit Element Framework. I've got few questions about it. 1) I'm checking the determinant of the Jacobian transformation from real space to the parametric one. The source code is in : https://petsc.org/main/src/dm/impls/plex/plexgeometry.c.html#DMPlexComputeTriangleGeometry_Internal I don't quite understand the 0.5 factor in front of it. Geometrically, it should be the ratio of the area/volume of the reel element over the referenced one. It's not the case here no ? It seems that, for other elements, this factor still applies. What this factor represent ? 2) I only checked triangle elements here. The ordering of the test functions and its derivatives is different it seems, with also the 0.5 factor. Am i mistaken ? I could not find any informations about it in the user-doc. Maybe it's written somewhere else ? Many Thanks, Best regards, Yann From matteo.semplice at uninsubria.it Thu Oct 27 10:57:20 2022 From: matteo.semplice at uninsubria.it (Semplice Matteo) Date: Thu, 27 Oct 2022 15:57:20 +0000 Subject: [petsc-users] locate DMSwarm particles with respect to a background DMDA mesh Message-ID: Dear Petsc developers, I am trying to use a DMSwarm to locate a cloud of points with respect to a background mesh. 
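The setup I am using is, in outline, the following (a sketch only; the exact code, including the field registration, is in the attached file):

    DM sw;
    PetscCall(DMCreate(PETSC_COMM_WORLD, &sw));
    PetscCall(DMSetType(sw, DMSWARM));
    PetscCall(DMSetDimension(sw, 2));
    PetscCall(DMSwarmSetType(sw, DMSWARM_PIC));
    PetscCall(DMSwarmSetCellDM(sw, da));            /* da is the background DMDA */
    /* ... register any extra particle fields here ... */
    PetscCall(DMSwarmFinalizeFieldRegister(sw));
    PetscCall(DMSwarmSetLocalSizes(sw, Npart, 4));
    /* fill the DMSwarmPICField_coor field with the particle coordinates, then: */
    PetscCall(DMSwarmMigrate(sw, PETSC_TRUE));      /* this is where most particles disappear */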
In the real application the points will be loaded from disk, but I have created a small demo in which * each processor creates Npart particles, all within the domain covered by the mesh, but not all in the local portion of the mesh * migrate the particles After migration most particles are not any more in the DMSwarm (how many and which ones seems to depend on the number of cpus, but it never happens that all particle survive the migration process). I am clearly missing some step, since I'd expect that a DMDA would be able to locate particles without the need to go through a DMShell as it is done in src/dm/tutorials/swarm_ex3.c.html I attach my demo code. Could someone give me a hint? Best Matteo -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: dmSwarmDemo.cpp Type: text/x-c++src Size: 4868 bytes Desc: dmSwarmDemo.cpp URL: From zhaog6 at lsec.cc.ac.cn Thu Oct 27 10:59:54 2022 From: zhaog6 at lsec.cc.ac.cn (=?UTF-8?B?6LW15Yia?=) Date: Thu, 27 Oct 2022 23:59:54 +0800 (GMT+08:00) Subject: [petsc-users] An issue of extraction of factorization matrices in sparse direct solver In-Reply-To: References: <7cdb54af.1c628.18403a286ea.Coremail.zhaog6@lsec.cc.ac.cn> Message-ID: <46677001.5d8e.1841a2be408.Coremail.zhaog6@lsec.cc.ac.cn> Thank you Sherry for the triangular matrices extraction ways, it's very useful, I'll try to do this in SuperLU_DIST. In addition, Matthew, is it possible for PETSc to write interfaces that support printing `L` and `U` for SuperLU_DIST? Because it is very useful in some industrial application scenarios (store `L` and `U` in advance). Best Regards, Gang -----????----- ???:"Xiaoye S. Li" ????:2022-10-25 05:47:15 (???) ???: "Matthew Knepley" ??: "??" , petsc-users at mcs.anl.gov ??: Re: [petsc-users] An issue of extraction of factorization matrices in sparse direct solver There are some utility routines for printing L\U in superlu_dist: SRC/dutil_dist.c You can output the L factor to a file with the triplet format by using https://github.com/xiaoyeli/superlu_dist/blob/324d65fced6ce8abf0eb900223cba0207d538db7/SRC/dutil_dist.c#L675 but use line 755 instead of line 753. You can convert the L factor to CSR or triplet using https://github.com/xiaoyeli/superlu_dist/blob/324d65fced6ce8abf0eb900223cba0207d538db7/SRC/dutil_dist.c#L815 https://github.com/xiaoyeli/superlu_dist/blob/324d65fced6ce8abf0eb900223cba0207d538db7/SRC/dutil_dist.c#L1075 but need to make sure you only use 1 MPI to call superlu_dist You can modify https://github.com/xiaoyeli/superlu_dist/blob/324d65fced6ce8abf0eb900223cba0207d538db7/SRC/dutil_dist.c#L1229 to generate CSR/triplet for the U factor as well. Sherry Li On Sun, Oct 23, 2022 at 3:38 AM Matthew Knepley wrote: On Sun, Oct 23, 2022 at 2:58 AM ?? wrote: Dear developers, I have another question. How can I get the L and U matrices and store them in a file when I call SuperLU through PETSc? Thanks. SuperLU stores these matrices in its own format. If you want to do I/O with them, you would probably have to extract them from the Petsc Mat and call SuperLU I/O functions, if they exist. Thanks, Matt Best Regards, Gang -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... 
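On the PETSc side, the object that wraps the SuperLU_DIST factorization can be obtained as sketched below; the factors themselves live in SuperLU_DIST's internal data structures, so writing L and U to a file still goes through the SuperLU_DIST utility routines listed above (a sketch, assuming a KSP already configured with an LU preconditioner and -pc_factor_mat_solver_type superlu_dist):

    /* given: KSP ksp set up with PCLU and the superlu_dist solver type */
    PC  pc;
    Mat F;
    PetscCall(KSPGetPC(ksp, &pc));
    PetscCall(PCFactorGetMatrix(pc, &F));  /* F holds the SuperLU_DIST factorization once the solve is set up */
    /* a future ASCII viewer for this matrix type would be the natural place to print L and U */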
URL: From mfadams at lbl.gov Thu Oct 27 11:23:17 2022 From: mfadams at lbl.gov (Mark Adams) Date: Thu, 27 Oct 2022 12:23:17 -0400 Subject: [petsc-users] basis/basisDer/Jac in FE In-Reply-To: References: Message-ID: Area of a unit triangle is (1x1)/2 I would guess. On Thu, Oct 27, 2022 at 11:32 AM Yann Jobic wrote: > Hello, > > I'm trying to understand how to use the lowlevel Petsc Finit Element > Framework. I've got few questions about it. > > 1) I'm checking the determinant of the Jacobian transformation from real > space to the parametric one. The source code is in : > > https://petsc.org/main/src/dm/impls/plex/plexgeometry.c.html#DMPlexComputeTriangleGeometry_Internal > I don't quite understand the 0.5 factor in front of it. > Geometrically, it should be the ratio of the area/volume of the reel > element over the referenced one. It's not the case here no ? > It seems that, for other elements, this factor still applies. > What this factor represent ? > > 2) I only checked triangle elements here. The ordering of the test > functions and its derivatives is different it seems, with also the 0.5 > factor. Am i mistaken ? > > I could not find any informations about it in the user-doc. Maybe it's > written somewhere else ? > > Many Thanks, > > Best regards, > > Yann > -------------- next part -------------- An HTML attachment was scrubbed... URL: From mfadams at lbl.gov Thu Oct 27 11:35:28 2022 From: mfadams at lbl.gov (Mark Adams) Date: Thu, 27 Oct 2022 12:35:28 -0400 Subject: [petsc-users] locate DMSwarm particles with respect to a background DMDA mesh In-Reply-To: References: Message-ID: I have the same problem and it is being worked on. Joe: Lets add Matteo to our thread on this so we can all test it when you have something. Thanks, Mark On Thu, Oct 27, 2022 at 11:57 AM Semplice Matteo < matteo.semplice at uninsubria.it> wrote: > Dear Petsc developers, > I am trying to use a DMSwarm to locate a cloud of points with respect > to a background mesh. In the real application the points will be loaded > from disk, but I have created a small demo in which > > - each processor creates Npart particles, all within the domain > covered by the mesh, but not all in the local portion of the mesh > - migrate the particles > > After migration most particles are not any more in the DMSwarm (how many > and which ones seems to depend on the number of cpus, but it never happens > that all particle survive the migration process). > > I am clearly missing some step, since I'd expect that a DMDA would be able > to locate particles without the need to go through a DMShell as it is done > in src/dm/tutorials/swarm_ex3.c.html > > > I attach my demo code. > > Could someone give me a hint? > > Best > Matteo > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From yann.jobic at univ-amu.fr Thu Oct 27 12:20:37 2022 From: yann.jobic at univ-amu.fr (Yann Jobic) Date: Thu, 27 Oct 2022 19:20:37 +0200 Subject: [petsc-users] basis/basisDer/Jac in FE In-Reply-To: References: Message-ID: <91344228-7142-9654-b9b8-268043eb9a30@univ-amu.fr> There is this factor also in front of the rectangle, which should 1/4 for a unit rectangle of dim 2, but it's also 0.5. Thanks, Yann Le 10/27/2022 ? 6:23 PM, Mark Adams a ?crit?: > Area of a unit?triangle?is (1x1)/2 I would guess. > > On Thu, Oct 27, 2022 at 11:32 AM Yann Jobic > wrote: > > Hello, > > I'm trying to understand how to use the lowlevel Petsc Finit Element > Framework. I've got few questions about it. 
> > 1) I'm checking the determinant of the Jacobian transformation from > real > space to the parametric one. The source code is in : > https://petsc.org/main/src/dm/impls/plex/plexgeometry.c.html#DMPlexComputeTriangleGeometry_Internal > I don't quite understand the 0.5 factor in front of it. > Geometrically, it should be the ratio of the area/volume of the reel > element over the referenced one. It's not the case here no ? > It seems that, for other elements, this factor still applies. > What this factor represent ? > > 2) I only checked triangle elements here. The ordering of the test > functions and its derivatives is different it seems, with also the 0.5 > factor. Am i mistaken ? > > I could not find any informations about it in the user-doc. Maybe it's > written somewhere else ? > > Many Thanks, > > Best regards, > > Yann > From knepley at gmail.com Thu Oct 27 12:46:54 2022 From: knepley at gmail.com (Matthew Knepley) Date: Thu, 27 Oct 2022 13:46:54 -0400 Subject: [petsc-users] basis/basisDer/Jac in FE In-Reply-To: <91344228-7142-9654-b9b8-268043eb9a30@univ-amu.fr> References: <91344228-7142-9654-b9b8-268043eb9a30@univ-amu.fr> Message-ID: On Thu, Oct 27, 2022 at 1:20 PM Yann Jobic wrote: > There is this factor also in front of the rectangle, which should 1/4 > for a unit rectangle of dim 2, but it's also 0.5. > Hi Yann, The reference element in Petsc is [-1, -1] to [1, 1]. This matches FEniCS/Firedrake and also makes it easier to use orthogonal polynomials. Thanks, Matt > Thanks, > Yann > > > Le 10/27/2022 ? 6:23 PM, Mark Adams a ?crit : > > Area of a unit triangle is (1x1)/2 I would guess. > > > > On Thu, Oct 27, 2022 at 11:32 AM Yann Jobic > > wrote: > > > > Hello, > > > > I'm trying to understand how to use the lowlevel Petsc Finit Element > > Framework. I've got few questions about it. > > > > 1) I'm checking the determinant of the Jacobian transformation from > > real > > space to the parametric one. The source code is in : > > > https://petsc.org/main/src/dm/impls/plex/plexgeometry.c.html#DMPlexComputeTriangleGeometry_Internal > < > https://petsc.org/main/src/dm/impls/plex/plexgeometry.c.html#DMPlexComputeTriangleGeometry_Internal > > > > I don't quite understand the 0.5 factor in front of it. > > Geometrically, it should be the ratio of the area/volume of the reel > > element over the referenced one. It's not the case here no ? > > It seems that, for other elements, this factor still applies. > > What this factor represent ? > > > > 2) I only checked triangle elements here. The ordering of the test > > functions and its derivatives is different it seems, with also the > 0.5 > > factor. Am i mistaken ? > > > > I could not find any informations about it in the user-doc. Maybe > it's > > written somewhere else ? > > > > Many Thanks, > > > > Best regards, > > > > Yann > > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... 
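A worked check of the factor, using the reference element above (a sketch, not code from the PETSc source): the reference triangle has vertices (-1,-1), (1,-1), (-1,1), so its axis-aligned edges have length 2 and its area is 2. The affine map onto a real triangle v0, v1, v2 therefore has Jacobian J = 0.5 * [ v1-v0 | v2-v0 ], which is where the 0.5 in front of each column comes from, and detJ = area(real element) / area(reference element). For quadrilaterals the reference element is [-1,1]^2 with area 4; the per-column factor is still 0.5 and the determinant then carries the 1/4.

    /* Unit right triangle (0,0), (1,0), (0,1), area 1/2: */
    PetscReal v0[2] = {0.0, 0.0}, v1[2] = {1.0, 0.0}, v2[2] = {0.0, 1.0};
    PetscReal J[4], detJ;
    J[0] = 0.5 * (v1[0] - v0[0]);  J[1] = 0.5 * (v2[0] - v0[0]);
    J[2] = 0.5 * (v1[1] - v0[1]);  J[3] = 0.5 * (v2[1] - v0[1]);
    detJ = J[0] * J[3] - J[1] * J[2];   /* = 0.25 = (1/2) / 2 = area(real) / area(reference) */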
URL: From knepley at gmail.com Thu Oct 27 12:50:17 2022 From: knepley at gmail.com (Matthew Knepley) Date: Thu, 27 Oct 2022 13:50:17 -0400 Subject: [petsc-users] locate DMSwarm particles with respect to a background DMDA mesh In-Reply-To: References: Message-ID: On Thu, Oct 27, 2022 at 11:57 AM Semplice Matteo < matteo.semplice at uninsubria.it> wrote: > Dear Petsc developers, > I am trying to use a DMSwarm to locate a cloud of points with respect > to a background mesh. In the real application the points will be loaded > from disk, but I have created a small demo in which > > - each processor creates Npart particles, all within the domain > covered by the mesh, but not all in the local portion of the mesh > - migrate the particles > > After migration most particles are not any more in the DMSwarm (how many > and which ones seems to depend on the number of cpus, but it never happens > that all particle survive the migration process). > > I am clearly missing some step, since I'd expect that a DMDA would be able > to locate particles without the need to go through a DMShell as it is done > in src/dm/tutorials/swarm_ex3.c.html > > > I attach my demo code. > > Could someone give me a hint? > I will look at the demo. It should work. There are some tests of this, like SNES ex63, but they use Plex instead of DMDA. Thanks, Matt > Best > Matteo > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Thu Oct 27 12:55:01 2022 From: knepley at gmail.com (Matthew Knepley) Date: Thu, 27 Oct 2022 13:55:01 -0400 Subject: [petsc-users] An issue of extraction of factorization matrices in sparse direct solver In-Reply-To: <46677001.5d8e.1841a2be408.Coremail.zhaog6@lsec.cc.ac.cn> References: <7cdb54af.1c628.18403a286ea.Coremail.zhaog6@lsec.cc.ac.cn> <46677001.5d8e.1841a2be408.Coremail.zhaog6@lsec.cc.ac.cn> Message-ID: On Thu, Oct 27, 2022 at 12:00 PM ?? wrote: > Thank you Sherry for the triangular matrices extraction ways, it's very > useful, I'll try to do this in SuperLU_DIST. > > In addition, Matthew, is it possible for PETSc to write interfaces that > support printing `L` and `U` for SuperLU_DIST? Because it is very useful in > some industrial application scenarios (store `L` and `U` in advance). > I think the way to do it would be to add support for this as the ASCII Mat Viewer for that type. It would be a good contribution I think. Thanks, Matt > Best Regards, > > Gang > > > -----????----- > *???:*"Xiaoye S. Li" > *????:*2022-10-25 05:47:15 (???) > *???:* "Matthew Knepley" > *??:* "??" , petsc-users at mcs.anl.gov > *??:* Re: [petsc-users] An issue of extraction of factorization matrices > in sparse direct solver > > There are some utility routines for printing L\U in superlu_dist: > SRC/dutil_dist.c > > 1. You can output the L factor to a file with the triplet format by > using > > https://github.com/xiaoyeli/superlu_dist/blob/324d65fced6ce8abf0eb900223cba0207d538db7/SRC/dutil_dist.c#L675 > but use line 755 instead of line 753. > 2. 
You can convert the L factor to CSR or triplet using > https://github.com/xiaoyeli/superlu_dist/blob/324d65fced6ce8abf0eb900223cba0207d538db7/SRC/dutil_dist.c#L815 > > https://github.com/xiaoyeli/superlu_dist/blob/324d65fced6ce8abf0eb900223cba0207d538db7/SRC/dutil_dist.c#L1075 > but need to make sure you only use 1 MPI to call superlu_dist > 3. You can modify > > https://github.com/xiaoyeli/superlu_dist/blob/324d65fced6ce8abf0eb900223cba0207d538db7/SRC/dutil_dist.c#L1229 > to generate CSR/triplet for the U factor as well. > > > Sherry Li > > On Sun, Oct 23, 2022 at 3:38 AM Matthew Knepley wrote: > >> On Sun, Oct 23, 2022 at 2:58 AM ?? wrote: >> >>> Dear developers, >>> >>> I have another question. How can I get the L and U matrices and store >>> them in a file when I call SuperLU through PETSc? Thanks. >> >> >> SuperLU stores these matrices in its own format. If you want to do I/O >> with them, you would probably have to >> extract them from the Petsc Mat and call SuperLU I/O functions, if they >> exist. >> >> Thanks, >> >> Matt >> >> >>> Best Regards, >>> Gang >> >> -- >> What most experimenters take for granted before they begin their >> experiments is infinitely more interesting than any results to which their >> experiments lead. >> -- Norbert Wiener >> >> https://www.cse.buffalo.edu/~knepley/ >> >> > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From stephan.koehler at math.tu-freiberg.de Fri Oct 28 03:27:47 2022 From: stephan.koehler at math.tu-freiberg.de (=?UTF-8?Q?Stephan_K=c3=b6hler?=) Date: Fri, 28 Oct 2022 10:27:47 +0200 Subject: [petsc-users] Report Bug TaoALMM class Message-ID: <4eec06f9-d534-7a02-9abe-6d1415f663f0@math.tu-freiberg.de> Dear PETSc/Tao team, it seems to be that there is a bug in the TaoALMM class: In the methods TaoALMMSubsolverObjective_Private and TaoALMMSubsolverObjectiveAndGradient_Private the vector where the function value for the augmented Lagrangian is evaluate is copied into the current solution, see, e.g., https://petsc.org/release/src/tao/constrained/impls/almm/almm.c.html line 672 or 682.? This causes subsolver routine to not converge if the line search for the subsolver rejects the step length 1. for some update.? In detail: Suppose the current iterate is xk and the current update is dxk. The line search evaluates the augmented Lagrangian now at (xk + dxk).? This causes that the value (xk + dxk) is copied in the current solution.? If the point (xk + dxk) is rejected, the line search should try the point (xk + alpha * dxk), where alpha < 1.? But due to the copying, what happens is that the point ((xk + dxk) + alpha * dxk) is evaluated, see, e.g., https://petsc.org/release/src/tao/linesearch/impls/armijo/armijo.c.html line 191. Best regards Stephan K?hler -- Stephan K?hler TU Bergakademie Freiberg Institut f?r numerische Mathematik und Optimierung Akademiestra?e 6 09599 Freiberg Geb?udeteil Mittelbau, Zimmer 2.07 Telefon: +49 (0)3731 39-3173 (B?ro) -------------- next part -------------- A non-text attachment was scrubbed... Name: OpenPGP_0xC9BF2C20DFE9F713.asc Type: application/pgp-keys Size: 758 bytes Desc: OpenPGP public key URL: -------------- next part -------------- A non-text attachment was scrubbed... 
Name: OpenPGP_signature Type: application/pgp-signature Size: 236 bytes Desc: OpenPGP digital signature URL: From mhyaqteen at sju.ac.kr Fri Oct 28 03:48:27 2022 From: mhyaqteen at sju.ac.kr (Mohammad Ali Yaqteen) Date: Fri, 28 Oct 2022 08:48:27 +0000 Subject: [petsc-users] PETSc Windows Installation Message-ID: Dear Sir, During the Installation of PETSc in windows, I installed Cygwin and the required libraries as mentioned on your website: [cid:image002.png at 01D8EAF4.F4004D90] However, when I install PETSc using the configure commands present on the petsc website: ./configure --with-cc=gcc --with-cxx=0 --with-fc=0 --download-f2cblaslapack --download-mpich it gives me the following error: [cid:image003.png at 01D8EAF5.7A5C0E10] I already installed OpenMPI using Cygwin installer but it still asks me to. When I configure without "-download-mpich" and run "make check" command, it gives me the following errors: [cid:image001.png at 01D8EAF1.6E65C190] Could you kindly look into this and help me with this? Your prompt response will highly be appreciated. Thank you! Mohammad Ali Researcher, Sejong University -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: image001.png Type: image/png Size: 13777 bytes Desc: image001.png URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: image002.png Type: image/png Size: 21436 bytes Desc: image002.png URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: image003.png Type: image/png Size: 8028 bytes Desc: image003.png URL: From knepley at gmail.com Fri Oct 28 08:30:31 2022 From: knepley at gmail.com (Matthew Knepley) Date: Fri, 28 Oct 2022 09:30:31 -0400 Subject: [petsc-users] PETSc Windows Installation In-Reply-To: References: Message-ID: On Fri, Oct 28, 2022 at 9:11 AM Mohammad Ali Yaqteen wrote: > Dear Sir, > > > > During the Installation of PETSc in windows, I installed Cygwin and the > required libraries as mentioned on your website: > > However, when I install PETSc using the configure commands present on the > petsc website: > > > > ./configure --with-cc=gcc --with-cxx=0 --with-fc=0 > --download-f2cblaslapack --download-mpich > > > > it gives me the following error: > > > > > > I already installed OpenMPI using Cygwin installer but it still asks me > to. When I configure without ??download-mpich? and run ?make check? > command, it gives me the following errors: > > > > > > Could you kindly look into this and help me with this? Your prompt > response will highly be appreciated. > The runs look fine. The test should not try to attach the debugger. Do you have that in the PETSC_OPTIONS env variable? Thanks, Matt > Thank you! > > Mohammad Ali > > Researcher, Sejong University > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: image001.png Type: image/png Size: 13777 bytes Desc: not available URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: image002.png Type: image/png Size: 21436 bytes Desc: not available URL: -------------- next part -------------- A non-text attachment was scrubbed... 
Name: image003.png Type: image/png Size: 8028 bytes Desc: not available URL: From bsmith at petsc.dev Fri Oct 28 15:24:44 2022 From: bsmith at petsc.dev (Barry Smith) Date: Fri, 28 Oct 2022 16:24:44 -0400 Subject: [petsc-users] Report Bug TaoALMM class In-Reply-To: <4eec06f9-d534-7a02-9abe-6d1415f663f0@math.tu-freiberg.de> References: <4eec06f9-d534-7a02-9abe-6d1415f663f0@math.tu-freiberg.de> Message-ID: Stephan, Thanks for your detailed report. Do you have a reproducing example? I am having trouble following the logic you indicate below. It is copying the P into auglag->P. Is auglag->P the "the current solution" you are referring to? Is it because of the line PetscCall(TaoSetSolution(auglag->subsolver, auglag->P))? So I am assuming that you mean the current solution of the auglag->subsolver Tao? This means that TaoALMMSubsolverObjective_Private() always sets the subsolver Tao current solution to the input P. Are you saying this is flawed logic in the implementation of the entire ALMM solver? Since the auglag->P gets overwritten for every TaoALMMSubsolverObjective_Private() with the new P passed in I don't see where ((xk + dxk) + alpha * dxk) would occur. Would it first have xk + dxk passed in for P and then the next time have xk + alpha * dxk passed in? Barry PetscErrorCode TaoALMMSubsolverObjective_Private(Tao tao, Vec P, PetscReal *Lval, void *ctx) { TAO_ALMM *auglag = (TAO_ALMM *)ctx; PetscFunctionBegin; PetscCall(VecCopy(P, auglag->P)); PetscCall((*auglag->sub_obj)(auglag->parent)); *Lval = auglag->Lval; PetscFunctionReturn(0); } > On Oct 28, 2022, at 4:27 AM, Stephan K?hler wrote: > > Dear PETSc/Tao team, > > it seems to be that there is a bug in the TaoALMM class: > > In the methods TaoALMMSubsolverObjective_Private and TaoALMMSubsolverObjectiveAndGradient_Private the vector where the function value for the augmented Lagrangian is evaluate > is copied into the current solution, see, e.g., https://petsc.org/release/src/tao/constrained/impls/almm/almm.c.html line 672 or 682. This causes subsolver routine to not converge if the line search for the subsolver rejects the step length 1. for some > update. In detail: > > Suppose the current iterate is xk and the current update is dxk. The line search evaluates the augmented Lagrangian now at (xk + dxk). This causes that the value (xk + dxk) is copied in the current solution. If the point (xk + dxk) is rejected, the line search should > try the point (xk + alpha * dxk), where alpha < 1. But due to the copying, what happens is that the point ((xk + dxk) + alpha * dxk) is evaluated, see, e.g., https://petsc.org/release/src/tao/linesearch/impls/armijo/armijo.c.html line 191. > > Best regards > Stephan K?hler > > -- > Stephan K?hler > TU Bergakademie Freiberg > Institut f?r numerische Mathematik und Optimierung > > Akademiestra?e 6 > 09599 Freiberg > Geb?udeteil Mittelbau, Zimmer 2.07 > > Telefon: +49 (0)3731 39-3173 (B?ro) > > From mhyaqteen at sju.ac.kr Fri Oct 28 22:39:31 2022 From: mhyaqteen at sju.ac.kr (Mohammad Ali Yaqteen) Date: Sat, 29 Oct 2022 03:39:31 +0000 Subject: [petsc-users] PETSc Windows Installation In-Reply-To: References: Message-ID: I haven?t accessed PETSC or given any command of my own. I was just installing by following the instructions. I don?t know why it is attaching the debugger. Although it says ?Possible error running C/C++ src/snes/tutorials/ex19 with 1 MPI process? which I think is indicating of missing of MPI! 
From: Matthew Knepley Sent: Friday, October 28, 2022 10:31 PM To: Mohammad Ali Yaqteen Cc: petsc-users at mcs.anl.gov Subject: Re: [petsc-users] PETSc Windows Installation On Fri, Oct 28, 2022 at 9:11 AM Mohammad Ali Yaqteen > wrote: Dear Sir, During the Installation of PETSc in windows, I installed Cygwin and the required libraries as mentioned on your website: [cid:image001.png at 01D8EB93.7C17E410] However, when I install PETSc using the configure commands present on the petsc website: ./configure --with-cc=gcc --with-cxx=0 --with-fc=0 --download-f2cblaslapack --download-mpich it gives me the following error: [cid:image002.png at 01D8EB93.7C17E410] I already installed OpenMPI using Cygwin installer but it still asks me to. When I configure without ??download-mpich? and run ?make check? command, it gives me the following errors: [cid:image003.png at 01D8EB93.7C17E410] Could you kindly look into this and help me with this? Your prompt response will highly be appreciated. The runs look fine. The test should not try to attach the debugger. Do you have that in the PETSC_OPTIONS env variable? Thanks, Matt Thank you! Mohammad Ali Researcher, Sejong University -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: image001.png Type: image/png Size: 21436 bytes Desc: image001.png URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: image002.png Type: image/png Size: 8028 bytes Desc: image002.png URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: image003.png Type: image/png Size: 13777 bytes Desc: image003.png URL: From balay at mcs.anl.gov Sat Oct 29 00:05:33 2022 From: balay at mcs.anl.gov (Satish Balay) Date: Sat, 29 Oct 2022 00:05:33 -0500 (CDT) Subject: [petsc-users] PETSc Windows Installation In-Reply-To: References: Message-ID: <489db548-0d40-fef0-3b7e-dca30aedf4e6@mcs.anl.gov> With cygwin-openmpi the examples ran fine - you can ignore the extra message that causes grief with diff. But you should be able to use it. --download-mpich doesn't work on windows anymore. Satish On Fri, 28 Oct 2022, Mohammad Ali Yaqteen wrote: > Dear Sir, > > During the Installation of PETSc in windows, I installed Cygwin and the required libraries as mentioned on your website: > [cid:image002.png at 01D8EAF4.F4004D90] > However, when I install PETSc using the configure commands present on the petsc website: > > ./configure --with-cc=gcc --with-cxx=0 --with-fc=0 --download-f2cblaslapack --download-mpich > > it gives me the following error: > > [cid:image003.png at 01D8EAF5.7A5C0E10] > > I already installed OpenMPI using Cygwin installer but it still asks me to. When I configure without "-download-mpich" and run "make check" command, it gives me the following errors: > > [cid:image001.png at 01D8EAF1.6E65C190] > > Could you kindly look into this and help me with this? Your prompt response will highly be appreciated. > > Thank you! 
> Mohammad Ali > Researcher, Sejong University > From balay at mcs.anl.gov Sat Oct 29 00:10:33 2022 From: balay at mcs.anl.gov (Satish Balay) Date: Sat, 29 Oct 2022 00:10:33 -0500 (CDT) Subject: [petsc-users] PETSc Windows Installation In-Reply-To: References: Message-ID: On Sat, 29 Oct 2022, Mohammad Ali Yaqteen wrote: > I haven?t accessed PETSC or given any command of my own. I was just installing by following the instructions. I don?t know why it is attaching the debugger. Although it says ?Possible error running C/C++ src/snes/tutorials/ex19 with 1 MPI process? which I think is indicating of missing of MPI! The diff is not smart enough to detect the extra message from cygwin/OpenMPI - hence it assumes there is a potential problem - and prints the above message. But you can assume its installed properly - and use it. Satish > > From: Matthew Knepley > Sent: Friday, October 28, 2022 10:31 PM > To: Mohammad Ali Yaqteen > Cc: petsc-users at mcs.anl.gov > Subject: Re: [petsc-users] PETSc Windows Installation > > On Fri, Oct 28, 2022 at 9:11 AM Mohammad Ali Yaqteen > wrote: > Dear Sir, > > During the Installation of PETSc in windows, I installed Cygwin and the required libraries as mentioned on your website: > [cid:image001.png at 01D8EB93.7C17E410] > However, when I install PETSc using the configure commands present on the petsc website: > > ./configure --with-cc=gcc --with-cxx=0 --with-fc=0 --download-f2cblaslapack --download-mpich > > it gives me the following error: > > [cid:image002.png at 01D8EB93.7C17E410] > > I already installed OpenMPI using Cygwin installer but it still asks me to. When I configure without ??download-mpich? and run ?make check? command, it gives me the following errors: > > [cid:image003.png at 01D8EB93.7C17E410] > > Could you kindly look into this and help me with this? Your prompt response will highly be appreciated. > > The runs look fine. > > The test should not try to attach the debugger. Do you have that in the PETSC_OPTIONS env variable? > > Thanks, > > Matt > > Thank you! > Mohammad Ali > Researcher, Sejong University > > > -- > What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. > -- Norbert Wiener > > https://www.cse.buffalo.edu/~knepley/ > From carl-johan.thore at liu.se Sun Oct 30 10:02:47 2022 From: carl-johan.thore at liu.se (Carl-Johan Thore) Date: Sun, 30 Oct 2022 15:02:47 +0000 Subject: [petsc-users] KSP on GPU In-Reply-To: References: Message-ID: Hi, I'm solving a topology optimization problem with Stokes flow discretized by a stabilized Q1-Q0 finite element method and using BiCGStab with the fieldsplit preconditioner to solve the linear systems. The implementation is based on DMStag, runs on Ubuntu via WSL2, and works fine with PETSc-3.18.1 on multiple CPU cores and the following options for the preconditioner: -fieldsplit_0_ksp_type preonly \ -fieldsplit_0_pc_type gamg \ -fieldsplit_0_pc_gamg_reuse_interpolation 0 \ -fieldsplit_1_ksp_type preonly \ -fieldsplit_1_pc_type jacobi However, when I enable GPU computations by adding two options - ... -dm_vec_type cuda \ -dm_mat_type aijcusparse \ -fieldsplit_0_ksp_type preonly \ -fieldsplit_0_pc_type gamg \ -fieldsplit_0_pc_gamg_reuse_interpolation 0 \ -fieldsplit_1_ksp_type preonly \ -fieldsplit_1_pc_type jacobi - KSP still works fine the first couple of topology optimization iterations but then stops with "Linear solve did not converge due to DIVERGED_DTOL ..". 
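(For reference, the two GPU options above have a code-side equivalent; the following is an illustrative sketch only, not code from this thread, and assumes a DM created elsewhere:)

/* Illustrative sketch: code-side equivalent of -dm_vec_type cuda -dm_mat_type aijcusparse
 * for a DM such as the DMStag mentioned above. */
#include <petscdm.h>

static PetscErrorCode UseCudaTypes(DM dm)
{
  PetscFunctionBeginUser;
  PetscCall(DMSetVecType(dm, VECCUDA));        /* vectors obtained from dm become CUDA vectors */
  PetscCall(DMSetMatType(dm, MATAIJCUSPARSE)); /* DMCreateMatrix() then returns an aijcusparse matrix */
  PetscCall(DMSetFromOptions(dm));             /* command-line options can still override these defaults */
  PetscFunctionReturn(0);
}

Setting the types on the DM this way and then calling DMSetFromOptions() keeps the command-line options usable as overrides.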
My question is whether I should expect the GPU versions of the linear solvers and pre-conditioners to function exactly as their CPU counterparts (I got this impression from the documentation), in which case I've probably made some mistake in my own code, or whether there are other/additional settings or modifications I should use to run on the GPU (an NVIDIA Quadro T2000)? Kind regards, Carl-Johan -------------- next part -------------- An HTML attachment was scrubbed... URL: From bsmith at petsc.dev Sun Oct 30 14:52:30 2022 From: bsmith at petsc.dev (Barry Smith) Date: Sun, 30 Oct 2022 15:52:30 -0400 Subject: [petsc-users] KSP on GPU In-Reply-To: References: Message-ID: In general you should expect similar but not identical conference behavior. I suggest running with all the monitoring you can. -ksp_monitor_true_residual -fieldsplit_0_monitor_true_residual -fieldsplit_1_monitor_true_residual and compare the various convergence between the CPU and GPU. Also run with -ksp_view and check that the various solver options are the same (they should be). Barry > On Oct 30, 2022, at 11:02 AM, Carl-Johan Thore via petsc-users wrote: > > Hi, > > I'm solving a topology optimization problem with Stokes flow discretized by a stabilized Q1-Q0 finite element method > and using BiCGStab with the fieldsplit preconditioner to solve the linear systems. The implementation > is based on DMStag, runs on Ubuntu via WSL2, and works fine with PETSc-3.18.1 on multiple CPU cores and the following > options for the preconditioner: > > -fieldsplit_0_ksp_type preonly \ > -fieldsplit_0_pc_type gamg \ > -fieldsplit_0_pc_gamg_reuse_interpolation 0 \ > -fieldsplit_1_ksp_type preonly \ > -fieldsplit_1_pc_type jacobi > > However, when I enable GPU computations by adding two options - > > ... > -dm_vec_type cuda \ > -dm_mat_type aijcusparse \ > -fieldsplit_0_ksp_type preonly \ > -fieldsplit_0_pc_type gamg \ > -fieldsplit_0_pc_gamg_reuse_interpolation 0 \ > -fieldsplit_1_ksp_type preonly \ > -fieldsplit_1_pc_type jacobi > > - KSP still works fine the first couple of topology optimization iterations but then > stops with "Linear solve did not converge due to DIVERGED_DTOL ..". > > My question is whether I should expect the GPU versions of the linear solvers and pre-conditioners > to function exactly as their CPU counterparts (I got this impression from the documentation), > in which case I've probably made some mistake in my own code, or whether there are other/additional > settings or modifications I should use to run on the GPU (an NVIDIA Quadro T2000)? > > Kind regards, > > Carl-Johan -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Sun Oct 30 15:59:42 2022 From: knepley at gmail.com (Matthew Knepley) Date: Sun, 30 Oct 2022 16:59:42 -0400 Subject: [petsc-users] KSP on GPU In-Reply-To: References: Message-ID: On Sun, Oct 30, 2022 at 3:52 PM Barry Smith wrote: > > In general you should expect similar but not identical conference > behavior. > > I suggest running with all the monitoring you can. > -ksp_monitor_true_residual > -fieldsplit_0_monitor_true_residual -fieldsplit_1_monitor_true_residual and > compare the various convergence between the CPU and GPU. Also run with > -ksp_view and check that the various solver options are the same (they > should be). > Is the GPU using float or double? 
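(One minimal way to check, sketched here for illustration and assuming a standard PETSc 3.17+ build: the PETSC_USE_REAL_* macros in $PETSC_DIR/$PETSC_ARCH/include/petscconf.h record the configured precision, and a tiny program can print it.)

/* Illustrative sketch, not from this thread: report the precision PETSc was built with. */
#include <petscsys.h>

int main(int argc, char **argv)
{
  PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));
#if defined(PETSC_USE_REAL_SINGLE)
  PetscCall(PetscPrintf(PETSC_COMM_WORLD, "PetscReal is single precision (%d bytes)\n", (int)sizeof(PetscReal)));
#else
  PetscCall(PetscPrintf(PETSC_COMM_WORLD, "PetscReal is %d bytes (8 = double)\n", (int)sizeof(PetscReal)));
#endif
  PetscCall(PetscFinalize());
  return 0;
}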
Matt > Barry > > > On Oct 30, 2022, at 11:02 AM, Carl-Johan Thore via petsc-users < > petsc-users at mcs.anl.gov> wrote: > > Hi, > > I'm solving a topology optimization problem with Stokes flow discretized > by a stabilized Q1-Q0 finite element method > and using BiCGStab with the fieldsplit preconditioner to solve the linear > systems. The implementation > is based on DMStag, runs on Ubuntu via WSL2, and works fine with > PETSc-3.18.1 on multiple CPU cores and the following > options for the preconditioner: > > -fieldsplit_0_ksp_type preonly \ > -fieldsplit_0_pc_type gamg \ > -fieldsplit_0_pc_gamg_reuse_interpolation 0 \ > -fieldsplit_1_ksp_type preonly \ > -fieldsplit_1_pc_type jacobi > > However, when I enable GPU computations by adding two options - > > ... > -dm_vec_type cuda \ > -dm_mat_type aijcusparse \ > -fieldsplit_0_ksp_type preonly \ > -fieldsplit_0_pc_type gamg \ > -fieldsplit_0_pc_gamg_reuse_interpolation 0 \ > -fieldsplit_1_ksp_type preonly \ > -fieldsplit_1_pc_type jacobi > > - KSP still works fine the first couple of topology optimization > iterations but then > stops with "Linear solve did not converge due to DIVERGED_DTOL ..". > > My question is whether I should expect the GPU versions of the linear > solvers and pre-conditioners > to function exactly as their CPU counterparts (I got this impression from > the documentation), > in which case I've probably made some mistake in my own code, or whether > there are other/additional > settings or modifications I should use to run on the GPU (an NVIDIA Quadro > T2000)? > > Kind regards, > > Carl-Johan > > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From carl-johan.thore at liu.se Mon Oct 31 00:56:18 2022 From: carl-johan.thore at liu.se (Carl-Johan Thore) Date: Mon, 31 Oct 2022 05:56:18 +0000 Subject: [petsc-users] KSP on GPU In-Reply-To: References: Message-ID: The GPU supports double precision and I didn't explicitly tell PETSc to use float when compiling, so I guess it uses double? What's the easiest way to check? Barry, running -ksp_view shows that the solver options are the same for CPU and GPU. The only difference is the coarse grid solver for gamg ("the package used to perform factorization:") which is petsc for CPU and cusparse for GPU. I tried forcing the GPU to use petsc via -fieldsplit_0_mg_coarse_sub_pc_factor_mat_solver_type, but then ksp failed to converge even on the first topology optimization iteration. -ksp_view also shows differences in the eigenvalues from the Chebyshev smoother. For example, GPU: Down solver (pre-smoother) on level 2 ------------------------------- KSP Object: (fieldsplit_0_mg_levels_2_) 1 MPI process type: chebyshev eigenvalue targets used: min 0.109245, max 1.2017 eigenvalues provided (min 0.889134, max 1.09245) with CPU: eigenvalue targets used: min 0.112623, max 1.23886 eigenvalues provided (min 0.879582, max 1.12623) But I guess such differences are expected? /Carl-Johan From: Matthew Knepley Sent: den 30 oktober 2022 22:00 To: Barry Smith Cc: Carl-Johan Thore ; petsc-users at mcs.anl.gov Subject: Re: [petsc-users] KSP on GPU On Sun, Oct 30, 2022 at 3:52 PM Barry Smith > wrote: In general you should expect similar but not identical conference behavior. I suggest running with all the monitoring you can. 
-ksp_monitor_true_residual -fieldsplit_0_monitor_true_residual -fieldsplit_1_monitor_true_residual and compare the various convergence between the CPU and GPU. Also run with -ksp_view and check that the various solver options are the same (they should be). Is the GPU using float or double? Matt Barry On Oct 30, 2022, at 11:02 AM, Carl-Johan Thore via petsc-users > wrote: Hi, I'm solving a topology optimization problem with Stokes flow discretized by a stabilized Q1-Q0 finite element method and using BiCGStab with the fieldsplit preconditioner to solve the linear systems. The implementation is based on DMStag, runs on Ubuntu via WSL2, and works fine with PETSc-3.18.1 on multiple CPU cores and the following options for the preconditioner: -fieldsplit_0_ksp_type preonly \ -fieldsplit_0_pc_type gamg \ -fieldsplit_0_pc_gamg_reuse_interpolation 0 \ -fieldsplit_1_ksp_type preonly \ -fieldsplit_1_pc_type jacobi However, when I enable GPU computations by adding two options - ... -dm_vec_type cuda \ -dm_mat_type aijcusparse \ -fieldsplit_0_ksp_type preonly \ -fieldsplit_0_pc_type gamg \ -fieldsplit_0_pc_gamg_reuse_interpolation 0 \ -fieldsplit_1_ksp_type preonly \ -fieldsplit_1_pc_type jacobi - KSP still works fine the first couple of topology optimization iterations but then stops with "Linear solve did not converge due to DIVERGED_DTOL ..". My question is whether I should expect the GPU versions of the linear solvers and pre-conditioners to function exactly as their CPU counterparts (I got this impression from the documentation), in which case I've probably made some mistake in my own code, or whether there are other/additional settings or modifications I should use to run on the GPU (an NVIDIA Quadro T2000)? Kind regards, Carl-Johan -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From mhyaqteen at sju.ac.kr Mon Oct 31 00:56:53 2022 From: mhyaqteen at sju.ac.kr (Mohammad Ali Yaqteen) Date: Mon, 31 Oct 2022 05:56:53 +0000 Subject: [petsc-users] PETSc Windows Installation In-Reply-To: References: Message-ID: Dear Satish When I configure PETSc with (./configure --with-cc=gcc --with-cxx=0 --with-fc=0 --download-f2cblaslapack) it runs as I shared initially which you said is not an issue anymore. But when I add (--download-scalapack --download-mumps) or configure with these later, it gives the following error: $ ./configure --download-scalapack --download-mumps ============================================================================================= Configuring PETSc to compile on your system ============================================================================================= TESTING: FortranMPICheck from config.packages.MPI(config/BuildSystem/config/packages/MPI.py:614)******************************************************************************* UNABLE to CONFIGURE with GIVEN OPTIONS (see configure.log for details): ------------------------------------------------------------------------------- Fortran error! mpi_init() could not be located! ******************************************************************************* What could be the problem here? Your help is highly appreciated. 
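(A quick way to see whether the failure is in the MPI Fortran bindings themselves, independent of PETSc, is a standalone test like the sketch below; the file name is made up, and the "use mpi" module is assumed to be provided by the Cygwin OpenMPI packages.)

# Illustrative sketch, not from this thread: check that mpif90 can compile and link MPI_Init
cat > checkmpif.f90 <<'EOF'
program checkmpif
  use mpi
  implicit none
  integer :: ierr
  call MPI_Init(ierr)
  call MPI_Finalize(ierr)
end program checkmpif
EOF
mpif90 checkmpif.f90 -o checkmpif && ./checkmpif && echo "Fortran MPI bindings link and run"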
Thank you Ali -----Original Message----- From: Satish Balay Sent: Saturday, October 29, 2022 2:11 PM To: Mohammad Ali Yaqteen Cc: Matthew Knepley ; petsc-users at mcs.anl.gov Subject: Re: [petsc-users] PETSc Windows Installation On Sat, 29 Oct 2022, Mohammad Ali Yaqteen wrote: > I haven?t accessed PETSC or given any command of my own. I was just installing by following the instructions. I don?t know why it is attaching the debugger. Although it says ?Possible error running C/C++ src/snes/tutorials/ex19 with 1 MPI process? which I think is indicating of missing of MPI! The diff is not smart enough to detect the extra message from cygwin/OpenMPI - hence it assumes there is a potential problem - and prints the above message. But you can assume its installed properly - and use it. Satish > > From: Matthew Knepley > Sent: Friday, October 28, 2022 10:31 PM > To: Mohammad Ali Yaqteen > Cc: petsc-users at mcs.anl.gov > Subject: Re: [petsc-users] PETSc Windows Installation > > On Fri, Oct 28, 2022 at 9:11 AM Mohammad Ali Yaqteen > wrote: > Dear Sir, > > During the Installation of PETSc in windows, I installed Cygwin and the required libraries as mentioned on your website: > [cid:image001.png at 01D8EB93.7C17E410] > However, when I install PETSc using the configure commands present on the petsc website: > > ./configure --with-cc=gcc --with-cxx=0 --with-fc=0 --download-f2cblaslapack --download-mpich > > it gives me the following error: > > [cid:image002.png at 01D8EB93.7C17E410] > > I already installed OpenMPI using Cygwin installer but it still asks me to. When I configure without ??download-mpich? and run ?make check? command, it gives me the following errors: > > [cid:image003.png at 01D8EB93.7C17E410] > > Could you kindly look into this and help me with this? Your prompt response will highly be appreciated. > > The runs look fine. > > The test should not try to attach the debugger. Do you have that in the PETSC_OPTIONS env variable? > > Thanks, > > Matt > > Thank you! > Mohammad Ali > Researcher, Sejong University > > > -- > What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. > -- Norbert Wiener > > https://www.cse.buffalo.edu/~knepley/ > From knepley at gmail.com Mon Oct 31 05:58:49 2022 From: knepley at gmail.com (Matthew Knepley) Date: Mon, 31 Oct 2022 06:58:49 -0400 Subject: [petsc-users] PETSc Windows Installation In-Reply-To: References: Message-ID: On Mon, Oct 31, 2022 at 1:56 AM Mohammad Ali Yaqteen wrote: > Dear Satish > > When I configure PETSc with (./configure --with-cc=gcc --with-cxx=0 > --with-fc=0 --download-f2cblaslapack) it runs as I shared initially which > you said is not an issue anymore. But when I add (--download-scalapack > --download-mumps) or configure with these later, it gives the following > error: > > $ ./configure --download-scalapack --download-mumps > > ============================================================================================= > Configuring PETSc to compile on your system > > ============================================================================================= > TESTING: FortranMPICheck from > config.packages.MPI(config/BuildSystem/config/packages/MPI.py:614)******************************************************************************* > UNABLE to CONFIGURE with GIVEN OPTIONS (see configure.log for > details): > > ------------------------------------------------------------------------------- > Fortran error! 
mpi_init() could not be located! > > ******************************************************************************* > > What could be the problem here? > Without configure.log we cannot tell what went wrong. However, from the error message, I would guess that your MPI was not built with Fortran bindings. You need these for those packages. Thanks, Matt > Your help is highly appreciated. > > Thank you > Ali > > -----Original Message----- > From: Satish Balay > Sent: Saturday, October 29, 2022 2:11 PM > To: Mohammad Ali Yaqteen > Cc: Matthew Knepley ; petsc-users at mcs.anl.gov > Subject: Re: [petsc-users] PETSc Windows Installation > > On Sat, 29 Oct 2022, Mohammad Ali Yaqteen wrote: > > > I haven?t accessed PETSC or given any command of my own. I was just > installing by following the instructions. I don?t know why it is attaching > the debugger. Although it says ?Possible error running C/C++ > src/snes/tutorials/ex19 with 1 MPI process? which I think is indicating of > missing of MPI! > > The diff is not smart enough to detect the extra message from > cygwin/OpenMPI - hence it assumes there is a potential problem - and prints > the above message. > > But you can assume its installed properly - and use it. > > Satish > > > > From: Matthew Knepley > > Sent: Friday, October 28, 2022 10:31 PM > > To: Mohammad Ali Yaqteen > > Cc: petsc-users at mcs.anl.gov > > Subject: Re: [petsc-users] PETSc Windows Installation > > > > On Fri, Oct 28, 2022 at 9:11 AM Mohammad Ali Yaqteen < > mhyaqteen at sju.ac.kr> wrote: > > Dear Sir, > > > > During the Installation of PETSc in windows, I installed Cygwin and the > required libraries as mentioned on your website: > > [cid:image001.png at 01D8EB93.7C17E410] > > However, when I install PETSc using the configure commands present on > the petsc website: > > > > ./configure --with-cc=gcc --with-cxx=0 --with-fc=0 > --download-f2cblaslapack --download-mpich > > > > it gives me the following error: > > > > [cid:image002.png at 01D8EB93.7C17E410] > > > > I already installed OpenMPI using Cygwin installer but it still asks me > to. When I configure without ??download-mpich? and run ?make check? > command, it gives me the following errors: > > > > [cid:image003.png at 01D8EB93.7C17E410] > > > > Could you kindly look into this and help me with this? Your prompt > response will highly be appreciated. > > > > The runs look fine. > > > > The test should not try to attach the debugger. Do you have that in the > PETSC_OPTIONS env variable? > > > > Thanks, > > > > Matt > > > > Thank you! > > Mohammad Ali > > Researcher, Sejong University > > > > > > -- > > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > > -- Norbert Wiener > > > > https://www.cse.buffalo.edu/~knepley/< > http://www.cse.buffalo.edu/~knepley/> > > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From mfadams at lbl.gov Mon Oct 31 07:30:29 2022 From: mfadams at lbl.gov (Mark Adams) Date: Mon, 31 Oct 2022 08:30:29 -0400 Subject: [petsc-users] KSP on GPU In-Reply-To: References: Message-ID: * You could try hypre or another preconditioner that you can afford, like LU or ASM, that works. 
* If this matrix is SPD, you want to use -fieldsplit_0_pc_gamg_esteig_ksp_type cg -fieldsplit_0_pc_gamg_esteig_ksp_max_it 10 These will give better eigen estimates, and that is important. The differences between these steimates is not too bad. There is a safety factor (1.05 is the default) that you could increase with: -fieldsplit_0_mg_levels_ksp_chebyshev_esteig 0,0.05,0,*1.1* * Finally you could try -fieldsplit_0_pc_gamg_reuse_interpolation 1, if GAMG is still not working. Use -fieldsplit_0_ksp_converged_reason and check the iteration count. And it is a good idea to check with hypre to make sure something is not going badly in terms of performance anyway. AMG is hard and hypre is a good solver. Mark On Mon, Oct 31, 2022 at 1:56 AM Carl-Johan Thore via petsc-users < petsc-users at mcs.anl.gov> wrote: > The GPU supports double precision and I didn?t explicitly tell PETSc to > use float when compiling, so > > I guess it uses double? What?s the easiest way to check? > > > > Barry, running -ksp_view shows that the solver options are the same for > CPU and GPU. The only > > difference is the coarse grid solver for gamg (?the package used to > perform factorization:?) which > > is petsc for CPU and cusparse for GPU. I tried forcing the GPU to use > petsc via > > -fieldsplit_0_mg_coarse_sub_pc_factor_mat_solver_type, but then ksp failed > to converge > > even on the first topology optimization iteration. > > > > -ksp_view also shows differences in the eigenvalues from the Chebyshev > smoother. For example, > > > > GPU: > > Down solver (pre-smoother) on level 2 ------------------------------- > > KSP Object: (fieldsplit_0_mg_levels_2_) 1 MPI process > > type: chebyshev > > eigenvalue targets used: min 0.109245, max 1.2017 > > eigenvalues provided (min 0.889134, max 1.09245) with > > > > CPU: > > eigenvalue targets used: min 0.112623, max 1.23886 > > eigenvalues provided (min 0.879582, max 1.12623) > > > > But I guess such differences are expected? > > > > /Carl-Johan > > > > *From:* Matthew Knepley > *Sent:* den 30 oktober 2022 22:00 > *To:* Barry Smith > *Cc:* Carl-Johan Thore ; petsc-users at mcs.anl.gov > *Subject:* Re: [petsc-users] KSP on GPU > > > > On Sun, Oct 30, 2022 at 3:52 PM Barry Smith wrote: > > > > In general you should expect similar but not identical conference > behavior. > > > > I suggest running with all the monitoring you can. > -ksp_monitor_true_residual > -fieldsplit_0_monitor_true_residual -fieldsplit_1_monitor_true_residual and > compare the various convergence between the CPU and GPU. Also run with > -ksp_view and check that the various solver options are the same (they > should be). > > > > Is the GPU using float or double? > > > > Matt > > > > Barry > > > > > > On Oct 30, 2022, at 11:02 AM, Carl-Johan Thore via petsc-users < > petsc-users at mcs.anl.gov> wrote: > > > > Hi, > > > > I'm solving a topology optimization problem with Stokes flow discretized > by a stabilized Q1-Q0 finite element method > > and using BiCGStab with the fieldsplit preconditioner to solve the linear > systems. The implementation > > is based on DMStag, runs on Ubuntu via WSL2, and works fine with > PETSc-3.18.1 on multiple CPU cores and the following > > options for the preconditioner: > > > > -fieldsplit_0_ksp_type preonly \ > > -fieldsplit_0_pc_type gamg \ > > -fieldsplit_0_pc_gamg_reuse_interpolation 0 \ > > -fieldsplit_1_ksp_type preonly \ > > -fieldsplit_1_pc_type jacobi > > > > However, when I enable GPU computations by adding two options - > > > > ... 
> > -dm_vec_type cuda \ > > -dm_mat_type aijcusparse \ > > -fieldsplit_0_ksp_type preonly \ > > -fieldsplit_0_pc_type gamg \ > > -fieldsplit_0_pc_gamg_reuse_interpolation 0 \ > > -fieldsplit_1_ksp_type preonly \ > > -fieldsplit_1_pc_type jacobi > > > > - KSP still works fine the first couple of topology optimization > iterations but then > > stops with "Linear solve did not converge due to DIVERGED_DTOL ..". > > > > My question is whether I should expect the GPU versions of the linear > solvers and pre-conditioners > > to function exactly as their CPU counterparts (I got this impression from > the documentation), > > in which case I've probably made some mistake in my own code, or whether > there are other/additional > > settings or modifications I should use to run on the GPU (an NVIDIA Quadro > T2000)? > > > > Kind regards, > > > > Carl-Johan > > > > > > > -- > > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > > > > https://www.cse.buffalo.edu/~knepley/ > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From balay at mcs.anl.gov Mon Oct 31 08:34:31 2022 From: balay at mcs.anl.gov (Satish Balay) Date: Mon, 31 Oct 2022 08:34:31 -0500 (CDT) Subject: [petsc-users] PETSc Windows Installation In-Reply-To: References: Message-ID: <2db12320-25ab-7911-4bb6-ff0195f5ffdc@mcs.anl.gov> Make sure you have cygwin openmpi installed [and cywin blas/lapack] $ cygcheck -cd |grep openmpi libopenmpi-devel 4.1.2-1 libopenmpi40 4.1.2-1 libopenmpifh40 4.1.2-1 libopenmpiusef08_40 4.1.2-1 libopenmpiusetkr40 4.1.2-1 openmpi 4.1.2-1 $ cygcheck -cd |grep lapack liblapack-devel 3.10.1-1 liblapack0 3.10.1-1 > ./configure --with-cc=gcc --with-cxx=0 --with-fc=0 --download-f2cblaslapack Should be: > > $ ./configure --download-scalapack --download-mumps i.e [default] --with-cc=mpicc --with-cxx=mpicxx --with-fc=mpif90 [an default cygwin blas/lapack] Satish On Mon, 31 Oct 2022, Matthew Knepley wrote: > On Mon, Oct 31, 2022 at 1:56 AM Mohammad Ali Yaqteen > wrote: > > > Dear Satish > > > > When I configure PETSc with (./configure --with-cc=gcc --with-cxx=0 > > --with-fc=0 --download-f2cblaslapack) it runs as I shared initially which > > you said is not an issue anymore. But when I add (--download-scalapack > > --download-mumps) or configure with these later, it gives the following > > error: > > > > $ ./configure --download-scalapack --download-mumps > > > > ============================================================================================= > > Configuring PETSc to compile on your system > > > > ============================================================================================= > > TESTING: FortranMPICheck from > > config.packages.MPI(config/BuildSystem/config/packages/MPI.py:614)******************************************************************************* > > UNABLE to CONFIGURE with GIVEN OPTIONS (see configure.log for > > details): > > > > ------------------------------------------------------------------------------- > > Fortran error! mpi_init() could not be located! > > > > ******************************************************************************* > > > > What could be the problem here? > > > > Without configure.log we cannot tell what went wrong. However, from the > error message, I would guess that your MPI > was not built with Fortran bindings. You need these for those packages. 
> > Thanks, > > Matt > > > > Your help is highly appreciated. > > > > Thank you > > Ali > > > > -----Original Message----- > > From: Satish Balay > > Sent: Saturday, October 29, 2022 2:11 PM > > To: Mohammad Ali Yaqteen > > Cc: Matthew Knepley ; petsc-users at mcs.anl.gov > > Subject: Re: [petsc-users] PETSc Windows Installation > > > > On Sat, 29 Oct 2022, Mohammad Ali Yaqteen wrote: > > > > > I haven?t accessed PETSC or given any command of my own. I was just > > installing by following the instructions. I don?t know why it is attaching > > the debugger. Although it says ?Possible error running C/C++ > > src/snes/tutorials/ex19 with 1 MPI process? which I think is indicating of > > missing of MPI! > > > > The diff is not smart enough to detect the extra message from > > cygwin/OpenMPI - hence it assumes there is a potential problem - and prints > > the above message. > > > > But you can assume its installed properly - and use it. > > > > Satish > > > > > > From: Matthew Knepley > > > Sent: Friday, October 28, 2022 10:31 PM > > > To: Mohammad Ali Yaqteen > > > Cc: petsc-users at mcs.anl.gov > > > Subject: Re: [petsc-users] PETSc Windows Installation > > > > > > On Fri, Oct 28, 2022 at 9:11 AM Mohammad Ali Yaqteen < > > mhyaqteen at sju.ac.kr> wrote: > > > Dear Sir, > > > > > > During the Installation of PETSc in windows, I installed Cygwin and the > > required libraries as mentioned on your website: > > > [cid:image001.png at 01D8EB93.7C17E410] > > > However, when I install PETSc using the configure commands present on > > the petsc website: > > > > > > ./configure --with-cc=gcc --with-cxx=0 --with-fc=0 > > --download-f2cblaslapack --download-mpich > > > > > > it gives me the following error: > > > > > > [cid:image002.png at 01D8EB93.7C17E410] > > > > > > I already installed OpenMPI using Cygwin installer but it still asks me > > to. When I configure without ??download-mpich? and run ?make check? > > command, it gives me the following errors: > > > > > > [cid:image003.png at 01D8EB93.7C17E410] > > > > > > Could you kindly look into this and help me with this? Your prompt > > response will highly be appreciated. > > > > > > The runs look fine. > > > > > > The test should not try to attach the debugger. Do you have that in the > > PETSC_OPTIONS env variable? > > > > > > Thanks, > > > > > > Matt > > > > > > Thank you! > > > Mohammad Ali > > > Researcher, Sejong University > > > > > > > > > -- > > > What most experimenters take for granted before they begin their > > experiments is infinitely more interesting than any results to which their > > experiments lead. > > > -- Norbert Wiener > > > > > > https://www.cse.buffalo.edu/~knepley/< > > http://www.cse.buffalo.edu/~knepley/> > > > > > > > > From balay at mcs.anl.gov Mon Oct 31 08:55:58 2022 From: balay at mcs.anl.gov (Satish Balay) Date: Mon, 31 Oct 2022 08:55:58 -0500 (CDT) Subject: [petsc-users] PETSc Windows Installation In-Reply-To: <2db12320-25ab-7911-4bb6-ff0195f5ffdc@mcs.anl.gov> References: <2db12320-25ab-7911-4bb6-ff0195f5ffdc@mcs.anl.gov> Message-ID: BTW: If you have WSL2 on windows - it might be easier to build/use PETSc. 
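(For reference, the WSL2 route usually amounts to something like the sketch below inside an Ubuntu WSL2 shell; the apt package names are assumptions, not taken from this thread.)

# Illustrative sketch: building PETSc with MUMPS/ScaLAPACK under WSL2 Ubuntu
sudo apt install build-essential gfortran git python3 libopenmpi-dev liblapack-dev
git clone -b release https://gitlab.com/petsc/petsc.git petsc
cd petsc
./configure --download-scalapack --download-mumps
# then run the exact "make ... all" and "make ... check" commands that configure prints when it finishes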
Satish On Mon, 31 Oct 2022, Satish Balay via petsc-users wrote: > Make sure you have cygwin openmpi installed [and cywin blas/lapack] > > $ cygcheck -cd |grep openmpi > libopenmpi-devel 4.1.2-1 > libopenmpi40 4.1.2-1 > libopenmpifh40 4.1.2-1 > libopenmpiusef08_40 4.1.2-1 > libopenmpiusetkr40 4.1.2-1 > openmpi 4.1.2-1 > $ cygcheck -cd |grep lapack > liblapack-devel 3.10.1-1 > liblapack0 3.10.1-1 > > > > ./configure --with-cc=gcc --with-cxx=0 --with-fc=0 --download-f2cblaslapack > > Should be: > > > > $ ./configure --download-scalapack --download-mumps > > i.e [default] --with-cc=mpicc --with-cxx=mpicxx --with-fc=mpif90 [an default cygwin blas/lapack] > > Satish > > > On Mon, 31 Oct 2022, Matthew Knepley wrote: > > > On Mon, Oct 31, 2022 at 1:56 AM Mohammad Ali Yaqteen > > wrote: > > > > > Dear Satish > > > > > > When I configure PETSc with (./configure --with-cc=gcc --with-cxx=0 > > > --with-fc=0 --download-f2cblaslapack) it runs as I shared initially which > > > you said is not an issue anymore. But when I add (--download-scalapack > > > --download-mumps) or configure with these later, it gives the following > > > error: > > > > > > $ ./configure --download-scalapack --download-mumps > > > > > > ============================================================================================= > > > Configuring PETSc to compile on your system > > > > > > ============================================================================================= > > > TESTING: FortranMPICheck from > > > config.packages.MPI(config/BuildSystem/config/packages/MPI.py:614)******************************************************************************* > > > UNABLE to CONFIGURE with GIVEN OPTIONS (see configure.log for > > > details): > > > > > > ------------------------------------------------------------------------------- > > > Fortran error! mpi_init() could not be located! > > > > > > ******************************************************************************* > > > > > > What could be the problem here? > > > > > > > Without configure.log we cannot tell what went wrong. However, from the > > error message, I would guess that your MPI > > was not built with Fortran bindings. You need these for those packages. > > > > Thanks, > > > > Matt > > > > > > > Your help is highly appreciated. > > > > > > Thank you > > > Ali > > > > > > -----Original Message----- > > > From: Satish Balay > > > Sent: Saturday, October 29, 2022 2:11 PM > > > To: Mohammad Ali Yaqteen > > > Cc: Matthew Knepley ; petsc-users at mcs.anl.gov > > > Subject: Re: [petsc-users] PETSc Windows Installation > > > > > > On Sat, 29 Oct 2022, Mohammad Ali Yaqteen wrote: > > > > > > > I haven?t accessed PETSC or given any command of my own. I was just > > > installing by following the instructions. I don?t know why it is attaching > > > the debugger. Although it says ?Possible error running C/C++ > > > src/snes/tutorials/ex19 with 1 MPI process? which I think is indicating of > > > missing of MPI! > > > > > > The diff is not smart enough to detect the extra message from > > > cygwin/OpenMPI - hence it assumes there is a potential problem - and prints > > > the above message. > > > > > > But you can assume its installed properly - and use it. 
> > > > > > Satish > > > > > > > > From: Matthew Knepley > > > > Sent: Friday, October 28, 2022 10:31 PM > > > > To: Mohammad Ali Yaqteen > > > > Cc: petsc-users at mcs.anl.gov > > > > Subject: Re: [petsc-users] PETSc Windows Installation > > > > > > > > On Fri, Oct 28, 2022 at 9:11 AM Mohammad Ali Yaqteen < > > > mhyaqteen at sju.ac.kr> wrote: > > > > Dear Sir, > > > > > > > > During the Installation of PETSc in windows, I installed Cygwin and the > > > required libraries as mentioned on your website: > > > > [cid:image001.png at 01D8EB93.7C17E410] > > > > However, when I install PETSc using the configure commands present on > > > the petsc website: > > > > > > > > ./configure --with-cc=gcc --with-cxx=0 --with-fc=0 > > > --download-f2cblaslapack --download-mpich > > > > > > > > it gives me the following error: > > > > > > > > [cid:image002.png at 01D8EB93.7C17E410] > > > > > > > > I already installed OpenMPI using Cygwin installer but it still asks me > > > to. When I configure without ??download-mpich? and run ?make check? > > > command, it gives me the following errors: > > > > > > > > [cid:image003.png at 01D8EB93.7C17E410] > > > > > > > > Could you kindly look into this and help me with this? Your prompt > > > response will highly be appreciated. > > > > > > > > The runs look fine. > > > > > > > > The test should not try to attach the debugger. Do you have that in the > > > PETSC_OPTIONS env variable? > > > > > > > > Thanks, > > > > > > > > Matt > > > > > > > > Thank you! > > > > Mohammad Ali > > > > Researcher, Sejong University > > > > > > > > > > > > -- > > > > What most experimenters take for granted before they begin their > > > experiments is infinitely more interesting than any results to which their > > > experiments lead. > > > > -- Norbert Wiener > > > > > > > > https://www.cse.buffalo.edu/~knepley/< > > > http://www.cse.buffalo.edu/~knepley/> > > > > > > > > > > > > > > From bsmith at petsc.dev Mon Oct 31 09:06:36 2022 From: bsmith at petsc.dev (Barry Smith) Date: Mon, 31 Oct 2022 10:06:36 -0400 Subject: [petsc-users] KSP on GPU In-Reply-To: References: Message-ID: <524E4CA4-E996-4A5E-9B77-33621B794D32@petsc.dev> Please send the full output when you run with the monitors I mentioned turned on. If one approach is converging and one is not then we should be able to see this in differences in the convergence output printed for the two runs getting further and further apart. Barry > On Oct 31, 2022, at 1:56 AM, Carl-Johan Thore wrote: > > The GPU supports double precision and I didn?t explicitly tell PETSc to use float when compiling, so > I guess it uses double? What?s the easiest way to check? > > Barry, running -ksp_view shows that the solver options are the same for CPU and GPU. The only > difference is the coarse grid solver for gamg (?the package used to perform factorization:?) which > is petsc for CPU and cusparse for GPU. I tried forcing the GPU to use petsc via > -fieldsplit_0_mg_coarse_sub_pc_factor_mat_solver_type, but then ksp failed to converge > even on the first topology optimization iteration. > > -ksp_view also shows differences in the eigenvalues from the Chebyshev smoother. 
For example, > > GPU: > Down solver (pre-smoother) on level 2 ------------------------------- > KSP Object: (fieldsplit_0_mg_levels_2_) 1 MPI process > type: chebyshev > eigenvalue targets used: min 0.109245, max 1.2017 > eigenvalues provided (min 0.889134, max 1.09245) with > > CPU: > eigenvalue targets used: min 0.112623, max 1.23886 > eigenvalues provided (min 0.879582, max 1.12623) > > But I guess such differences are expected? > > /Carl-Johan > > From: Matthew Knepley > > Sent: den 30 oktober 2022 22:00 > To: Barry Smith > > Cc: Carl-Johan Thore >; petsc-users at mcs.anl.gov > Subject: Re: [petsc-users] KSP on GPU > > On Sun, Oct 30, 2022 at 3:52 PM Barry Smith > wrote: > > In general you should expect similar but not identical conference behavior. > > I suggest running with all the monitoring you can. -ksp_monitor_true_residual -fieldsplit_0_monitor_true_residual -fieldsplit_1_monitor_true_residual and compare the various convergence between the CPU and GPU. Also run with -ksp_view and check that the various solver options are the same (they should be). > > Is the GPU using float or double? > > Matt > > Barry > > > > On Oct 30, 2022, at 11:02 AM, Carl-Johan Thore via petsc-users > wrote: > > Hi, > > I'm solving a topology optimization problem with Stokes flow discretized by a stabilized Q1-Q0 finite element method > and using BiCGStab with the fieldsplit preconditioner to solve the linear systems. The implementation > is based on DMStag, runs on Ubuntu via WSL2, and works fine with PETSc-3.18.1 on multiple CPU cores and the following > options for the preconditioner: > > -fieldsplit_0_ksp_type preonly \ > -fieldsplit_0_pc_type gamg \ > -fieldsplit_0_pc_gamg_reuse_interpolation 0 \ > -fieldsplit_1_ksp_type preonly \ > -fieldsplit_1_pc_type jacobi > > However, when I enable GPU computations by adding two options - > > ... > -dm_vec_type cuda \ > -dm_mat_type aijcusparse \ > -fieldsplit_0_ksp_type preonly \ > -fieldsplit_0_pc_type gamg \ > -fieldsplit_0_pc_gamg_reuse_interpolation 0 \ > -fieldsplit_1_ksp_type preonly \ > -fieldsplit_1_pc_type jacobi > > - KSP still works fine the first couple of topology optimization iterations but then > stops with "Linear solve did not converge due to DIVERGED_DTOL ..". > > My question is whether I should expect the GPU versions of the linear solvers and pre-conditioners > to function exactly as their CPU counterparts (I got this impression from the documentation), > in which case I've probably made some mistake in my own code, or whether there are other/additional > settings or modifications I should use to run on the GPU (an NVIDIA Quadro T2000)? > > Kind regards, > > Carl-Johan > > > > -- > What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. > -- Norbert Wiener > > https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From mhyaqteen at sju.ac.kr Mon Oct 31 21:32:45 2022 From: mhyaqteen at sju.ac.kr (Mohammad Ali Yaqteen) Date: Tue, 1 Nov 2022 02:32:45 +0000 Subject: [petsc-users] PETSc Windows Installation In-Reply-To: References: <2db12320-25ab-7911-4bb6-ff0195f5ffdc@mcs.anl.gov> Message-ID: I have checked the required Cygwin openmpi libraries and they are all installed. 
When I run ./configure --with-cc=mpicc --with-cxx=mpicxx --with-fc=mpif90, it returns: $ ./configure --with-cc=mpicc --with-cxx=mpicxx --with-fc=mpif90 ============================================================================================= Configuring PETSc to compile on your system ============================================================================================= TESTING: checkCCompiler from config.setCompilers(config/BuildSystem/config/setCompilers.py:1341)******************************************************************************* UNABLE to CONFIGURE with GIVEN OPTIONS (see configure.log for details): ------------------------------------------------------------------------------- C compiler you provided with -with-cc=mpicc cannot be found or does not work. Cannot compile/link C with mpicc. As for the case of WSL2, I will try to install that on my PC. Meanwhile, could you please look into this issue Thank you Ali -----Original Message----- From: Satish Balay Sent: Monday, October 31, 2022 10:56 PM To: Satish Balay via petsc-users Cc: Matthew Knepley ; Mohammad Ali Yaqteen Subject: Re: [petsc-users] PETSc Windows Installation BTW: If you have WSL2 on windows - it might be easier to build/use PETSc. Satish On Mon, 31 Oct 2022, Satish Balay via petsc-users wrote: > Make sure you have cygwin openmpi installed [and cywin blas/lapack] > > $ cygcheck -cd |grep openmpi > libopenmpi-devel 4.1.2-1 > libopenmpi40 4.1.2-1 > libopenmpifh40 4.1.2-1 > libopenmpiusef08_40 4.1.2-1 > libopenmpiusetkr40 4.1.2-1 > openmpi 4.1.2-1 > $ cygcheck -cd |grep lapack > liblapack-devel 3.10.1-1 > liblapack0 3.10.1-1 > > > > ./configure --with-cc=gcc --with-cxx=0 --with-fc=0 > > --download-f2cblaslapack > > Should be: > > > > $ ./configure --download-scalapack --download-mumps > > i.e [default] --with-cc=mpicc --with-cxx=mpicxx --with-fc=mpif90 [an > default cygwin blas/lapack] > > Satish > > > On Mon, 31 Oct 2022, Matthew Knepley wrote: > > > On Mon, Oct 31, 2022 at 1:56 AM Mohammad Ali Yaqteen > > > > wrote: > > > > > Dear Satish > > > > > > When I configure PETSc with (./configure --with-cc=gcc > > > --with-cxx=0 > > > --with-fc=0 --download-f2cblaslapack) it runs as I shared > > > initially which you said is not an issue anymore. But when I add > > > (--download-scalapack > > > --download-mumps) or configure with these later, it gives the > > > following > > > error: > > > > > > $ ./configure --download-scalapack --download-mumps > > > > > > ============================================================================================= > > > Configuring PETSc to compile on your > > > system > > > > > > ================================================================== > > > =========================== > > > TESTING: FortranMPICheck from > > > config.packages.MPI(config/BuildSystem/config/packages/MPI.py:614)******************************************************************************* > > > UNABLE to CONFIGURE with GIVEN OPTIONS (see configure.log for > > > details): > > > > > > ------------------------------------------------------------------ > > > ------------- Fortran error! mpi_init() could not be located! > > > > > > ****************************************************************** > > > ************* > > > > > > What could be the problem here? > > > > > > > Without configure.log we cannot tell what went wrong. However, from > > the error message, I would guess that your MPI was not built with > > Fortran bindings. You need these for those packages. 
> > > > Thanks, > > > > Matt > > > > > > > Your help is highly appreciated. > > > > > > Thank you > > > Ali > > > > > > -----Original Message----- > > > From: Satish Balay > > > Sent: Saturday, October 29, 2022 2:11 PM > > > To: Mohammad Ali Yaqteen > > > Cc: Matthew Knepley ; petsc-users at mcs.anl.gov > > > Subject: Re: [petsc-users] PETSc Windows Installation > > > > > > On Sat, 29 Oct 2022, Mohammad Ali Yaqteen wrote: > > > > > > > I haven?t accessed PETSC or given any command of my own. I was > > > > just > > > installing by following the instructions. I don?t know why it is > > > attaching the debugger. Although it says ?Possible error running > > > C/C++ > > > src/snes/tutorials/ex19 with 1 MPI process? which I think is > > > indicating of missing of MPI! > > > > > > The diff is not smart enough to detect the extra message from > > > cygwin/OpenMPI - hence it assumes there is a potential problem - > > > and prints the above message. > > > > > > But you can assume its installed properly - and use it. > > > > > > Satish > > > > > > > > From: Matthew Knepley > > > > Sent: Friday, October 28, 2022 10:31 PM > > > > To: Mohammad Ali Yaqteen > > > > Cc: petsc-users at mcs.anl.gov > > > > Subject: Re: [petsc-users] PETSc Windows Installation > > > > > > > > On Fri, Oct 28, 2022 at 9:11 AM Mohammad Ali Yaqteen < > > > mhyaqteen at sju.ac.kr> wrote: > > > > Dear Sir, > > > > > > > > During the Installation of PETSc in windows, I installed Cygwin > > > > and the > > > required libraries as mentioned on your website: > > > > [cid:image001.png at 01D8EB93.7C17E410] > > > > However, when I install PETSc using the configure commands > > > > present on > > > the petsc website: > > > > > > > > ./configure --with-cc=gcc --with-cxx=0 --with-fc=0 > > > --download-f2cblaslapack --download-mpich > > > > > > > > it gives me the following error: > > > > > > > > [cid:image002.png at 01D8EB93.7C17E410] > > > > > > > > I already installed OpenMPI using Cygwin installer but it still > > > > asks me > > > to. When I configure without ??download-mpich? and run ?make check? > > > command, it gives me the following errors: > > > > > > > > [cid:image003.png at 01D8EB93.7C17E410] > > > > > > > > Could you kindly look into this and help me with this? Your > > > > prompt > > > response will highly be appreciated. > > > > > > > > The runs look fine. > > > > > > > > The test should not try to attach the debugger. Do you have that > > > > in the > > > PETSC_OPTIONS env variable? > > > > > > > > Thanks, > > > > > > > > Matt > > > > > > > > Thank you! > > > > Mohammad Ali > > > > Researcher, Sejong University > > > > > > > > > > > > -- > > > > What most experimenters take for granted before they begin their > > > experiments is infinitely more interesting than any results to > > > which their experiments lead. > > > > -- Norbert Wiener > > > > > > > > https://www.cse.buffalo.edu/~knepley/< > > > http://www.cse.buffalo.edu/~knepley/> > > > > > > > > > > > > > > From balay at mcs.anl.gov Mon Oct 31 21:35:46 2022 From: balay at mcs.anl.gov (Satish Balay) Date: Mon, 31 Oct 2022 21:35:46 -0500 (CDT) Subject: [petsc-users] PETSc Windows Installation In-Reply-To: References: <2db12320-25ab-7911-4bb6-ff0195f5ffdc@mcs.anl.gov> Message-ID: <461d2b54-173d-95fa-6ad5-9ce81849871e@mcs.anl.gov> you'll have to send configure.log for this failure Satish On Tue, 1 Nov 2022, Mohammad Ali Yaqteen wrote: > I have checked the required Cygwin openmpi libraries and they are all installed. 
When I run ./configure --with-cc=mpicc --with-cxx=mpicxx --with-fc=mpif90, it returns: > > $ ./configure --with-cc=mpicc --with-cxx=mpicxx --with-fc=mpif90 > ============================================================================================= > Configuring PETSc to compile on your system > ============================================================================================= > TESTING: checkCCompiler from config.setCompilers(config/BuildSystem/config/setCompilers.py:1341)******************************************************************************* > UNABLE to CONFIGURE with GIVEN OPTIONS (see configure.log for details): > ------------------------------------------------------------------------------- > C compiler you provided with -with-cc=mpicc cannot be found or does not work. > Cannot compile/link C with mpicc. > > As for the case of WSL2, I will try to install that on my PC. Meanwhile, could you please look into this issue > > Thank you > > Ali > > -----Original Message----- > From: Satish Balay > Sent: Monday, October 31, 2022 10:56 PM > To: Satish Balay via petsc-users > Cc: Matthew Knepley ; Mohammad Ali Yaqteen > Subject: Re: [petsc-users] PETSc Windows Installation > > BTW: If you have WSL2 on windows - it might be easier to build/use PETSc. > > Satish > > On Mon, 31 Oct 2022, Satish Balay via petsc-users wrote: > > > Make sure you have cygwin openmpi installed [and cywin blas/lapack] > > > > $ cygcheck -cd |grep openmpi > > libopenmpi-devel 4.1.2-1 > > libopenmpi40 4.1.2-1 > > libopenmpifh40 4.1.2-1 > > libopenmpiusef08_40 4.1.2-1 > > libopenmpiusetkr40 4.1.2-1 > > openmpi 4.1.2-1 > > $ cygcheck -cd |grep lapack > > liblapack-devel 3.10.1-1 > > liblapack0 3.10.1-1 > > > > > > > ./configure --with-cc=gcc --with-cxx=0 --with-fc=0 > > > --download-f2cblaslapack > > > > Should be: > > > > > > $ ./configure --download-scalapack --download-mumps > > > > i.e [default] --with-cc=mpicc --with-cxx=mpicxx --with-fc=mpif90 [an > > default cygwin blas/lapack] > > > > Satish > > > > > > On Mon, 31 Oct 2022, Matthew Knepley wrote: > > > > > On Mon, Oct 31, 2022 at 1:56 AM Mohammad Ali Yaqteen > > > > > > wrote: > > > > > > > Dear Satish > > > > > > > > When I configure PETSc with (./configure --with-cc=gcc > > > > --with-cxx=0 > > > > --with-fc=0 --download-f2cblaslapack) it runs as I shared > > > > initially which you said is not an issue anymore. But when I add > > > > (--download-scalapack > > > > --download-mumps) or configure with these later, it gives the > > > > following > > > > error: > > > > > > > > $ ./configure --download-scalapack --download-mumps > > > > > > > > ============================================================================================= > > > > Configuring PETSc to compile on your > > > > system > > > > > > > > ================================================================== > > > > =========================== > > > > TESTING: FortranMPICheck from > > > > config.packages.MPI(config/BuildSystem/config/packages/MPI.py:614)******************************************************************************* > > > > UNABLE to CONFIGURE with GIVEN OPTIONS (see configure.log for > > > > details): > > > > > > > > ------------------------------------------------------------------ > > > > ------------- Fortran error! mpi_init() could not be located! > > > > > > > > ****************************************************************** > > > > ************* > > > > > > > > What could be the problem here? 
> > > > > > > > > > Without configure.log we cannot tell what went wrong. However, from > > > the error message, I would guess that your MPI was not built with > > > Fortran bindings. You need these for those packages. > > > > > > Thanks, > > > > > > Matt > > > > > > > > > > Your help is highly appreciated. > > > > > > > > Thank you > > > > Ali > > > > > > > > -----Original Message----- > > > > From: Satish Balay > > > > Sent: Saturday, October 29, 2022 2:11 PM > > > > To: Mohammad Ali Yaqteen > > > > Cc: Matthew Knepley ; petsc-users at mcs.anl.gov > > > > Subject: Re: [petsc-users] PETSc Windows Installation > > > > > > > > On Sat, 29 Oct 2022, Mohammad Ali Yaqteen wrote: > > > > > > > > > I haven?t accessed PETSC or given any command of my own. I was > > > > > just > > > > installing by following the instructions. I don?t know why it is > > > > attaching the debugger. Although it says ?Possible error running > > > > C/C++ > > > > src/snes/tutorials/ex19 with 1 MPI process? which I think is > > > > indicating of missing of MPI! > > > > > > > > The diff is not smart enough to detect the extra message from > > > > cygwin/OpenMPI - hence it assumes there is a potential problem - > > > > and prints the above message. > > > > > > > > But you can assume its installed properly - and use it. > > > > > > > > Satish > > > > > > > > > > From: Matthew Knepley > > > > > Sent: Friday, October 28, 2022 10:31 PM > > > > > To: Mohammad Ali Yaqteen > > > > > Cc: petsc-users at mcs.anl.gov > > > > > Subject: Re: [petsc-users] PETSc Windows Installation > > > > > > > > > > On Fri, Oct 28, 2022 at 9:11 AM Mohammad Ali Yaqteen < > > > > mhyaqteen at sju.ac.kr> wrote: > > > > > Dear Sir, > > > > > > > > > > During the Installation of PETSc in windows, I installed Cygwin > > > > > and the > > > > required libraries as mentioned on your website: > > > > > [cid:image001.png at 01D8EB93.7C17E410] > > > > > However, when I install PETSc using the configure commands > > > > > present on > > > > the petsc website: > > > > > > > > > > ./configure --with-cc=gcc --with-cxx=0 --with-fc=0 > > > > --download-f2cblaslapack --download-mpich > > > > > > > > > > it gives me the following error: > > > > > > > > > > [cid:image002.png at 01D8EB93.7C17E410] > > > > > > > > > > I already installed OpenMPI using Cygwin installer but it still > > > > > asks me > > > > to. When I configure without ??download-mpich? and run ?make check? > > > > command, it gives me the following errors: > > > > > > > > > > [cid:image003.png at 01D8EB93.7C17E410] > > > > > > > > > > Could you kindly look into this and help me with this? Your > > > > > prompt > > > > response will highly be appreciated. > > > > > > > > > > The runs look fine. > > > > > > > > > > The test should not try to attach the debugger. Do you have that > > > > > in the > > > > PETSC_OPTIONS env variable? > > > > > > > > > > Thanks, > > > > > > > > > > Matt > > > > > > > > > > Thank you! > > > > > Mohammad Ali > > > > > Researcher, Sejong University > > > > > > > > > > > > > > > -- > > > > > What most experimenters take for granted before they begin their > > > > experiments is infinitely more interesting than any results to > > > > which their experiments lead. 
> > > > > -- Norbert Wiener > > > > > > > > > > https://www.cse.buffalo.edu/~knepley/< > > > > http://www.cse.buffalo.edu/~knepley/> > > > > > > > > > > > > > > > > > > > > > From mhyaqteen at sju.ac.kr Mon Oct 31 21:41:04 2022 From: mhyaqteen at sju.ac.kr (Mohammad Ali Yaqteen) Date: Tue, 1 Nov 2022 02:41:04 +0000 Subject: [petsc-users] PETSc Windows Installation In-Reply-To: <461d2b54-173d-95fa-6ad5-9ce81849871e@mcs.anl.gov> References: <2db12320-25ab-7911-4bb6-ff0195f5ffdc@mcs.anl.gov> <461d2b54-173d-95fa-6ad5-9ce81849871e@mcs.anl.gov> Message-ID: From where can I get that? Ali -----Original Message----- From: Satish Balay Sent: Tuesday, November 1, 2022 11:36 AM To: Mohammad Ali Yaqteen Cc: petsc-users Subject: RE: [petsc-users] PETSc Windows Installation you'll have to send configure.log for this failure Satish On Tue, 1 Nov 2022, Mohammad Ali Yaqteen wrote: > I have checked the required Cygwin openmpi libraries and they are all installed. When I run ./configure --with-cc=mpicc --with-cxx=mpicxx --with-fc=mpif90, it returns: > > $ ./configure --with-cc=mpicc --with-cxx=mpicxx --with-fc=mpif90 > ============================================================================================= > Configuring PETSc to compile on your system > ====================================================================== > ======================= > TESTING: checkCCompiler from config.setCompilers(config/BuildSystem/config/setCompilers.py:1341)******************************************************************************* > UNABLE to CONFIGURE with GIVEN OPTIONS (see configure.log for details): > ---------------------------------------------------------------------- > --------- C compiler you provided with -with-cc=mpicc cannot be found > or does not work. > Cannot compile/link C with mpicc. > > As for the case of WSL2, I will try to install that on my PC. > Meanwhile, could you please look into this issue > > Thank you > > Ali > > -----Original Message----- > From: Satish Balay > Sent: Monday, October 31, 2022 10:56 PM > To: Satish Balay via petsc-users > Cc: Matthew Knepley ; Mohammad Ali Yaqteen > > Subject: Re: [petsc-users] PETSc Windows Installation > > BTW: If you have WSL2 on windows - it might be easier to build/use PETSc. > > Satish > > On Mon, 31 Oct 2022, Satish Balay via petsc-users wrote: > > > Make sure you have cygwin openmpi installed [and cywin blas/lapack] > > > > $ cygcheck -cd |grep openmpi > > libopenmpi-devel 4.1.2-1 > > libopenmpi40 4.1.2-1 > > libopenmpifh40 4.1.2-1 > > libopenmpiusef08_40 4.1.2-1 > > libopenmpiusetkr40 4.1.2-1 > > openmpi 4.1.2-1 > > $ cygcheck -cd |grep lapack > > liblapack-devel 3.10.1-1 > > liblapack0 3.10.1-1 > > > > > > > ./configure --with-cc=gcc --with-cxx=0 --with-fc=0 > > > --download-f2cblaslapack > > > > Should be: > > > > > > $ ./configure --download-scalapack --download-mumps > > > > i.e [default] --with-cc=mpicc --with-cxx=mpicxx --with-fc=mpif90 [an > > default cygwin blas/lapack] > > > > Satish > > > > > > On Mon, 31 Oct 2022, Matthew Knepley wrote: > > > > > On Mon, Oct 31, 2022 at 1:56 AM Mohammad Ali Yaqteen > > > > > > wrote: > > > > > > > Dear Satish > > > > > > > > When I configure PETSc with (./configure --with-cc=gcc > > > > --with-cxx=0 > > > > --with-fc=0 --download-f2cblaslapack) it runs as I shared > > > > initially which you said is not an issue anymore. 
But when I add > > > > (--download-scalapack > > > > --download-mumps) or configure with these later, it gives the > > > > following > > > > error: > > > > > > > > $ ./configure --download-scalapack --download-mumps > > > > > > > > ============================================================================================= > > > > Configuring PETSc to compile on your > > > > system > > > > > > > > ================================================================ > > > > == > > > > =========================== > > > > TESTING: FortranMPICheck from > > > > config.packages.MPI(config/BuildSystem/config/packages/MPI.py:614)******************************************************************************* > > > > UNABLE to CONFIGURE with GIVEN OPTIONS (see configure.log for > > > > details): > > > > > > > > ---------------------------------------------------------------- > > > > -- > > > > ------------- Fortran error! mpi_init() could not be located! > > > > > > > > **************************************************************** > > > > ** > > > > ************* > > > > > > > > What could be the problem here? > > > > > > > > > > Without configure.log we cannot tell what went wrong. However, > > > from the error message, I would guess that your MPI was not built > > > with Fortran bindings. You need these for those packages. > > > > > > Thanks, > > > > > > Matt > > > > > > > > > > Your help is highly appreciated. > > > > > > > > Thank you > > > > Ali > > > > > > > > -----Original Message----- > > > > From: Satish Balay > > > > Sent: Saturday, October 29, 2022 2:11 PM > > > > To: Mohammad Ali Yaqteen > > > > Cc: Matthew Knepley ; petsc-users at mcs.anl.gov > > > > Subject: Re: [petsc-users] PETSc Windows Installation > > > > > > > > On Sat, 29 Oct 2022, Mohammad Ali Yaqteen wrote: > > > > > > > > > I haven't accessed PETSC or given any command of my own. I was > > > > > just > > > > installing by following the instructions. I don't know why it is > > > > attaching the debugger. Although it says "Possible error running > > > > C/C++ > > > > src/snes/tutorials/ex19 with 1 MPI process" which I think is > > > > indicating of missing of MPI! > > > > > > > > The diff is not smart enough to detect the extra message from > > > > cygwin/OpenMPI - hence it assumes there is a potential problem - > > > > and prints the above message. > > > > > > > > But you can assume its installed properly - and use it. > > > > > > > > Satish > > > > > > > > > > From: Matthew Knepley > > > > > Sent: Friday, October 28, 2022 10:31 PM > > > > > To: Mohammad Ali Yaqteen > > > > > Cc: petsc-users at mcs.anl.gov > > > > > Subject: Re: [petsc-users] PETSc Windows Installation > > > > > > > > > > On Fri, Oct 28, 2022 at 9:11 AM Mohammad Ali Yaqteen < > > > > mhyaqteen at sju.ac.kr> wrote: > > > > > Dear Sir, > > > > > > > > > > During the Installation of PETSc in windows, I installed > > > > > Cygwin and the > > > > required libraries as mentioned on your website: > > > > > [cid:image001.png at 01D8EB93.7C17E410] > > > > > However, when I install PETSc using the configure commands > > > > > present on > > > > the petsc website: > > > > > > > > > > ./configure --with-cc=gcc --with-cxx=0 --with-fc=0 > > > > --download-f2cblaslapack --download-mpich > > > > > > > > > > it gives me the following error: > > > > > > > > > > [cid:image002.png at 01D8EB93.7C17E410] > > > > > > > > > > I already installed OpenMPI using Cygwin installer but it > > > > > still asks me > > > > to. When I configure without "--download-mpich" and run "make check" 
> > > > command, it gives me the following errors: > > > > > > > > > > [cid:image003.png at 01D8EB93.7C17E410] > > > > > > > > > > Could you kindly look into this and help me with this? Your > > > > > prompt > > > > response will highly be appreciated. > > > > > > > > > > The runs look fine. > > > > > > > > > > The test should not try to attach the debugger. Do you have > > > > > that in the > > > > PETSC_OPTIONS env variable? > > > > > > > > > > Thanks, > > > > > > > > > > Matt > > > > > > > > > > Thank you! > > > > > Mohammad Ali > > > > > Researcher, Sejong University > > > > > > > > > > > > > > > -- > > > > > What most experimenters take for granted before they begin > > > > > their > > > > experiments is infinitely more interesting than any results to > > > > which their experiments lead. > > > > > -- Norbert Wiener > > > > > > > > > > https://www.cse.buffalo.edu/~knepley/< > > > > http://www.cse.buffalo.edu/~knepley/> > > > > > > > > > > > > > > > > > > > > > From knepley at gmail.com Mon Oct 31 22:15:38 2022 From: knepley at gmail.com (Matthew Knepley) Date: Mon, 31 Oct 2022 23:15:38 -0400 Subject: [petsc-users] PETSc Windows Installation In-Reply-To: References: <2db12320-25ab-7911-4bb6-ff0195f5ffdc@mcs.anl.gov> <461d2b54-173d-95fa-6ad5-9ce81849871e@mcs.anl.gov> Message-ID: It is in the directory you executed configure in. Thanks, Matt On Mon, Oct 31, 2022 at 10:41 PM Mohammad Ali Yaqteen wrote: > From where can I get that? > > Ali > > -----Original Message----- > From: Satish Balay > Sent: Tuesday, November 1, 2022 11:36 AM > To: Mohammad Ali Yaqteen > Cc: petsc-users > Subject: RE: [petsc-users] PETSc Windows Installation > > you'll have to send configure.log for this failure > > Satish > > > On Tue, 1 Nov 2022, Mohammad Ali Yaqteen wrote: > > > I have checked the required Cygwin openmpi libraries and they are all > installed. When I run ./configure --with-cc=mpicc --with-cxx=mpicxx > --with-fc=mpif90, it returns: > > > > $ ./configure --with-cc=mpicc --with-cxx=mpicxx --with-fc=mpif90 > > > ============================================================================================= > > Configuring PETSc to compile on your system > > ====================================================================== > > ======================= > > TESTING: checkCCompiler from > config.setCompilers(config/BuildSystem/config/setCompilers.py:1341)******************************************************************************* > > UNABLE to CONFIGURE with GIVEN OPTIONS (see configure.log > for details): > > ---------------------------------------------------------------------- > > --------- C compiler you provided with -with-cc=mpicc cannot be found > > or does not work. > > Cannot compile/link C with mpicc. > > > > As for the case of WSL2, I will try to install that on my PC. > > Meanwhile, could you please look into this issue > > > > Thank you > > > > Ali > > > > -----Original Message----- > > From: Satish Balay > > Sent: Monday, October 31, 2022 10:56 PM > > To: Satish Balay via petsc-users > > Cc: Matthew Knepley ; Mohammad Ali Yaqteen > > > > Subject: Re: [petsc-users] PETSc Windows Installation > > > > BTW: If you have WSL2 on windows - it might be easier to build/use PETSc. 
> > > > Satish > > > > On Mon, 31 Oct 2022, Satish Balay via petsc-users wrote: > > > > > Make sure you have cygwin openmpi installed [and cywin blas/lapack] > > > > > > $ cygcheck -cd |grep openmpi > > > libopenmpi-devel 4.1.2-1 > > > libopenmpi40 4.1.2-1 > > > libopenmpifh40 4.1.2-1 > > > libopenmpiusef08_40 4.1.2-1 > > > libopenmpiusetkr40 4.1.2-1 > > > openmpi 4.1.2-1 > > > $ cygcheck -cd |grep lapack > > > liblapack-devel 3.10.1-1 > > > liblapack0 3.10.1-1 > > > > > > > > > > ./configure --with-cc=gcc --with-cxx=0 --with-fc=0 > > > > --download-f2cblaslapack > > > > > > Should be: > > > > > > > > $ ./configure --download-scalapack --download-mumps > > > > > > i.e [default] --with-cc=mpicc --with-cxx=mpicxx --with-fc=mpif90 [an > > > default cygwin blas/lapack] > > > > > > Satish > > > > > > > > > On Mon, 31 Oct 2022, Matthew Knepley wrote: > > > > > > > On Mon, Oct 31, 2022 at 1:56 AM Mohammad Ali Yaqteen > > > > > > > > wrote: > > > > > > > > > Dear Satish > > > > > > > > > > When I configure PETSc with (./configure --with-cc=gcc > > > > > --with-cxx=0 > > > > > --with-fc=0 --download-f2cblaslapack) it runs as I shared > > > > > initially which you said is not an issue anymore. But when I add > > > > > (--download-scalapack > > > > > --download-mumps) or configure with these later, it gives the > > > > > following > > > > > error: > > > > > > > > > > $ ./configure --download-scalapack --download-mumps > > > > > > > > > > > ============================================================================================= > > > > > Configuring PETSc to compile on your > > > > > system > > > > > > > > > > ================================================================ > > > > > == > > > > > =========================== > > > > > TESTING: FortranMPICheck from > > > > > > config.packages.MPI(config/BuildSystem/config/packages/MPI.py:614)******************************************************************************* > > > > > UNABLE to CONFIGURE with GIVEN OPTIONS (see configure.log for > > > > > details): > > > > > > > > > > ---------------------------------------------------------------- > > > > > -- > > > > > ------------- Fortran error! mpi_init() could not be located! > > > > > > > > > > **************************************************************** > > > > > ** > > > > > ************* > > > > > > > > > > What could be the problem here? > > > > > > > > > > > > > Without configure.log we cannot tell what went wrong. However, > > > > from the error message, I would guess that your MPI was not built > > > > with Fortran bindings. You need these for those packages. > > > > > > > > Thanks, > > > > > > > > Matt > > > > > > > > > > > > > Your help is highly appreciated. > > > > > > > > > > Thank you > > > > > Ali > > > > > > > > > > -----Original Message----- > > > > > From: Satish Balay > > > > > Sent: Saturday, October 29, 2022 2:11 PM > > > > > To: Mohammad Ali Yaqteen > > > > > Cc: Matthew Knepley ; petsc-users at mcs.anl.gov > > > > > Subject: Re: [petsc-users] PETSc Windows Installation > > > > > > > > > > On Sat, 29 Oct 2022, Mohammad Ali Yaqteen wrote: > > > > > > > > > > > I haven't accessed PETSC or given any command of my own. I was > > > > > > just > > > > > installing by following the instructions. I don't know why it is > > > > > attaching the debugger. Although it says "Possible error running > > > > > C/C++ > > > > > src/snes/tutorials/ex19 with 1 MPI process" which I think is > > > > > indicating of missing of MPI! 
> > > > > > > > > > The diff is not smart enough to detect the extra message from > > > > > cygwin/OpenMPI - hence it assumes there is a potential problem - > > > > > and prints the above message. > > > > > > > > > > But you can assume its installed properly - and use it. > > > > > > > > > > Satish > > > > > > > > > > > > From: Matthew Knepley > > > > > > Sent: Friday, October 28, 2022 10:31 PM > > > > > > To: Mohammad Ali Yaqteen > > > > > > Cc: petsc-users at mcs.anl.gov > > > > > > Subject: Re: [petsc-users] PETSc Windows Installation > > > > > > > > > > > > On Fri, Oct 28, 2022 at 9:11 AM Mohammad Ali Yaqteen < > > > > > mhyaqteen at sju.ac.kr> wrote: > > > > > > Dear Sir, > > > > > > > > > > > > During the Installation of PETSc in windows, I installed > > > > > > Cygwin and the > > > > > required libraries as mentioned on your website: > > > > > > [cid:image001.png at 01D8EB93.7C17E410] > > > > > > However, when I install PETSc using the configure commands > > > > > > present on > > > > > the petsc website: > > > > > > > > > > > > ./configure --with-cc=gcc --with-cxx=0 --with-fc=0 > > > > > --download-f2cblaslapack --download-mpich > > > > > > > > > > > > it gives me the following error: > > > > > > > > > > > > [cid:image002.png at 01D8EB93.7C17E410] > > > > > > > > > > > > I already installed OpenMPI using Cygwin installer but it > > > > > > still asks me > > > > > to. When I configure without "--download-mpich" and run "make check" > > > > > command, it gives me the following errors: > > > > > > > > > > > > [cid:image003.png at 01D8EB93.7C17E410] > > > > > > > > > > > > Could you kindly look into this and help me with this? Your > > > > > > prompt > > > > > response will highly be appreciated. > > > > > > > > > > > > The runs look fine. > > > > > > > > > > > > The test should not try to attach the debugger. Do you have > > > > > > that in the > > > > > PETSC_OPTIONS env variable? > > > > > > > > > > > > Thanks, > > > > > > > > > > > > Matt > > > > > > > > > > > > Thank you! > > > > > > Mohammad Ali > > > > > > Researcher, Sejong University > > > > > > > > > > > > > > > > > > -- > > > > > > What most experimenters take for granted before they begin > > > > > > their > > > > > experiments is infinitely more interesting than any results to > > > > > which their experiments lead. > > > > > > -- Norbert Wiener > > > > > > > > > > > > https://www.cse.buffalo.edu/~knepley/< > > > > > http://www.cse.buffalo.edu/~knepley/> > > > > > > > > > > > > > > > > > > > > > > > > > > > > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From mhyaqteen at sju.ac.kr Mon Oct 31 23:16:20 2022 From: mhyaqteen at sju.ac.kr (Mohammad Ali Yaqteen) Date: Tue, 1 Nov 2022 04:16:20 +0000 Subject: [petsc-users] PETSc Windows Installation In-Reply-To: <461d2b54-173d-95fa-6ad5-9ce81849871e@mcs.anl.gov> References: <2db12320-25ab-7911-4bb6-ff0195f5ffdc@mcs.anl.gov> <461d2b54-173d-95fa-6ad5-9ce81849871e@mcs.anl.gov> Message-ID: I am unable to attach the configure.log file. Hence, 
I have copied the following text after executing the command (less configure.log) in the cygwin64 Executing: uname -s stdout: CYGWIN_NT-10.0-19044 ============================================================================================= Configuring PETSc to compile on your system ============================================================================================= ================================================================================ ================================================================================ Starting configure run at Tue, 01 Nov 2022 13:06:06 +0900 Configure Options: --configModules=PETSc.Configure --optionsModule=config.compilerOptions --with-cc=mpicc --with-cxx=mpicxx --with-fc=mpif90 Working directory: /home/SEJONG/petsc-3.18.1 Machine platform: uname_result(system='CYGWIN_NT-10.0-19044', node='DESKTOP-R1C768B', release='3.3.6-341.x86_64', version='2022-09-05 11:15 UTC', machine='x86_64') Python version: 3.9.10 (main, Jan 20 2022, 21:37:52) [GCC 11.2.0] ================================================================================ Environmental variables USERDOMAIN=DESKTOP-R1C768B OS=Windows_NT COMMONPROGRAMFILES=C:\Program Files\Common Files PROCESSOR_LEVEL=6 PSModulePath=C:\Users\SEJONG\Documents\WindowsPowerShell\Modules;C:\Program Files\WindowsPowerShell\Modules;C:\Windows\system32\WindowsPowerShell\v1.0\Modules CommonProgramW6432=C:\Program Files\Common Files CommonProgramFiles(x86)=C:\Program Files (x86)\Common Files LANG=en_US.UTF-8 TZ=Asia/Seoul HOSTNAME=DESKTOP-R1C768B PUBLIC=C:\Users\Public OLDPWD=/home/SEJONG USERNAME=SEJONG LOGONSERVER=\\DESKTOP-R1C768B PROCESSOR_ARCHITECTURE=AMD64 LOCALAPPDATA=C:\Users\SEJONG\AppData\Local COMPUTERNAME=DESKTOP-R1C768B USER=SEJONG !::=::\ SYSTEMDRIVE=C: USERPROFILE=C:\Users\SEJONG PATHEXT=.COM;.EXE;.BAT;.CMD;.VBS;.VBE;.JS;.JSE;.WSF;.WSH;.MSC;.CPL SYSTEMROOT=C:\Windows USERDOMAIN_ROAMINGPROFILE=DESKTOP-R1C768B OneDriveCommercial=C:\Users\SEJONG\OneDrive - Sejong University PROCESSOR_IDENTIFIER=Intel64 Family 6 Model 165 Stepping 5, GenuineIntel GNUPLOT_LIB=C:\Program Files\gnuplot\demo;C:\Program Files\gnuplot\demo\games;C:\Program Files\gnuplot\share PWD=/home/SEJONG/petsc-3.18.1 MSMPI_BIN=C:\Program Files\Microsoft MPI\Bin\ HOME=/home/SEJONG TMP=/tmp OneDrive=C:\Users\SEJONG\OneDrive - Sejong University ZES_ENABLE_SYSMAN=1 !C:=C:\cygwin64\bin PROCESSOR_REVISION=a505 PROFILEREAD=true PROMPT=$P$G NUMBER_OF_PROCESSORS=16 ProgramW6432=C:\Program Files COMSPEC=C:\Windows\system32\cmd.exe APPDATA=C:\Users\SEJONG\AppData\Roaming SHELL=/bin/bash TERM=xterm-256color WINDIR=C:\Windows ProgramData=C:\ProgramData SHLVL=1 PRINTER=\\210.107.220.119\HP Color LaserJet Pro MFP M377 PCL 6 PROGRAMFILES=C:\Program Files ALLUSERSPROFILE=C:\ProgramData TEMP=/tmp DriverData=C:\Windows\System32\Drivers\DriverData SESSIONNAME=Console ProgramFiles(x86)=C:\Program Files (x86) PATH=/usr/local/bin:/usr/bin:/cygdrive/c/SIMULIA/Commands:/cygdrive/c/Program Files/Microsoft MPI/Bin:/cygdrive/c/Windows/system32:/cygdrive/c/Windows:/cygdrive/c/Windows/System32/Wbem:/cygdrive/c/Windows/System32/WindowsPowerShell/v1.0:/cygdrive/c/Windows/System32/OpenSSH:/cygdrive/c/Program Files/MATLAB/R2020b/bin:/cygdrive/c/Program Files/Microsoft SQL Server/130/Tools/Binn:/cygdrive/c/Program Files/Microsoft SQL Server/Client SDK/ODBC/170/Tools/Binn:/cygdrive/c/Program Files/Git/cmd:/cygdrive/c/msys64/mingw64/bin:/cygdrive/c/msys64/usr/bin:/cygdrive/c/Program Files (x86)/Microsoft Visual 
Studio/2019/Community/VC/Tools/MSVC/14.29.30133/bin/Hostx64/x64:/cygdrive/c/Program Files/dotnet:/:/cygdrive/c/Users/SEJONG/AppData/Local/Microsoft/WindowsApps:/cygdrive/c/Users/SEJONG/AppData/Local/Programs/Microsoft VS Code/bin:/cygdrive/c/Program Files (x86)/Microsoft Visual Studio/2019/Community/VC/Tools/MSVC/14.29.30133/bin/Hostx64/x64:/cygdrive/c/Users/SEJONG/.dotnet/tools:/usr/lib/lapack PS1=\[\e]0;\w\a\]\n\[\e[32m\]\u@\h \[\e[33m\]\w\[\e[0m\]\n\$ HOMEDRIVE=C: INFOPATH=/usr/local/info:/usr/share/info:/usr/info HOMEPATH=\Users\SEJONG ORIGINAL_PATH=/cygdrive/c/SIMULIA/Commands:/cygdrive/c/Program Files/Microsoft MPI/Bin:/cygdrive/c/Windows/system32:/cygdrive/c/Windows:/cygdrive/c/Windows/System32/Wbem:/cygdrive/c/Windows/System32/WindowsPowerShell/v1.0:/cygdrive/c/Windows/System32/OpenSSH:/cygdrive/c/Program Files/MATLAB/R2020b/bin:/cygdrive/c/Program Files/Microsoft SQL Server/130/Tools/Binn:/cygdrive/c/Program Files/Microsoft SQL Server/Client SDK/ODBC/170/Tools/Binn:/cygdrive/c/Program Files/Git/cmd:/cygdrive/c/msys64/mingw64/bin:/cygdrive/c/msys64/usr/bin:/cygdrive/c/Program Files (x86)/Microsoft Visual Studio/2019/Community/VC/Tools/MSVC/14.29.30133/bin/Hostx64/x64:/cygdrive/c/Program Files/dotnet:/:/cygdrive/c/Users/SEJONG/AppData/Local/Microsoft/WindowsApps:/cygdrive/c/Users/SEJONG/AppData/Local/Programs/Microsoft VS Code/bin:/cygdrive/c/Program Files (x86)/Microsoft Visual Studio/2019/Community/VC/Tools/MSVC/14.29.30133/bin/Hostx64/x64:/cygdrive/c/Users/SEJONG/.dotnet/tools EXECIGNORE=*.dll _=./configure Files in path provided by default path /usr/local/bin: /usr/bin: addftinfo.exe addr2line.exe apropos ar.exe arch.exe as.exe ash.exe awk b2sum.exe base32.exe base64.exe basename.exe basenc.exe bash.exe bashbug bomtool.exe bunzip2.exe bzcat.exe bzcmp bzdiff bzegrep bzfgrep bzgrep bzip2.exe bzip2recover.exe bzless bzmore c++.exe c++filt.exe c89 c99 ca-legacy cal.exe captoinfo cat.exe catman.exe cc ccmake.exe chattr.exe chcon.exe chgrp.exe chmod.exe chown.exe chroot.exe chrt.exe cksum.exe clear.exe cmake.exe cmp.exe col.exe colcrt.exe colrm.exe column.exe comm.exe cp.exe cpack.exe cpp.exe csplit.exe ctest.exe cut.exe cygarchive-13.dll cygargp-0.dll cygatomic-1.dll cygattr-1.dll cygblkid-1.dll cygbrotlicommon-1.dll cygbrotlidec-1.dll cygbz2-1.dll cygcheck.exe cygcom_err-2.dll cygcrypt-2.dll cygcrypto-1.1.dll cygcurl-4.dll cygdb-5.3.dll cygdb_cxx-5.3.dll cygdb_sql-5.3.dll cygedit-0.dll cygevent-2-1-7.dll cygevent_core-2-1-7.dll cygevent_extra-2-1-7.dll cygevent_openssl-2-1-7.dll cygevent_pthreads-2-1-7.dll cygexpat-1.dll cygfdisk-1.dll cygffi-6.dll cygfido2-1.dll cygformw-10.dll cyggc-1.dll cyggcc_s-seh-1.dll cyggdbm-6.dll cyggdbm_compat-4.dll cyggfortran-5.dll cyggmp-10.dll cyggomp-1.dll cyggsasl-7.dll cyggssapi_krb5-2.dll cygguile-2.2-1.dll cyghistory7.dll cyghwloc-15.dll cygiconv-2.dll cygidn-12.dll cygidn2-0.dll cygintl-8.dll cygisl-23.dll cygjsoncpp-25.dll cygk5crypto-3.dll cygkrb5-3.dll cygkrb5support-0.dll cyglber-2-4-2.dll cyglber-2.dll cygldap-2-4-2.dll cygldap-2.dll cygldap_r-2-4-2.dll cygltdl-7.dll cyglz4-1.dll cyglzma-5.dll cyglzo2-2.dll cygmagic-1.dll cygman-2-11-0.dll cygmandb-2-11-0.dll cygmenuw-10.dll cygmpc-3.dll cygmpfr-6.dll cygmpi-40.dll cygmpi_mpifh-40.dll cygmpi_usempif08-40.dll cygmpi_usempi_ignore_tkr-40.dll cygncursesw-10.dll cygnghttp2-14.dll cygntlm-0.dll cygopen-pal-40.dll cygopen-rte-40.dll cygp11-kit-0.dll cygpanelw-10.dll cygpath.exe cygpcre2-8-0.dll cygperl5_32.dll cygpipeline-1.dll cygpkgconf-4.dll cygpopt-0.dll cygpsl-5.dll cygquadmath-0.dll 
cygreadline7.dll cygrhash-0.dll cygrunsrv.exe cygsasl2-3.dll cygserver-config cygsigsegv-2.dll cygsmartcols-1.dll cygsqlite3-0.dll cygssh2-1.dll cygssl-1.1.dll cygstart.exe cygstdc++-6.dll cygtasn1-6.dll cygticw-10.dll cygunistring-2.dll cyguuid-1.dll cyguv-1.dll cygwin-console-helper.exe cygwin1.dll cygxml2-2.dll cygxxhash-0.dll cygz.dll cygzstd-1.dll dash.exe date.exe dd.exe df.exe diff.exe diff3.exe dir.exe dircolors.exe dirname.exe dlltool.exe dllwrap.exe dnsdomainname domainname du.exe dumper.exe echo.exe editrights.exe egrep elfedit.exe env.exe eqn.exe eqn2graph ex expand.exe expr.exe f95 factor.exe false.exe fgrep fido2-assert.exe fido2-cred.exe fido2-token.exe file.exe find.exe flock.exe fmt.exe fold.exe g++.exe gawk-5.1.1.exe gawk.exe gcc-ar.exe gcc-nm.exe gcc-ranlib.exe gcc.exe gcov-dump.exe gcov-tool.exe gcov.exe gdiffmk gencat.exe getconf.exe getent.exe getfacl.exe getopt.exe gfortran.exe git-receive-pack.exe git-shell.exe git-upload-archive.exe git-upload-pack.exe git.exe gkill.exe gmondump.exe gprof.exe grap2graph grep.exe grn.exe grodvi.exe groff.exe grolbp.exe grolj4.exe grops.exe grotty.exe groups.exe gunzip gzexe gzip.exe head.exe hexdump.exe hostid.exe hostname.exe hpftodit.exe i686-w64-mingw32-pkg-config id.exe indxbib.exe info.exe infocmp.exe infotocap install-info.exe install.exe ipcmk.exe ipcrm.exe ipcs.exe isosize.exe join.exe kill.exe lastlog.exe ld.bfd.exe ld.exe ldd.exe ldh.exe less.exe lessecho.exe lesskey.exe lexgrog.exe libpython3.9.dll link-cygin.exe lkbib.exe ln.exe locale.exe locate.exe logger.exe login.exe logname.exe look.exe lookbib.exe ls.exe lsattr.exe lto-dump.exe lzcat lzcmp lzdiff lzegrep lzfgrep lzgrep lzless lzma lzmadec.exe lzmainfo.exe lzmore make-dummy-cert make.exe man-recode.exe man.exe mandb.exe manpath.exe mcookie.exe md5sum.exe minidumper.exe mintheme mintty.exe mkdir.exe mkfifo.exe mkgroup.exe mknod.exe mkpasswd.exe mkshortcut.exe mktemp.exe more.exe mount.exe mpic++ mpicc mpicxx mpiexec mpif77 mpif90 mpifort mpirun mv.exe namei.exe neqn nice.exe nl.exe nm.exe nohup.exe nproc.exe nroff numfmt.exe objcopy.exe objdump.exe od.exe ompi-clean ompi-server ompi_info.exe opal_wrapper.exe openssl.exe orte-clean.exe orte-info.exe orte-server.exe ortecc orted.exe orterun.exe p11-kit.exe passwd.exe paste.exe pathchk.exe pdfroff peflags.exe peflagsall perl.exe perl5.32.1.exe pfbtops.exe pg.exe pic.exe pic2graph pinky.exe pip3 pip3.9 pkg-config pkgconf.exe pldd.exe post-grohtml.exe pr.exe pre-grohtml.exe preconv.exe printenv.exe printf.exe profiler.exe ps.exe ptx.exe pwd.exe pydoc3 pydoc3.9 python python3 python3.9.exe pzstd.exe ranlib.exe readelf.exe readlink.exe readshortcut.exe realpath.exe rebase-trigger rebase.exe rebaseall rebaselst refer.exe regtool.exe rename.exe renew-dummy-cert renice.exe reset rev.exe rm.exe rmdir.exe rsync-ssl rsync.exe run.exe runcon.exe rvi rview scalar.exe scp.exe script.exe scriptreplay.exe sdiff.exe sed.exe seq.exe setfacl.exe setmetamode.exe setsid.exe sftp.exe sh.exe sha1sum.exe sha224sum.exe sha256sum.exe sha384sum.exe sha512sum.exe shred.exe shuf.exe size.exe sleep.exe slogin soelim.exe sort.exe split.exe ssh-add.exe ssh-agent.exe ssh-copy-id ssh-host-config ssh-keygen.exe ssh-keyscan.exe ssh-user-config ssh.exe ssp.exe stat.exe stdbuf.exe strace.exe strings.exe strip.exe stty.exe sum.exe sync.exe tabs.exe tac.exe tail.exe tar.exe taskset.exe tbl.exe tee.exe test.exe tfmtodit.exe tic.exe timeout.exe toe.exe touch.exe tput.exe tr.exe troff.exe true.exe truncate.exe trust.exe tset.exe tsort.exe tty.exe tzselect 
tzset.exe ul.exe umount.exe uname.exe unexpand.exe uniq.exe unlink.exe unlzma unxz unzstd update-ca-trust update-crypto-policies updatedb users.exe uuidgen.exe uuidparse.exe vdir.exe vi.exe view wc.exe whatis.exe whereis.exe which.exe who.exe whoami.exe windmc.exe windres.exe x86_64-pc-cygwin-c++.exe x86_64-pc-cygwin-g++.exe x86_64-pc-cygwin-gcc-11.exe x86_64-pc-cygwin-gcc-ar.exe x86_64-pc-cygwin-gcc-nm.exe x86_64-pc-cygwin-gcc-ranlib.exe x86_64-pc-cygwin-gcc.exe x86_64-pc-cygwin-gfortran.exe x86_64-pc-cygwin-pkg-config x86_64-w64-mingw32-pkg-config xargs.exe xmlcatalog.exe xmllint.exe xz.exe xzcat xzcmp xzdec.exe xzdiff xzegrep xzfgrep xzgrep xzless xzmore yes.exe zcat zcmp zdiff zdump.exe zegrep zfgrep zforce zgrep zless zmore znew zstd.exe zstdcat zstdgrep zstdless zstdmt [.exe /cygdrive/c/SIMULIA/Commands: abaqus.bat abq2018.bat abq_cae_open.bat abq_odb_open.bat /cygdrive/c/Program Files/Microsoft MPI/Bin: mpiexec.exe mpitrace.man smpd.exe provthrd.dll provtool.exe ProximityCommon.dll ProximityCommonPal.dll ProximityRtapiPal.dll ProximityService.dll ProximityServicePal.dll ProximityToast ProximityUxHost.exe prproc.exe prvdmofcomp.dll psapi.dll pscript.sep PSHED.DLL psisdecd.dll psisrndr.ax PSModuleDis coveryProvider.dll psmodulediscoveryprovider.mof PsmServiceExtHost.dll psmsrv.dll psr.exe pstask.dll pstorec.dll pt-BR pt-PT ptpprov.dll puiapi.dll puiobj.dll PushToInstall.dll pwlauncher.dll pwlauncher.exe pwrshplugin.dll pwsso.dll qappsrv.exe qasf.dll qcap.dll qdv. dll qdvd.dll qedit.dll qedwipes.dll qmgr.dll qprocess.exe QualityUpdateAssistant.dll quartz.dll Query.dll query.exe QuickActionsDataModel.dll quickassist.exe QuietHours.dll quser.exe qwave.dll qwinsta.exe RacEngn.dll racpldlg.dll radardt.dll radarrs.dll RADCUI.dll ra s rasadhlp.dll rasapi32.dll rasauto.dll rasautou.exe raschap.dll raschapext.dll rasctrnm.h rasctrs.dll rascustom.dll rasdiag.dll rasdial.exe rasdlg.dll raserver.exe rasgcw.dll rasman.dll rasmans.dll rasmbmgr.dll RasMediaManager.dll RASMM.dll rasmontr.dll rasphone.exe rasplap.dll rasppp.dll rastapi.dll rastls.dll rastlsext.dll RasToast rdbui.dll rdpbase.dll rdpcfgex.dll rdpclip.exe rdpcore.dll rdpcorets.dll rdpcredentialprovider.dll rdpencom.dll rdpendp.dll rdpinit.exe rdpinput.exe rdpnano.dll RdpRelayTransport.dll RdpSa.exe RdpS aProxy.exe RdpSaPs.dll RdpSaUacHelper.exe rdpserverbase.dll rdpsharercom.dll rdpshell.exe rdpsign.exe rdpudd.dll rdpviewerax.dll rdrleakdiag.exe RDSAppXHelper.dll rdsdwmdr.dll rdsxvmaudio.dll rdvvmtransport.dll RDXService.dll RDXTaskFactory.dll ReAgent.dll ReAgentc.e xe ReAgentTask.dll recdisc.exe recover.exe Recovery recovery.dll RecoveryDrive.exe refsutil.exe reg.exe regapi.dll RegCtrl.dll regedt32.exe regidle.dll regini.exe Register-CimProvider.exe regsvc.dll regsvr32.exe reguwpapi.dll ReInfo.dll rekeywiz.exe relog.exe RelPost .exe RemoteAppLifetimeManager.exe RemoteAppLifetimeManagerProxyStub.dll remoteaudioendpoint.dll remotepg.dll RemotePosWorker.exe remotesp.tsp RemoteSystemToastIcon.contrast-white.png RemoteSystemToastIcon.png RemoteWipeCSP.dll RemovableMediaProvisioningPlugin.dll Rem oveDeviceContextHandler.dll RemoveDeviceElevated.dll rendezvousSession.tlb repair-bde.exe replace.exe ReportingCSP.dll RESAMPLEDMO.DLL ResBParser.dll reset.exe reseteng.dll ResetEngine.dll ResetEngine.exe ResetEngOnline.dll resmon.exe ResourceMapper.dll ResourcePolic yClient.dll ResourcePolicyServer.dll ResPriHMImageList ResPriHMImageListLowCost ResPriImageList ResPriImageListLowCost RestartManager.mof RestartManagerUninstall.mof 
RestartNowPower_80.contrast-black.png RestartNowPower_80.contrast-white.png RestartNowPower_80.png Re startTonight_80.png RestartTonight_80_contrast-black.png RestartTonight_80_contrast-white.png restore resutils.dll rgb9rast.dll Ribbons.scr riched20.dll riched32.dll rilproxy.dll RjvMDMConfig.dll RMActivate.exe RMActivate_isv.exe RMActivate_ssp.exe RMActivate_ssp_isv .exe RMapi.dll rmclient.dll RmClient.exe RMSRoamingSecurity.dll rmttpmvscmgrsvr.exe rnr20.dll ro-RO RoamingSecurity.dll Robocopy.exe rometadata.dll RotMgr.dll ROUTE.EXE RpcEpMap.dll rpchttp.dll RpcNs4.dll rpcnsh.dll RpcPing.exe rpcrt4.dll RpcRtRemote.dll rpcss.dll rr installer.exe rsaenh.dll rshx32.dll rsop.msc RstMwEventLogMsg.dll RstrtMgr.dll rstrui.exe RtCOM64.dll RtDataProc64.dll rtffilt.dll RtkApi64U.dll RtkAudUService64.exe RtkCfg64.dll rtm.dll rtmcodecs.dll RTMediaFrame.dll rtmmvrortc.dll rtmpal.dll rtmpltfm.dll rtutils.dl l RTWorkQ.dll ru-RU RuleBasedDS.dll runas.exe rundll32.exe runexehelper.exe RunLegacyCPLElevated.exe runonce.exe RuntimeBroker.exe rwinsta.exe samcli.dll samlib.dll samsrv.dll Samsung sas.dll sbe.dll sbeio.dll sberes.dll sbservicetrigger.dll sc.exe ScanPlugin.dll sca nsetting.dll SCardBi.dll SCardDlg.dll SCardSvr.dll ScavengeSpace.xml scavengeui.dll ScDeviceEnum.dll scecli.dll scesrv.dll schannel.dll schedcli.dll schedsvc.dll ScheduleTime_80.contrast-black.png ScheduleTime_80.contrast-white.png ScheduleTime_80.png schtasks.exe sc ksp.dll scripto.dll ScriptRunner.exe scrnsave.scr scrobj.dll scrptadm.dll scrrun.dll sdbinst.exe sdchange.exe sdclt.exe sdcpl.dll SDDS.dll sdengin2.dll SDFHost.dll sdhcinst.dll sdiageng.dll sdiagnhost.exe sdiagprv.dll sdiagschd.dll sdohlp.dll sdrsvc.dll sdshext.dll S earch.ProtocolHandler.MAPI2.dll SearchFilterHost.exe SearchFolder.dll SearchIndexer.exe SearchProtocolHost.exe SebBackgroundManagerPolicy.dll SecConfig.efi SecEdit.exe sechost.dll secinit.exe seclogon.dll secpol.msc secproc.dll secproc_isv.dll secproc_ssp.dll secproc _ssp_isv.dll secur32.dll SecureAssessmentHandlers.dll SecureBootUpdates securekernel.exe SecureTimeAggregator.dll security.dll SecurityAndMaintenance.png SecurityAndMaintenance_Alert.png SecurityAndMaintenance_Error.png SecurityCenterBroker.dll SecurityCenterBrokerPS .dll SecurityHealthAgent.dll SecurityHealthHost.exe SecurityHealthProxyStub.dll SecurityHealthService.exe SecurityHealthSSO.dll SecurityHealthSystray.exe sedplugins.dll SEMgrPS.dll SEMgrSvc.dll sendmail.dll Sens.dll SensApi.dll SensorDataService.exe SensorPerformance Events.dll SensorsApi.dll SensorsClassExtension.dll SensorsCpl.dll SensorService.dll SensorsNativeApi.dll SensorsNativeApi.V2.dll SensorsUtilsV2.dll sensrsvc.dll serialui.dll services.exe services.msc ServicingUAPI.dll serwvdrv.dll SessEnv.dll sessionmsg.exe setbcdlo cale.dll sethc.exe SetNetworkLocation.dll SetNetworkLocationFlyout.dll SetProxyCredential.dll setspn.exe SettingMonitor.dll settings.dat SettingsEnvironment.Desktop.dll SettingsExtensibilityHandlers.dll SettingsHandlers_Accessibility.dll SettingsHandlers_AnalogShell. 
dll SettingsHandlers_AppControl.dll SettingsHandlers_AppExecutionAlias.dll SettingsHandlers_AssignedAccess.dll SettingsHandlers_Authentication.dll SettingsHandlers_BackgroundApps.dll SettingsHandlers_BatteryUsage.dll SettingsHandlers_BrowserDeclutter.dll SettingsHand lers_CapabilityAccess.dll SettingsHandlers_Clipboard.dll SettingsHandlers_ClosedCaptioning.dll SettingsHandlers_ContentDeliveryManager.dll SettingsHandlers_Cortana.dll SettingsHandlers_Devices.dll SettingsHandlers_Display.dll SettingsHandlers_Flights.dll SettingsHand lers_Fonts.dll SettingsHandlers_ForceSync.dll SettingsHandlers_Gaming.dll SettingsHandlers_Geolocation.dll SettingsHandlers_Gpu.dll SettingsHandlers_HoloLens_Environment.dll SettingsHandlers_IME.dll SettingsHandlers_InkingTypingPrivacy.dll SettingsHandlers_InputPerso nalization.dll SettingsHandlers_Language.dll SettingsHandlers_ManagePhone.dll SettingsHandlers_Maps.dll SettingsHandlers_Mouse.dll SettingsHandlers_Notifications.dll SettingsHandlers_nt.dll SettingsHandlers_OneCore_BatterySaver.dll SettingsHandlers_OneCore_PowerAndSl eep.dll SettingsHandlers_OneDriveBackup.dll SettingsHandlers_OptionalFeatures.dll SettingsHandlers_PCDisplay.dll SettingsHandlers_Pen.dll SettingsHandlers_QuickActions.dll SettingsHandlers_Region.dll SettingsHandlers_SharedExperiences_Rome.dll SettingsHandlers_SIUF.d ll SettingsHandlers_SpeechPrivacy.dll SettingsHandlers_Startup.dll SettingsHandlers_StorageSense.dll SettingsHandlers_Troubleshoot.dll SettingsHandlers_User.dll SettingsHandlers_UserAccount.dll SettingsHandlers_UserExperience.dll SettingsHandlers_WorkAccess.dll Setti ngSync.dll SettingSyncCore.dll SettingSyncDownloadHelper.dll SettingSyncHost.exe setup setupapi.dll setupcl.dll setupcl.exe setupcln.dll setupetw.dll setupugc.exe setx.exe sfc.dll sfc.exe sfc_os.dll Sgrm SgrmBroker.exe SgrmEnclave.dll SgrmEnclave_secure.dll SgrmLpac. 
exe shacct.dll shacctprofile.dll SharedPCCSP.dll SharedRealitySvc.dll ShareHost.dll sharemediacpl.dll SHCore.dll shdocvw.dll shell32.dll ShellAppRuntime.exe ShellCommonCommonProxyStub.dll ShellExperiences shellstyle.dll shfolder.dll shgina.dll ShiftJIS.uce shimeng.dl l shimgvw.dll shlwapi.dll shpafact.dll shrpubw.exe shsetup.dll shsvcs.dll shunimpl.dll shutdown.exe shutdownext.dll shutdownux.dll shwebsvc.dll si-lk signdrv.dll sigverif.exe SIHClient.exe sihost.exe SimAuth.dll SimCfg.dll simpdata.tlb sk-SK skci.dll sl-SI slc.dll sl cext.dll SleepStudy SlideToShutDown.exe slmgr slmgr.vbs slui.exe slwga.dll SmallRoom.bin SmartCardBackgroundPolicy.dll SmartcardCredentialProvider.dll SmartCardSimulator.dll smartscreen.exe smartscreenps.dll SMBHelperClass.dll smbwmiv2.dll SMI SmiEngine.dll smphost.d ll SmsRouterSvc.dll smss.exe SndVol.exe SndVolSSO.dll SnippingTool.exe snmpapi.dll snmptrap.exe Snooze_80.contrast-black.png Snooze_80.contrast-white.png Snooze_80.png socialapis.dll softkbd.dll softpub.dll sort.exe SortServer2003Compat.dll SortWindows61.dll SortWind ows62.dll SortWindows64.dll SortWindows6Compat.dll SpaceAgent.exe spacebridge.dll SpaceControl.dll spaceman.exe SpatialAudioLicenseSrv.exe SpatializerApo.dll SpatialStore.dll spbcd.dll SpeakersSystemToastIcon.contrast-white.png SpeakersSystemToastIcon.png Spectrum.ex e SpectrumSyncClient.dll Speech SpeechPal.dll Speech_OneCore spfileq.dll spinf.dll spmpm.dll spnet.dll spool spoolss.dll spoolsv.exe spopk.dll spp spp.dll sppc.dll sppcext.dll sppcomapi.dll sppcommdlg.dll SppExtComObj.Exe sppinst.dll sppnp.dll sppobjs.dll sppsvc.exe sppui sppwinob.dll sppwmi.dll spwinsat.dll spwizeng.dll spwizimg.dll spwizres.dll spwmp.dll SqlServerSpatial130.dll SqlServerSpatial150.dll sqlsrv32.dll sqlsrv32.rll sqmapi.dll sr-Latn-RS srchadmin.dll srclient.dll srcore.dll srdelayed.exe SrEvents.dll SRH.dll srhelp er.dll srm.dll srmclient.dll srmlib.dll srms-apr-v.dat srms-apr.dat srms.dat srmscan.dll srmshell.dll srmstormod.dll srmtrace.dll srm_ps.dll srpapi.dll SrpUxNativeSnapIn.dll srrstr.dll SrTasks.exe sru srumapi.dll srumsvc.dll srvcli.dll srvsvc.dll srwmi.dll sscore.dll sscoreext.dll ssdm.dll ssdpapi.dll ssdpsrv.dll sspicli.dll sspisrv.dll SSShim.dll ssText3d.scr sstpsvc.dll StartTileData.dll Startupscan.dll StateRepository.Core.dll stclient.dll stdole2.tlb stdole32.tlb sti.dll sti_ci.dll stobject.dll StorageContextHandler.dll Stor ageUsage.dll storagewmi.dll storagewmi_passthru.dll stordiag.exe storewuauth.dll Storprop.dll StorSvc.dll streamci.dll StringFeedbackEngine.dll StructuredQuery.dll SubRange.uce subst.exe sud.dll sv-SE SvBannerBackground.png svchost.exe svf.dll svsvc.dll SwitcherDataM odel.dll swprv.dll sxproxy.dll sxs.dll sxshared.dll sxssrv.dll sxsstore.dll sxstrace.exe SyncAppvPublishingServer.exe SyncAppvPublishingServer.vbs SyncCenter.dll SyncController.dll SyncHost.exe SyncHostps.dll SyncInfrastructure.dll SyncInfrastructureps.dll SyncProxy. 
dll Syncreg.dll SyncRes.dll SyncSettings.dll syncutil.dll sysclass.dll sysdm.cpl SysFxUI.dll sysmain.dll sysmon.ocx sysntfy.dll Sysprep sysprint.sep sysprtj.sep SysResetErr.exe syssetup.dll systemcpl.dll SystemEventsBrokerClient.dll SystemEventsBrokerServer.dll syste minfo.exe SystemPropertiesAdvanced.exe SystemPropertiesComputerName.exe SystemPropertiesDataExecutionPrevention.exe SystemPropertiesHardware.exe SystemPropertiesPerformance.exe SystemPropertiesProtection.exe SystemPropertiesRemote.exe systemreset.exe SystemResetPlatf orm SystemSettings.DataModel.dll SystemSettings.DeviceEncryptionHandlers.dll SystemSettings.Handlers.dll SystemSettings.SettingsExtensibility.dll SystemSettings.UserAccountsHandlers.dll SystemSettingsAdminFlows.exe SystemSettingsBroker.exe SystemSettingsRemoveDevice. exe SystemSettingsThresholdAdminFlowUI.dll SystemSupportInfo.dll SystemUWPLauncher.exe systray.exe t2embed.dll ta-in ta-lk Tabbtn.dll TabbtnEx.dll tabcal.exe TabletPC.cpl TabSvc.dll takeown.exe tapi3.dll tapi32.dll tapilua.dll TapiMigPlugin.dll tapiperf.dll tapisrv.d ll TapiSysprep.dll tapiui.dll TapiUnattend.exe tar.exe TaskApis.dll taskbarcpl.dll taskcomp.dll TaskFlowDataEngine.dll taskhostw.exe taskkill.exe tasklist.exe Taskmgr.exe Tasks taskschd.dll taskschd.msc TaskSchdPS.dll tbauth.dll tbs.dll tcblaunch.exe tcbloader.dll tc msetup.exe tcpbidi.xml tcpipcfg.dll tcpmib.dll tcpmon.dll tcpmon.ini tcpmonui.dll TCPSVCS.EXE tdc.ocx tdh.dll TDLMigration.dll TEEManagement64.dll telephon.cpl TelephonyInteractiveUser.dll TelephonyInteractiveUserRes.dll tellib.dll TempSignedLicenseExchangeTask.dll T enantRestrictionsPlugin.dll termmgr.dll termsrv.dll tetheringclient.dll tetheringconfigsp.dll TetheringIeProvider.dll TetheringMgr.dll tetheringservice.dll TetheringStation.dll TextInputFramework.dll TextInputMethodFormatter.dll TextShaping.dll th-TH themecpl.dll The mes.SsfDownload.ScheduledTask.dll themeservice.dll themeui.dll ThirdPartyNoticesBySHS.txt threadpoolwinrt.dll thumbcache.dll ThumbnailExtractionHost.exe ti-et tier2punctuations.dll TieringEngineProxy.dll TieringEngineService.exe TileDataRepository.dll TimeBrokerClien t.dll TimeBrokerServer.dll timedate.cpl TimeDateMUICallback.dll timeout.exe timesync.dll TimeSyncTask.dll TKCtrl2k64.sys TKFsAv64.sys TKFsFt64.sys TKFWFV.inf TKFWFV64.cat TKFWFV64.sys tkfwvt64.sys TKIdsVt64.sys TKPcFtCb64.sys TKPcFtCb64.sys_ TKPcFtHk64.sys TKRgAc2k64 .sys TKRgFtXp64.sys TKTool2k.sys TKTool2k64.sys tlscsp.dll tokenbinding.dll TokenBroker.dll TokenBrokerCookies.exe TokenBrokerUI.dll tpm.msc TpmCertResources.dll tpmcompc.dll TpmCoreProvisioning.dll TpmInit.exe TpmTasks.dll TpmTool.exe tpmvsc.dll tpmvscmgr.exe tpmvsc mgrsvr.exe tquery.dll tr-TR tracerpt.exe TRACERT.EXE traffic.dll TransformPPSToWlan.xslt TransformPPSToWlanCredentials.xslt TransliterationRanker.dll TransportDSA.dll tree.com trie.dll trkwks.dll TrustedSignalCredProv.dll tsbyuv.dll tscfgwmi.dll tscon.exe tsdiscon.ex e TSErrRedir.dll tsf3gip.dll tsgqec.dll tskill.exe tsmf.dll TSpkg.dll tspubwmi.dll TSSessionUX.dll tssrvlic.dll TSTheme.exe TsUsbGDCoInstaller.dll TsUsbRedirectionGroupPolicyExtension.dll TSWbPrxy.exe TSWorkspace.dll TsWpfWrp.exe ttdinject.exe ttdloader.dll ttdplm.dl l ttdrecord.dll ttdrecordcpu.dll TtlsAuth.dll TtlsCfg.dll TtlsExt.dll tttracer.exe tvratings.dll twext.dll twinapi.appcore.dll twinapi.dll twinui.appcore.dll twinui.dll twinui.pcshell.dll txflog.dll txfw32.dll typeperf.exe tzautoupdate.dll tzres.dll tzsync.exe tzsync res.dll tzutil.exe ubpm.dll ucmhc.dll ucrtbase.dll ucrtbased.dll 
ucrtbase_clr0400.dll ucrtbase_enclave.dll ucsvc.exe udhisapi.dll uDWM.dll UefiCsp.dll UevAgentPolicyGenerator.exe UevAppMonitor.exe UevAppMonitor.exe.config UevCustomActionTypes.tlb UevTemplateBaselineG enerator.exe UevTemplateConfigItemGenerator.exe uexfat.dll ufat.dll UiaManager.dll UIAnimation.dll UIAutomationCore.dll uicom.dll UIManagerBrokerps.dll UIMgrBroker.exe uireng.dll UIRibbon.dll UIRibbonRes.dll uk-UA ulib.dll umb.dll umdmxfrm.dll umpdc.dll umpnpmgr.dll umpo-overrides.dll umpo.dll umpoext.dll umpowmi.dll umrdp.dll unattend.dll unenrollhook.dll unimdm.tsp unimdmat.dll uniplat.dll Unistore.dll unlodctr.exe UNP unregmp2.exe untfs.dll UpdateAgent.dll updatecsp.dll UpdateDeploymentProvider.dll UpdateHeartbeat.dll updatep olicy.dll upfc.exe UpgradeResultsUI.exe upnp.dll upnpcont.exe upnphost.dll UPPrinterInstaller.exe UPPrinterInstallsCSP.dll upshared.dll uReFS.dll uReFSv1.dll ureg.dll url.dll urlmon.dll UsbCApi.dll usbceip.dll usbmon.dll usbperf.dll UsbPmApi.dll UsbSettingsHandlers.d ll UsbTask.dll usbui.dll user32.dll UserAccountBroker.exe UserAccountControlSettings.dll UserAccountControlSettings.exe useractivitybroker.dll usercpl.dll UserDataAccessRes.dll UserDataAccountApis.dll UserDataLanguageUtil.dll UserDataPlatformHelperUtil.dll UserDataSe rvice.dll UserDataTimeUtil.dll UserDataTypeHelperUtil.dll UserDeviceRegistration.dll UserDeviceRegistration.Ngc.dll userenv.dll userinit.exe userinitext.dll UserLanguageProfileCallback.dll usermgr.dll usermgrcli.dll UserMgrProxy.dll usk.rs usoapi.dll UsoClient.exe us ocoreps.dll usocoreworker.exe usosvc.dll usp10.dll ustprov.dll UtcDecoderHost.exe UtcManaged.dll utcutil.dll utildll.dll Utilman.exe uudf.dll UvcModel.dll uwfcfgmgmt.dll uwfcsp.dll uwfservicingapi.dll UXInit.dll uxlib.dll uxlibres.dll uxtheme.dll vac.dll VAN.dll Vaul t.dll VaultCDS.dll vaultcli.dll VaultCmd.exe VaultRoaming.dll vaultsvc.dll VBICodec.ax vbisurf.ax vbsapi.dll vbscript.dll vbssysprep.dll vcamp120.dll vcamp140.dll vcamp140d.dll VCardParser.dll vccorlib110.dll vccorlib120.dll vccorlib140.dll vccorlib140d.dll vcomp100. 
dll vcomp110.dll vcomp120.dll vcomp140.dll vcomp140d.dll vcruntime140.dll vcruntime140d.dll vcruntime140_1.dll vcruntime140_1d.dll vcruntime140_clr0400.dll vds.exe vdsbas.dll vdsdyn.dll vdsldr.exe vdsutil.dll vdsvd.dll vds_ps.dll verclsid.exe verifier.dll verifier.ex e verifiergui.exe version.dll vertdll.dll vfbasics.dll vfcompat.dll vfcuzz.dll vfluapriv.dll vfnet.dll vfntlmless.dll vfnws.dll vfprint.dll vfprintpthelper.dll vfrdvcompat.dll vfuprov.dll vfwwdm32.dll VhfUm.dll vid.dll vidcap.ax VideoHandlers.dll VIDRESZR.DLL virtdis k.dll VirtualMonitorManager.dll VmApplicationHealthMonitorProxy.dll vmbuspipe.dll vmdevicehost.dll vmictimeprovider.dll vmrdvcore.dll VocabRoamingHandler.dll VoiceActivationManager.dll VoipRT.dll vpnike.dll vpnikeapi.dll VpnSohDesktop.dll VPNv2CSP.dll vrfcore.dll Vsc MgrPS.dll vscover160.dll VSD3DWARPDebug.dll VsGraphicsCapture.dll VsGraphicsDesktopEngine.exe VsGraphicsExperiment.dll VsGraphicsHelper.dll VsGraphicsProxyStub.dll VsGraphicsRemoteEngine.exe vsjitdebugger.exe VSPerf160.dll vssadmin.exe vssapi.dll vsstrace.dll VSSVC.e xe vss_ps.dll vulkan-1-999-0-0-0.dll vulkan-1.dll vulkaninfo-1-999-0-0-0.exe vulkaninfo.exe w32time.dll w32tm.exe w32topl.dll WaaSAssessment.dll WaaSMedicAgent.exe WaaSMedicCapsule.dll WaaSMedicPS.dll WaaSMedicSvc.dll WABSyncProvider.dll waitfor.exe WalletBackgroundS erviceProxy.dll WalletProxy.dll WalletService.dll WallpaperHost.exe wavemsp.dll wbadmin.exe wbem wbemcomn.dll wbengine.exe wbiosrvc.dll wci.dll wcimage.dll wcmapi.dll wcmcsp.dll wcmsvc.dll WCN WcnApi.dll wcncsvc.dll WcnEapAuthProxy.dll WcnEapPeerProxy.dll WcnNetsh.dl l wcnwiz.dll wc_storage.dll wdc.dll WDI wdi.dll wdigest.dll wdmaud.drv wdscore.dll WdsUnattendTemplate.xml WEB.rs webauthn.dll WebcamUi.dll webcheck.dll WebClnt.dll webio.dll webplatstorageserver.dll WebRuntimeManager.dll webservices.dll Websocket.dll wecapi.dll wecs vc.dll wecutil.exe wephostsvc.dll wer.dll werconcpl.dll wercplsupport.dll werdiagcontroller.dll WerEnc.dll weretw.dll WerFault.exe WerFaultSecure.exe wermgr.exe wersvc.dll werui.dll wevtapi.dll wevtfwd.dll wevtsvc.dll wevtutil.exe wextract.exe WF.msc wfapigp.dll wfdp rov.dll WFDSConMgr.dll WFDSConMgrSvc.dll WfHC.dll WFS.exe WFSR.dll whealogr.dll where.exe whhelper.dll whoami.exe wiaacmgr.exe wiaaut.dll wiadefui.dll wiadss.dll WiaExtensionHost64.dll wiarpc.dll wiascanprofiles.dll wiaservc.dll wiashext.dll wiatrace.dll wiawow64.exe WiFiCloudStore.dll WiFiConfigSP.dll wifidatacapabilityhandler.dll WiFiDisplay.dll wifinetworkmanager.dll wifitask.exe WimBootCompress.ini wimgapi.dll wimserv.exe win32appinventorycsp.dll Win32AppSettingsProvider.dll Win32CompatibilityAppraiserCSP.dll win32k.sys win3 2kbase.sys win32kfull.sys win32kns.sys win32spl.dll win32u.dll Win32_DeviceGuard.dll winbio.dll WinBioDatabase WinBioDataModel.dll WinBioDataModelOOBE.exe winbioext.dll WinBioPlugIns winbrand.dll wincorlib.dll wincredprovider.dll wincredui.dll WindowManagement.dll Wi ndowManagementAPI.dll Windows.AccountsControl.dll Windows.AI.MachineLearning.dll Windows.AI.MachineLearning.Preview.dll Windows.ApplicationModel.Background.SystemEventsBroker.dll Windows.ApplicationModel.Background.TimeBroker.dll Windows.ApplicationModel.Conversation alAgent.dll windows.applicationmodel.conversationalagent.internal.proxystub.dll windows.applicationmodel.conversationalagent.proxystub.dll Windows.ApplicationModel.Core.dll windows.applicationmodel.datatransfer.dll Windows.ApplicationModel.dll Windows.ApplicationMode l.LockScreen.dll Windows.ApplicationModel.Store.dll 
Windows.ApplicationModel.Store.Preview.DOSettings.dll Windows.ApplicationModel.Store.TestingFramework.dll Windows.ApplicationModel.Wallet.dll Windows.CloudStore.dll Windows.CloudStore.Schema.DesktopShell.dll Windows .CloudStore.Schema.Shell.dll Windows.Cortana.Desktop.dll Windows.Cortana.OneCore.dll Windows.Cortana.ProxyStub.dll Windows.Data.Activities.dll Windows.Data.Pdf.dll Windows.Devices.AllJoyn.dll Windows.Devices.Background.dll Windows.Devices.Background.ps.dll Windows.De vices.Bluetooth.dll Windows.Devices.Custom.dll Windows.Devices.Custom.ps.dll Windows.Devices.Enumeration.dll Windows.Devices.Haptics.dll Windows.Devices.HumanInterfaceDevice.dll Windows.Devices.Lights.dll Windows.Devices.LowLevel.dll Windows.Devices.Midi.dll Windows. Devices.Perception.dll Windows.Devices.Picker.dll Windows.Devices.PointOfService.dll Windows.Devices.Portable.dll Windows.Devices.Printers.dll Windows.Devices.Printers.Extensions.dll Windows.Devices.Radios.dll Windows.Devices.Scanners.dll Windows.Devices.Sensors.dll Windows.Devices.SerialCommunication.dll Windows.Devices.SmartCards.dll Windows.Devices.SmartCards.Phone.dll Windows.Devices.Usb.dll Windows.Devices.WiFi.dll Windows.Devices.WiFiDirect.dll Windows.Energy.dll Windows.FileExplorer.Common.dll Windows.Gaming.Input.dll Win dows.Gaming.Preview.dll Windows.Gaming.UI.GameBar.dll Windows.Gaming.XboxLive.Storage.dll Windows.Globalization.dll Windows.Globalization.Fontgroups.dll Windows.Globalization.PhoneNumberFormatting.dll Windows.Graphics.Display.BrightnessOverride.dll Windows.Graphics.D isplay.DisplayEnhancementOverride.dll Windows.Graphics.dll Windows.Graphics.Printing.3D.dll Windows.Graphics.Printing.dll Windows.Graphics.Printing.Workflow.dll Windows.Graphics.Printing.Workflow.Native.dll Windows.Help.Runtime.dll windows.immersiveshell.serviceprovi der.dll Windows.Internal.AdaptiveCards.XamlCardRenderer.dll Windows.Internal.Bluetooth.dll Windows.Internal.CapturePicker.Desktop.dll Windows.Internal.CapturePicker.dll Windows.Internal.Devices.Sensors.dll Windows.Internal.Feedback.Analog.dll Windows.Internal.Feedbac k.Analog.ProxyStub.dll Windows.Internal.Graphics.Display.DisplayColorManagement.dll Windows.Internal.Graphics.Display.DisplayEnhancementManagement.dll Windows.Internal.Management.dll Windows.Internal.Management.SecureAssessment.dll Windows.Internal.PlatformExtension. 
DevicePickerExperience.dll Windows.Internal.PlatformExtension.MiracastBannerExperience.dll Windows.Internal.PredictionUnit.dll Windows.Internal.Security.Attestation.DeviceAttestation.dll Windows.Internal.SecurityMitigationsBroker.dll Windows.Internal.Shell.Broker.dll windows.internal.shellcommon.AccountsControlExperience.dll windows.internal.shellcommon.AppResolverModal.dll Windows.Internal.ShellCommon.Broker.dll windows.internal.shellcommon.FilePickerExperienceMEM.dll Windows.Internal.ShellCommon.PrintExperience.dll windows.int ernal.shellcommon.shareexperience.dll windows.internal.shellcommon.TokenBrokerModal.dll Windows.Internal.Signals.dll Windows.Internal.System.UserProfile.dll Windows.Internal.Taskbar.dll Windows.Internal.UI.BioEnrollment.ProxyStub.dll Windows.Internal.UI.Logon.ProxySt ub.dll Windows.Internal.UI.Shell.WindowTabManager.dll Windows.Management.EnrollmentStatusTracking.ConfigProvider.dll Windows.Management.InprocObjects.dll Windows.Management.ModernDeployment.ConfigProviders.dll Windows.Management.Provisioning.ProxyStub.dll Windows.Man agement.SecureAssessment.CfgProvider.dll Windows.Management.SecureAssessment.Diagnostics.dll Windows.Management.Service.dll Windows.Management.Workplace.dll Windows.Management.Workplace.WorkplaceSettings.dll Windows.Media.Audio.dll Windows.Media.BackgroundMediaPlayba ck.dll Windows.Media.BackgroundPlayback.exe Windows.Media.Devices.dll Windows.Media.dll Windows.Media.Editing.dll Windows.Media.FaceAnalysis.dll Windows.Media.Import.dll Windows.Media.MediaControl.dll Windows.Media.MixedRealityCapture.dll Windows.Media.Ocr.dll Window s.Media.Playback.BackgroundMediaPlayer.dll Windows.Media.Playback.MediaPlayer.dll Windows.Media.Playback.ProxyStub.dll Windows.Media.Protection.PlayReady.dll Windows.Media.Renewal.dll Windows.Media.Speech.dll Windows.Media.Speech.UXRes.dll Windows.Media.Streaming.dll Windows.Media.Streaming.ps.dll Windows.Mirage.dll Windows.Mirage.Internal.Capture.Pipeline.ProxyStub.dll Windows.Mirage.Internal.dll Windows.Networking.BackgroundTransfer.BackgroundManagerPolicy.dll Windows.Networking.BackgroundTransfer.ContentPrefetchTask.dll Windo ws.Networking.BackgroundTransfer.dll Windows.Networking.Connectivity.dll Windows.Networking.dll Windows.Networking.HostName.dll Windows.Networking.NetworkOperators.ESim.dll Windows.Networking.NetworkOperators.HotspotAuthentication.dll Windows.Networking.Proximity.dll Windows.Networking.ServiceDiscovery.Dnssd.dll Windows.Networking.Sockets.PushEnabledApplication.dll Windows.Networking.UX.EapRequestHandler.dll Windows.Networking.Vpn.dll Windows.Networking.XboxLive.ProxyStub.dll Windows.Payments.dll Windows.Perception.Stub.dll Wind ows.Security.Authentication.Identity.Provider.dll Windows.Security.Authentication.OnlineId.dll Windows.Security.Authentication.Web.Core.dll Windows.Security.Credentials.UI.CredentialPicker.dll Windows.Security.Credentials.UI.UserConsentVerifier.dll Windows.Security.I ntegrity.dll Windows.Services.TargetedContent.dll Windows.SharedPC.AccountManager.dll Windows.SharedPC.CredentialProvider.dll Windows.Shell.BlueLightReduction.dll Windows.Shell.ServiceHostBuilder.dll Windows.Shell.StartLayoutPopulationEvents.dll Windows.StateReposito ry.dll Windows.StateRepositoryBroker.dll Windows.StateRepositoryClient.dll Windows.StateRepositoryCore.dll Windows.StateRepositoryPS.dll Windows.StateRepositoryUpgrade.dll Windows.Storage.ApplicationData.dll Windows.Storage.Compression.dll windows.storage.dll Windows .Storage.OneCore.dll Windows.Storage.Search.dll 
Windows.System.Diagnostics.dll Windows.System.Diagnostics.Telemetry.PlatformTelemetryClient.dll Windows.System.Diagnostics.TraceReporting.PlatformDiagnosticActions.dll Windows.System.Launcher.dll Windows.System.Profile. HardwareId.dll Windows.System.Profile.PlatformDiagnosticsAndUsageDataSettings.dll Windows.System.Profile.RetailInfo.dll Windows.System.Profile.SystemId.dll Windows.System.Profile.SystemManufacturers.dll Windows.System.RemoteDesktop.dll Windows.System.SystemManagement .dll Windows.System.UserDeviceAssociation.dll Windows.System.UserProfile.DiagnosticsSettings.dll Windows.UI.Accessibility.dll Windows.UI.AppDefaults.dll Windows.UI.BioFeedback.dll Windows.UI.BlockedShutdown.dll Windows.UI.Core.TextInput.dll Windows.UI.Cred.dll Window s.UI.CredDialogController.dll Windows.UI.dll Windows.UI.FileExplorer.dll Windows.UI.Immersive.dll Windows.UI.Input.Inking.Analysis.dll Windows.UI.Input.Inking.dll Windows.UI.Internal.Input.ExpressiveInput.dll Windows.UI.Internal.Input.ExpressiveInput.Resource.dll Win dows.UI.Logon.dll Windows.UI.NetworkUXController.dll Windows.UI.PicturePassword.dll Windows.UI.Search.dll Windows.UI.Shell.dll Windows.UI.Shell.Internal.AdaptiveCards.dll Windows.UI.Storage.dll Windows.UI.Xaml.Controls.dll Windows.UI.Xaml.dll Windows.UI.Xaml.InkContr ols.dll Windows.UI.Xaml.Maps.dll Windows.UI.Xaml.Phone.dll Windows.UI.Xaml.Resources.19h1.dll Windows.UI.Xaml.Resources.Common.dll Windows.UI.Xaml.Resources.rs1.dll Windows.UI.Xaml.Resources.rs2.dll Windows.UI.Xaml.Resources.rs3.dll Windows.UI.Xaml.Resources.rs4.dll Windows.UI.Xaml.Resources.rs5.dll Windows.UI.Xaml.Resources.th.dll Windows.UI.Xaml.Resources.win81.dll Windows.UI.Xaml.Resources.win8rtm.dll Windows.UI.XamlHost.dll Windows.WARP.JITService.dll Windows.WARP.JITService.exe Windows.Web.Diagnostics.dll Windows.Web.dll Wi ndows.Web.Http.dll WindowsActionDialog.exe WindowsCodecs.dll WindowsCodecsExt.dll WindowsCodecsRaw.dll WindowsCodecsRaw.txt WindowsDefaultHeatProcessor.dll windowsdefenderapplicationguardcsp.dll WindowsInternal.ComposableShell.ComposerFramework.dll WindowsInternal.Co mposableShell.DesktopHosting.dll WindowsInternal.Shell.CompUiActivation.dll WindowsIoTCsp.dll windowslivelogin.dll WindowsManagementServiceWinRt.ProxyStub.dll windowsperformancerecordercontrol.dll WindowsPowerShell WindowsSecurityIcon.png windowsudk.shellcommon.dll W indowsUpdateElevatedInstaller.exe winethc.dll winevt WinFax.dll winhttp.dll winhttpcom.dll WinHvEmulation.dll WinHvPlatform.dll wininet.dll wininetlui.dll wininit.exe wininitext.dll winipcfile.dll winipcsecproc.dll winipsec.dll winjson.dll Winlangdb.dll winload.efi w inload.exe winlogon.exe winlogonext.dll winmde.dll WinMetadata winml.dll winmm.dll winmmbase.dll winmsipc.dll WinMsoIrmProtector.dll winnlsres.dll winnsi.dll WinOpcIrmProtector.dll WinREAgent.dll winresume.efi winresume.exe winrm winrm.cmd winrm.vbs winrnr.dll winrs. exe winrscmd.dll winrshost.exe winrsmgr.dll winrssrv.dll WinRTNetMUAHostServer.exe WinRtTracing.dll WinSAT.exe WinSATAPI.dll WinSCard.dll WinSetupUI.dll winshfhc.dll winsku.dll winsockhc.dll winspool.drv winsqlite3.dll WINSRPC.DLL winsrv.dll winsrvext.dll winsta.dll WinSync.dll WinSyncMetastore.dll WinSyncProviders.dll wintrust.dll WinTypes.dll winusb.dll winver.exe WiredNetworkCSP.dll wisp.dll witnesswmiv2provider.dll wkscli.dll wkspbroker.exe wkspbrokerAx.dll wksprt.exe wksprtPS.dll wkssvc.dll wlanapi.dll wlancfg.dll WLanConn. 
dll wlandlg.dll wlanext.exe wlangpui.dll WLanHC.dll wlanhlp.dll WlanMediaManager.dll WlanMM.dll wlanmsm.dll wlanpref.dll WlanRadioManager.dll wlansec.dll wlansvc.dll wlansvcpal.dll wlanui.dll wlanutil.dll Wldap32.dll wldp.dll wlgpclnt.dll wlidcli.dll wlidcredprov.dll wlidfdp.dll wlidnsp.dll wlidprov.dll wlidres.dll wlidsvc.dll wlrmdr.exe WMADMOD.DLL WMADMOE.DLL WMALFXGFXDSP.dll WMASF.DLL wmcodecdspps.dll wmdmlog.dll wmdmps.dll wmdrmsdk.dll wmerror.dll wmi.dll wmiclnt.dll wmicmiplugin.dll wmidcom.dll wmidx.dll WmiMgmt.msc wmiprop .dll wmitomi.dll WMNetMgr.dll wmp.dll WMPDMC.exe WmpDui.dll wmpdxm.dll wmpeffects.dll WMPhoto.dll wmploc.DLL wmpps.dll wmpshell.dll wmsgapi.dll WMSPDMOD.DLL WMSPDMOE.DLL WMVCORE.DLL WMVDECOD.DLL wmvdspa.dll WMVENCOD.DLL WMVSDECD.DLL WMVSENCD.DLL WMVXENCD.DLL WofTasks .dll WofUtil.dll WordBreakers.dll WorkFolders.exe WorkfoldersControl.dll WorkFoldersGPExt.dll WorkFoldersRes.dll WorkFoldersShell.dll workfolderssvc.dll wosc.dll wow64.dll wow64cpu.dll wow64win.dll wowreg32.exe WpAXHolder.dll wpbcreds.dll Wpc.dll WpcApi.dll wpcatltoa st.png WpcDesktopMonSvc.dll WpcMon.exe wpcmon.png WpcProxyStubs.dll WpcRefreshTask.dll WpcTok.exe WpcWebFilter.dll wpdbusenum.dll WpdMtp.dll WpdMtpUS.dll wpdshext.dll WPDShextAutoplay.exe WPDShServiceObj.dll WPDSp.dll wpd_ci.dll wpnapps.dll wpnclient.dll wpncore.dll wpninprc.dll wpnpinst.exe wpnprv.dll wpnservice.dll wpnsruprov.dll WpnUserService.dll WpPortingLibrary.dll WppRecorderUM.dll wpr.config.xml wpr.exe WPTaskScheduler.dll wpx.dll write.exe ws2help.dll ws2_32.dll wscadminui.exe wscapi.dll wscinterop.dll wscisvif.dll WSCl ient.dll WSCollect.exe wscproxystub.dll wscript.exe wscsvc.dll wscui.cpl WSDApi.dll wsdchngr.dll WSDPrintProxy.DLL WsdProviderUtil.dll WSDScanProxy.dll wsecedit.dll wsepno.dll wshbth.dll wshcon.dll wshelper.dll wshext.dll wshhyperv.dll wship6.dll wshom.ocx wshqos.dll wshrm.dll WSHTCPIP.DLL wshunix.dll wsl.exe wslapi.dll WsmAgent.dll wsmanconfig_schema.xml WSManHTTPConfig.exe WSManMigrationPlugin.dll WsmAuto.dll wsmplpxy.dll wsmprovhost.exe WsmPty.xsl WsmRes.dll WsmSvc.dll WsmTxt.xsl WsmWmiPl.dll wsnmp32.dll wsock32.dll wsplib.dl l wsp_fs.dll wsp_health.dll wsp_sr.dll wsqmcons.exe WSReset.exe WSTPager.ax wtsapi32.dll wuapi.dll wuapihost.exe wuauclt.exe wuaueng.dll wuceffects.dll WUDFCoinstaller.dll WUDFCompanionHost.exe WUDFHost.exe WUDFPlatform.dll WudfSMCClassExt.dll WUDFx.dll WUDFx02000.dl l wudriver.dll wups.dll wups2.dll wusa.exe wuuhext.dll wuuhosdeployment.dll wvc.dll WwaApi.dll WwaExt.dll WWAHost.exe WWanAPI.dll wwancfg.dll wwanconn.dll WWanHC.dll wwanmm.dll Wwanpref.dll wwanprotdim.dll WwanRadioManager.dll wwansvc.dll wwapi.dll XamlTileRender.dll XAudio2_8.dll XAudio2_9.dll XblAuthManager.dll XblAuthManagerProxy.dll XblAuthTokenBrokerExt.dll XblGameSave.dll XblGameSaveExt.dll XblGameSaveProxy.dll XblGameSaveTask.exe XboxGipRadioManager.dll xboxgipsvc.dll xboxgipsynthetic.dll XboxNetApiSvc.dll xcopy.exe XInput1_4.dll XInput9_1_0.dll XInputUap.dll xmlfilter.dll xmllite.dll xmlprovi.dll xolehlp.dll XpsDocumentTargetPrint.dll XpsGdiConverter.dll XpsPrint.dll xpspushlayer.dll XpsRasterService.dll xpsservices.dll XpsToPclmConverter.dll XpsToPwgrConverter.dll xwizard.dtd xwizard.exe xwizards.dll xwreg.dll xwtpdui.dll xwtpw32.dll X_80.contrast-black.png X_80.contrast-white.png X_80.png ze_loader.dll ze_tracing_layer.dll ze_validation_layer.dll zh-CN zh-TW zipcontainer.dll zipfldr.dll ztrace_maps.dll /cygdrive/c/Windows: addins AhnInst.log appcompat Application Data apppatch AppReadiness assembly bcastdvr bfsvc.exe 
BitLockerDiscoveryVolumeContents Boot bootstat.dat Branding CbsTemp Containers CSC Cursors debug diagnostics DiagTrack DigitalLocker Downloaded Program Files DtcInstall.log ELAMBKUP en-US explorer.exe Fonts GameBarPresenceWriter gethelp_audiotroubleshooter_latestpackage.zip Globalization Help HelpPane.exe hh.exe hipiw.dll IdentityCRL ImageSAFERSvc.exe IME IMGSF50Svc.exe ImmersiveControlPanel INF InputMethod Installer ko-KR L2Schemas LanguageOverlayCache LiveKernelReports Logs lsasetup.log Media mib.bin Microsoft.NET Migration ModemLogs notepad.exe OCR Offline Web Pages Panther Performance PFRO.log PLA PolicyDefinitions Prefetch PrintDialog Professional.xml Provisioning regedit.exe Registration RemotePackages rescache Resources RtlExUpd.dll SchCache schemas security ServiceProfiles ServiceState servicing Setup setupact.log setuperr.log ShellComponents ShellExperiences SHELLNEW SKB SoftwareDistribution Speech Speech_OneCore splwow64. exe System system.ini System32 SystemApps SystemResources SystemTemp SysWOW64 TAPI Tasks Temp TempInst tracing twain_32 twain_32.dll Vss WaaS Web win.ini WindowsShell.Manifest WindowsUpdate.log winhlp32.exe WinSxS WMSysPr9.prx write.exe /cygdrive/c/Windows/System32/Wbem: aeinv.mof AgentWmi.mof AgentWmiUninstall.mof appbackgroundtask.dll appbackgroundtask.mof appbackgroundtask_uninstall.mof AuditRsop.mof authfwcfg.mof AutoRecover bcd.mof BthMtpEnum.mof cimdmtf.mof cimwin32.dll cimwin32.mof CIWm i.mof classlog.mof cli.mof cliegaliases.mof ddp.mof dimsjob.mof dimsroam.mof DMWmiBridgeProv.dll DMWmiBridgeProv.mof DMWmiBridgeProv1.dll DMWmiBridgeProv1.mof DMWmiBridgeProv1_Uninstall.mof DMWmiBridgeProv_Uninstall.mof dnsclientcim.dll dnsclientcim.mof dnsclientpspr ovider.dll dnsclientpsprovider.mof dnsclientpsprovider_Uninstall.mof drvinst.mof DscCore.mof DscCoreConfProv.mof dscproxy.mof Dscpspluginwkr.dll DscTimer.mof dsprov.dll dsprov.mof eaimeapi.mof EmbeddedLockdownWmi.dll embeddedlockdownwmi.mof embeddedlockdownwmi_Uninst all.mof en en-US esscli.dll EventTracingManagement.dll EventTracingManagement.mof fastprox.dll fdPHost.mof fdrespub.mof fdSSDP.mof fdWNet.mof fdWSD.mof filetrace.mof firewallapi.mof FolderRedirectionWMIProvider.mof FunDisc.mof fwcfg.mof hbaapi.mof hnetcfg.mof IMAPIv2 -Base.mof IMAPIv2-FileSystemSupport.mof IMAPIv2-LegacyShim.mof interop.mof IpmiDTrc.mof ipmiprr.dll ipmiprv.dll ipmiprv.mof IpmiPTrc.mof ipsecsvc.mof iscsidsc.mof iscsihba.mof iscsiprf.mof iscsirem.mof iscsiwmiv2.mof iscsiwmiv2_uninstall.mof kerberos.mof ko ko-KR Krn lProv.dll krnlprov.mof L2SecHC.mof lltdio.mof lltdsvc.mof Logs lsasrv.mof mblctr.mof MDMAppProv.dll MDMAppProv.mof MDMAppProv_Uninstall.mof MDMSettingsProv.dll MDMSettingsProv.mof MDMSettingsProv_Uninstall.mof Microsoft-Windows-OfflineFiles.mof Microsoft-Windows-Remo te-FileSystem.mof Microsoft.AppV.AppVClientWmi.dll Microsoft.AppV.AppVClientWmi.mof Microsoft.Uev.AgentWmi.dll Microsoft.Uev.ManagedAgentWmi.mof Microsoft.Uev.ManagedAgentWmiUninstall.mof mispace.mof mispace_uninstall.mof mmc.mof MMFUtil.dll MOF mofcomp.exe mofd.dll mofinstall.dll mountmgr.mof mpeval.mof mpsdrv.mof mpssvc.mof msdtcwmi.dll MsDtcWmi.mof msfeeds.mof msfeedsbs.mof msi.mof msiprov.dll msiscsi.mof MsNetImPlatform.mof mstsc.mof mstscax.mof msv1_0.mof mswmdm.mof NCProv.dll ncprov.mof ncsi.mof ndisimplatcim.dll ndistrace .mof NetAdapterCim.dll NetAdapterCim.mof NetAdapterCimTrace.mof NetAdapterCimTraceUninstall.mof NetAdapterCim_uninstall.mof netdacim.dll netdacim.mof netdacim_uninstall.mof NetEventPacketCapture.dll NetEventPacketCapture.mof 
NetEventPacketCapture_uninstall.mof netncc im.dll netnccim.mof netnccim_uninstall.mof NetPeerDistCim.dll NetPeerDistCim.mof NetPeerDistCim_uninstall.mof netprofm.mof NetSwitchTeam.mof netswitchteamcim.dll NetTCPIP.dll NetTCPIP.mof NetTCPIP_Uninstall.mof netttcim.dll netttcim.mof netttcim_uninstall.mof network itemfactory.mof newdev.mof nlasvc.mof nlmcim.dll nlmcim.mof nlmcim_uninstall.mof nlsvc.mof npivwmi.mof nshipsec.mof ntevt.dll ntevt.mof ntfs.mof OfflineFilesConfigurationWmiProvider.mof OfflineFilesConfigurationWmiProvider_Uninstall.mof OfflineFilesWmiProvider.mof Of flineFilesWmiProvider_Uninstall.mof p2p-mesh.mof p2p-pnrp.mof pcsvDevice.mof pcsvDevice_Uninstall.mof Performance PNPXAssoc.mof PolicMan.dll PolicMan.mof polproc.mof polprocl.mof polprou.mof polstore.mof portabledeviceapi.mof portabledeviceclassextension.mof portable deviceconnectapi.mof portabledevicetypes.mof portabledevicewiacompat.mof powermeterprovider.mof PowerPolicyProvider.mof ppcRsopCompSchema.mof ppcRsopUserSchema.mof PrintFilterPipelineSvc.mof PrintManagementProvider.dll PrintManagementProvider.mof PrintManagementProvider_Uninstall.mof profileassociationprovider.mof PS_MMAgent.mof qmgr.mof qoswmi.dll qoswmi.mof qoswmitrc.mof qoswmitrc_uninstall.mof qoswmi_uninstall.mof RacWmiProv.dll RacWmiProv.mof rawxml.xsl rdpendp.mof rdpinit.mof rdpshell.mof refs.mof refsv1.mof regevent.mof Remove.Microsoft.AppV.AppvClientWmi.mof repdrvfs.dll Repository rsop.mof rspndr.mof samsrv.mof scersop.mof schannel.mof schedprov.dll SchedProv.mof scm.mof scrcons.exe scrcons.mof sdbus.mof secrcw32.mof SensorsClassExtension.mof ServDeps.dll ServiceModel.mof ServiceModel.mof.uninstall ServiceModel35.mof ServiceModel35.mof.uninstall services.mof setupapi.mof SmbWitnessWmiv2Provider.mof smbwmiv2.mof SMTPCons.dll smtpcons.mof sppwmi.mof sr.mof sstpsvc.mof stdprov.dll storagewmi.mof storagewmi_passthru.mof storagewmi_passthru_uninstall.mof storagewmi_uninstall.mof stortrace.mof subscrpt.mof system.mof tcpip.mof texttable.xsl textvaluelist.xsl tmf tsallow.mof tscfgwmi.mof tsmf.mof tspkg.mof umb.mof umbus.mof umpass.mof umpnpmgr.mof unsecapp.exe UserProfileConfigurationWmiProvider.mof UserProfileWmiProvider.mof UserStateWMIProvider.mof vds.mof vdswmi.dll viewprov.dll vpnclientpsprovider.dll vpnclientpsprovider.mof vpnclientpsprovider_Uninstall.mof vss.mof vsswmi.dll wbemcntl.dll wbemcons.dll WBEMCons.mof wbemcore.dll wbemdisp.dll wbemdisp.tlb wbemess.dll wbemprox.dll wbemsvc.dll wbemtest.exe wcncsvc.mof WdacEtwProv.mof WdacWmiProv.dll WdacWmiProv.mof WdacWmiProv_Uninstall.mof Wdf01000.mof Wdf01000Uninstall.mof wdigest.mof WFAPIGP.mof wfascim.dll wfascim.mof wfascim_uninstall.mof WFP.MOF wfs.mof whqlprov.mof Win32_DeviceGuard.mof Win32_EncryptableVolume.dll win32_encryptablevolume.mof Win32_EncryptableVolumeUninstall.mof win32_printer.mof Win32_Tpm.dll Win32_Tpm.mof wininit.mof winipsec.mof winlogon.mof WinMgmt.exe WinMgmtR.dll Winsat.mof WinsatUninstall.mof wlan.mof WLanHC.mof wmi.mof WMIADAP.exe WmiApRes.dll WmiApRpl.dll WmiApSrv.exe WMIC.exe WMICOOKR.dll WmiDcPrv.dll wmipcima.dll wmipcima.mof wmipdfs.dll wmipdfs.mof wmipdskq.dll wmipdskq.mof WmiPerfClass.dll WmiPerfClass.mof WmiPerfInst.dll WmiPerfInst.mof WMIPICMP.dll wmipicmp.mof WMIPIPRT.dll wmipiprt.mof WMIPJOBJ.dll wmipjobj.mof wmiprov.dll WmiPrvSD.dll WmiPrvSE.exe WMIPSESS.dll wmipsess.mof WMIsvc.dll wmitimep.dll wmitimep.mof wmiutils.dll WMI_Tracing.mof wmp.mof wmpnetwk.mof wpdbusenum.mof wpdcomp.mof wpdfs.mof wpdmtp.mof wpdshext.mof WPDShServiceObj.mof wpdsp.mof wpd_ci.mof 
wscenter.mof WsmAgent.mof WsmAgentUninstall.mof WsmAuto.mof wsp_fs.mof wsp_fs_uninstall.mof wsp_health.mof wsp_health_uninstall.mof wsp_sr.mof wsp_sr_uninstall.mof WUDFx.mof Wudfx02000.mof Wudfx02000Uninstall.mof WUDFxUninstall.mof xml xsl-mappings.xml xwizards.mof /cygdrive/c/Windows/System32/WindowsPowerShell/v1.0: Certificate.format.ps1xml Diagnostics.Format.ps1xml DotNetTypes.format.ps1xml en en-US Event.Format.ps1xml Examples FileSystem.format.ps1xml getevent.types.ps1xml Help.format.ps1xml HelpV3.format.ps1xml ko ko-KR Modules powershell.exe powershell.exe.config PowerShellCore.format.ps1xml PowerShellTrace.format.ps1xml powershell_ise.exe powershell_ise.exe.config PSEvents.dll pspluginwkr.dll pwrshmsg.dll pwrshsip.dll Registry.format.ps1xml Schemas SessionConfig types.ps1xml typesv3.ps1xml WSMan.Format.ps1xml /cygdrive/c/Windows/System32/OpenSSH: scp.exe sftp.exe ssh-add.exe ssh-agent.exe ssh-keygen.exe ssh-keyscan.exe ssh.exe /cygdrive/c/Program Files/MATLAB/R2020b/bin: crash_analyzer.cfg icutzdata lcdata.xml lcdata.xsd lcdata_utf8.xml m3iregistry matlab.exe mex.bat mexext.bat util win32 win64 /cygdrive/c/Program Files/Microsoft SQL Server/130/Tools/Binn: Resources SqlLocalDB.exe /cygdrive/c/Program Files/Microsoft SQL Server/Client SDK/ODBC/170/Tools/Binn: batchparser.dll bcp.exe Resources SQLCMD.EXE xmlrw.dll /cygdrive/c/Program Files/Git/cmd: git-gui.exe git-lfs.exe git.exe gitk.exe start-ssh-agent.cmd start-ssh-pageant.cmd Warning accessing /cygdrive/c/msys64/mingw64/bin gives errors: [Errno 2] No such file or directory: '/cygdrive/c/msys64/mingw64/bin' Warning accessing /cygdrive/c/msys64/usr/bin gives errors: [Errno 2] No such file or directory: '/cygdrive/c/msys64/usr/bin' /cygdrive/c/Program Files (x86)/Microsoft Visual Studio/2019/Community/VC/Tools/MSVC/14.29.30133/bin/Hostx64/x64: 1033 asan_blacklist.txt atlprov.dll bscmake.exe c1.dll c1xx.dll c2.dll cfgpersist.dll cl.exe cl.exe.config clang_rt.asan_dbg_dynamic-x86_64.dll clang_rt.asan_dynamic-x86_64.dll ConcurrencyCheck.dll CppBuildInsights.dll CppBuildInsightsEtw.xml CppCoreCheck.dll cvtres.exe d3dcompiler_47.dll dpcmi.dll dumpbin.exe editbin.exe EnumIndex.dll EspXEngine.dll HResultCheck.dll KernelTraceControl.dll lib.exe link.exe link.exe.config llvm-symbolizer.exe LocalESPC.dll Microsoft.Diagnostics.Tracing.EventSource.dll Microsoft.VisualStudio.RemoteControl.dll Microsoft.VisualStudio.Telemetry.dll Microsoft.VisualStudio.Utilities.Internal.dll ml64.exe msobj140.dll mspdb140.dll mspdbcmf.exe mspdbcore.dll mspdbsrv.exe mspdbst.dll mspft140.dll msvcdis140.dll msvcp140.dll msvcp140_1.dll msvcp140_2.dll msvcp140_atomic_wait.dll msvcp140_codecvt_ids.dll Newtonsoft.Json.dll nmake.exe onecore perf_msvcbuildinsights.dll pgocvt.exe pgodb140.dll pgodriver.sys pgomgr.exe pgort140.dll pgosweep.exe System.Runtime.CompilerServices.Unsafe.dll tbbmalloc.dll undname.exe VariantClear.dll vcmeta.dll vcperf.exe vcruntime140.dll vcruntime140_1.dll vctip.exe xdcmake.exe xdcmake.exe.config /cygdrive/c/Program Files/dotnet: dotnet.exe host LICENSE.txt packs sdk shared templates ThirdPartyNotices.txt /: bin Cygwin-Terminal.ico Cygwin.bat Cygwin.ico dev etc home lib mpich-4.0.2 mpich-4.0.2.tar.gz sbin tmp usr var proc cygdrive /cygdrive/c/Users/SEJONG/AppData/Local/Microsoft/WindowsApps: Backup GameBarElevatedFT_Alias.exe Microsoft.DesktopAppInstaller_8wekyb3d8bbwe Microsoft.MicrosoftEdge_8wekyb3d8bbwe Microsoft.SkypeApp_kzf8qxf38zg5c Microsoft.XboxGamingOverlay_8wekyb3d8bbwe MicrosoftEdge.exe python.exe python3.exe Skype.exe winget.exe 
/cygdrive/c/Users/SEJONG/AppData/Local/Programs/Microsoft VS Code/bin: code code.cmd /cygdrive/c/Program Files (x86)/Microsoft Visual Studio/2019/Community/VC/Tools/MSVC/14.29.30133/bin/Hostx64/x64: 1033 asan_blacklist.txt atlprov.dll bscmake.exe c1.dll c1xx.dll c2.dll cfgpersist.dll cl.exe cl.exe.config clang_rt.asan_dbg_dynamic-x86_64.dll clang_rt.asan_dynamic-x86_64.dll ConcurrencyCheck.dll CppBuildInsights.dll CppBuildInsightsEtw.xml CppCoreCheck.dll cvtres.exe d3dcompiler_47.dll dpcmi.dll dumpbin.exe editbin.exe EnumIndex.dll EspXEngine.dll HResultCheck.dll KernelTraceControl.dll lib.exe link.exe link.exe.config llvm-symbolizer.exe LocalESPC.dll Microsoft.Diagnostics.Tracing.EventSource.dll Microsoft.VisualStudio.RemoteControl.dll Microsoft.VisualStudio.Telemetry.dll Microsoft.VisualStudio.Utilities.Internal.dll ml64.exe msobj140.dll mspdb140.dll mspdbcmf.exe mspdbcore.dll mspdbsrv.exe mspdbst.dll mspft140.dll msvcdis140.dll msvcp140.dll msvcp140_1.dll msvcp140_2.dll msvcp140_atomic_wait.dll msvcp140_codecvt_ids.dll Newtonsoft.Json.dll nmake.exe onecore perf_msvcbuildinsights.dll pgocvt.exe pgodb140.dll pgodriver.sys pgomgr.exe pgort140.dll pgosweep.exe System.Runtime.CompilerServices.Unsafe.dll tbbmalloc.dll undname.exe VariantClear.dll vcmeta.dll vcperf.exe vcruntime140.dll vcruntime140_1.dll vctip.exe xdcmake.exe xdcmake.exe.config Warning accessing /cygdrive/c/Users/SEJONG/.dotnet/tools gives errors: [Errno 2] No such file or directory: '/cygdrive/c/Users/SEJONG/.dotnet/tools' /usr/lib/lapack: cygblas-0.dll cyglapack-0.dll ============================================================================================= TESTING: configureExternalPackagesDir from config.framework(/home/SEJONG/petsc-3.18.1/config/BuildSystem/config/framework.py:1045) Set alternative directory external packages are built in serialEvaluation: initial cxxDialectRanges ('c++11', 'c++17') serialEvaluation: new cxxDialectRanges ('c++11', 'c++17') child config.utilities.macosFirewall took 0.000005 seconds ============================================================================================= TESTING: configureDebuggers from config.utilities.debuggers(/home/SEJONG/petsc-3.18.1/config/BuildSystem/config/utilities/debuggers.py:20) Find a default debugger and determine its arguments Checking for program /usr/local/bin/gdb...not found Checking for program /usr/bin/gdb...not found Checking for program /cygdrive/c/SIMULIA/Commands/gdb...not found Checking for program /cygdrive/c/Program Files/Microsoft MPI/Bin/gdb...not found Checking for program /cygdrive/c/Windows/system32/gdb...not found Checking for program /cygdrive/c/Windows/gdb...not found Checking for program /cygdrive/c/Windows/System32/Wbem/gdb...not found Checking for program /cygdrive/c/Windows/System32/WindowsPowerShell/v1.0/gdb...not found Checking for program /cygdrive/c/Windows/System32/OpenSSH/gdb...not found Checking for program /cygdrive/c/Program Files/MATLAB/R2020b/bin/gdb...not found Checking for program /cygdrive/c/Program Files/Microsoft SQL Server/130/Tools/Binn/gdb...not found Checking for program /cygdrive/c/Program Files/Microsoft SQL Server/Client SDK/ODBC/170/Tools/Binn/gdb...not found Checking for program /cygdrive/c/Program Files/Git/cmd/gdb...not found Checking for program /cygdrive/c/msys64/mingw64/bin/gdb...not found Checking for program /cygdrive/c/msys64/usr/bin/gdb...not found Checking for program /cygdrive/c/Program Files (x86)/Microsoft Visual Studio/2019/Community/VC/Tools/MSVC/14.29.30133/bin/Hostx64/x64/gdb...not found 
Checking for program /cygdrive/c/Program Files/dotnet/gdb...not found Checking for program /gdb...not found Checking for program /cygdrive/c/Users/SEJONG/AppData/Local/Microsoft/WindowsApps/gdb...not found Checking for program /cygdrive/c/Users/SEJONG/AppData/Local/Programs/Microsoft VS Code/bin/gdb...not found Checking for program /cygdrive/c/Program Files (x86)/Microsoft Visual Studio/2019/Community/VC/Tools/MSVC/14.29.30133/bin/Hostx64/x64/gdb...not found Checking for program /cygdrive/c/Users/SEJONG/.dotnet/tools/gdb...not found Checking for program /usr/lib/lapack/gdb...not found Checking for program /usr/local/bin/dbx...not found Checking for program /usr/bin/dbx...not found Checking for program /cygdrive/c/SIMULIA/Commands/dbx...not found Checking for program /cygdrive/c/Program Files/Microsoft MPI/Bin/dbx...not found Checking for program /cygdrive/c/Windows/system32/dbx...not found Checking for program /cygdrive/c/Windows/dbx...not found Checking for program /cygdrive/c/Windows/System32/Wbem/dbx...not found Checking for program /cygdrive/c/Windows/System32/WindowsPowerShell/v1.0/dbx...not found Checking for program /cygdrive/c/Windows/System32/OpenSSH/dbx...not found Checking for program /cygdrive/c/Program Files/MATLAB/R2020b/bin/dbx...not found Checking for program /cygdrive/c/Program Files/Microsoft SQL Server/130/Tools/Binn/dbx...not found Checking for program /cygdrive/c/Program Files/Microsoft SQL Server/Client SDK/ODBC/170/Tools/Binn/dbx...not found Checking for program /cygdrive/c/Program Files/Git/cmd/dbx...not found Checking for program /cygdrive/c/msys64/mingw64/bin/dbx...not found Checking for program /cygdrive/c/msys64/usr/bin/dbx...not found Checking for program /cygdrive/c/Program Files (x86)/Microsoft Visual Studio/2019/Community/VC/Tools/MSVC/14.29.30133/bin/Hostx64/x64/dbx...not found Checking for program /cygdrive/c/Program Files/dotnet/dbx...not found Checking for program /dbx...not found Checking for program /cygdrive/c/Users/SEJONG/AppData/Local/Microsoft/WindowsApps/dbx...not found Checking for program /cygdrive/c/Users/SEJONG/AppData/Local/Programs/Microsoft VS Code/bin/dbx...not found Checking for program /cygdrive/c/Program Files (x86)/Microsoft Visual Studio/2019/Community/VC/Tools/MSVC/14.29.30133/bin/Hostx64/x64/dbx...not found Checking for program /cygdrive/c/Users/SEJONG/.dotnet/tools/dbx...not found Checking for program /usr/lib/lapack/dbx...not found Defined make macro "DSYMUTIL" to "true" child config.utilities.debuggers took 0.014310 seconds ============================================================================================= TESTING: configureDirectories from PETSc.options.petscdir(/home/SEJONG/petsc-3.18.1/config/PETSc/options/petscdir.py:22) Checks PETSC_DIR and sets if not set PETSC_VERSION_RELEASE of 1 indicates the code is from a release branch or a branch created from a release branch. 
Version Information: #define PETSC_VERSION_RELEASE 1 #define PETSC_VERSION_MAJOR 3 #define PETSC_VERSION_MINOR 18 #define PETSC_VERSION_SUBMINOR 1 #define PETSC_VERSION_DATE "Oct 26, 2022" #define PETSC_VERSION_GIT "v3.18.1" #define PETSC_VERSION_DATE_GIT "2022-10-26 07:57:29 -0500" #define PETSC_VERSION_EQ(MAJOR,MINOR,SUBMINOR) \ #define PETSC_VERSION_ PETSC_VERSION_EQ #define PETSC_VERSION_LT(MAJOR,MINOR,SUBMINOR) \ #define PETSC_VERSION_LE(MAJOR,MINOR,SUBMINOR) \ #define PETSC_VERSION_GT(MAJOR,MINOR,SUBMINOR) \ #define PETSC_VERSION_GE(MAJOR,MINOR,SUBMINOR) \ child PETSc.options.petscdir took 0.015510 seconds ============================================================================================= TESTING: getDatafilespath from PETSc.options.dataFilesPath(/home/SEJONG/petsc-3.18.1/config/PETSc/options/dataFilesPath.py:29) Checks what DATAFILESPATH should be child PETSc.options.dataFilesPath took 0.002462 seconds ============================================================================================= TESTING: configureGit from config.sourceControl(/home/SEJONG/petsc-3.18.1/config/BuildSystem/config/sourceControl.py:24) Find the Git executable Checking for program /usr/local/bin/git...not found Checking for program /usr/bin/git...found Defined make macro "GIT" to "git" Executing: git --version stdout: git version 2.38.1 ============================================================================================= TESTING: configureMercurial from config.sourceControl(/home/SEJONG/petsc-3.18.1/config/BuildSystem/config/sourceControl.py:35) Find the Mercurial executable Checking for program /usr/local/bin/hg...not found Checking for program /usr/bin/hg...not found Checking for program /cygdrive/c/SIMULIA/Commands/hg...not found Checking for program /cygdrive/c/Program Files/Microsoft MPI/Bin/hg...not found Checking for program /cygdrive/c/Windows/system32/hg...not found Checking for program /cygdrive/c/Windows/hg...not found Checking for program /cygdrive/c/Windows/System32/Wbem/hg...not found Checking for program /cygdrive/c/Windows/System32/WindowsPowerShell/v1.0/hg...not found Checking for program /cygdrive/c/Windows/System32/OpenSSH/hg...not found Checking for program /cygdrive/c/Program Files/MATLAB/R2020b/bin/hg...not found Checking for program /cygdrive/c/Program Files/Microsoft SQL Server/130/Tools/Binn/hg...not found Checking for program /cygdrive/c/Program Files/Microsoft SQL Server/Client SDK/ODBC/170/Tools/Binn/hg...not found Checking for program /cygdrive/c/Program Files/Git/cmd/hg...not found Checking for program /cygdrive/c/msys64/mingw64/bin/hg...not found Checking for program /cygdrive/c/msys64/usr/bin/hg...not found Checking for program /cygdrive/c/Program Files (x86)/Microsoft Visual Studio/2019/Community/VC/Tools/MSVC/14.29.30133/bin/Hostx64/x64/hg...not found Checking for program /cygdrive/c/Program Files/dotnet/hg...not found Checking for program /hg...not found Checking for program /cygdrive/c/Users/SEJONG/AppData/Local/Microsoft/WindowsApps/hg...not found Checking for program /cygdrive/c/Users/SEJONG/AppData/Local/Programs/Microsoft VS Code/bin/hg...not found Checking for program /cygdrive/c/Program Files (x86)/Microsoft Visual Studio/2019/Community/VC/Tools/MSVC/14.29.30133/bin/Hostx64/x64/hg...not found Checking for program /cygdrive/c/Users/SEJONG/.dotnet/tools/hg...not found Checking for program /usr/lib/lapack/hg...not found Checking for program /home/SEJONG/petsc-3.18.1/lib/petsc/bin/win32fe/hg...not found child config.sourceControl took 0.121914 seconds 
============================================================================================= TESTING: configureInstallationMethod from PETSc.options.petscclone(/home/SEJONG/petsc-3.18.1/config/PETSc/options/petscclone.py:20) Determine if PETSc was obtained via git or a tarball This is a tarball installation child PETSc.options.petscclone took 0.003125 seconds ============================================================================================= TESTING: setNativeArchitecture from PETSc.options.arch(/home/SEJONG/petsc-3.18.1/config/PETSc/options/arch.py:29) Forms the arch as GNU's configure would form it ============================================================================================= TESTING: configureArchitecture from PETSc.options.arch(/home/SEJONG/petsc-3.18.1/config/PETSc/options/arch.py:42) Checks if PETSC_ARCH is set and sets it if not set No previous hashfile found Setting hashfile: arch-mswin-c-debug/lib/petsc/conf/configure-hash Deleting configure hash file: arch-mswin-c-debug/lib/petsc/conf/configure-hash Unable to delete configure hash file: arch-mswin-c-debug/lib/petsc/conf/configure-hash child PETSc.options.arch took 0.149094 seconds ============================================================================================= TESTING: setInstallDir from PETSc.options.installDir(/home/SEJONG/petsc-3.18.1/config/PETSc/options/installDir.py:31) Set installDir to either prefix or if that is not set to PETSC_DIR/PETSC_ARCH Defined make macro "PREFIXDIR" to "/home/SEJONG/petsc-3.18.1/arch-mswin-c-debug" ============================================================================================= TESTING: saveReconfigure from PETSc.options.installDir(/home/SEJONG/petsc-3.18.1/config/PETSc/options/installDir.py:76) Save the configure options in a script in PETSC_ARCH/lib/petsc/conf so the same configure may be easily re-run ============================================================================================= TESTING: cleanConfDir from PETSc.options.installDir(/home/SEJONG/petsc-3.18.1/config/PETSc/options/installDir.py:68) Remove all the files from configuration directory for this PETSC_ARCH, from --with-clean option ============================================================================================= TESTING: configureInstallDir from PETSc.options.installDir(/home/SEJONG/petsc-3.18.1/config/PETSc/options/installDir.py:52) Makes installDir subdirectories if it does not exist for both prefix install location and PETSc work install location Changed persistence directory to /home/SEJONG/petsc-3.18.1/arch-mswin-c-debug/lib/petsc/conf TESTING: restoreReconfigure from PETSc.options.installDir(/home/SEJONG/petsc-3.18.1/config/PETSc/options/installDir.py:90) If --with-clean was requested but restoring the reconfigure file was requested then restore it child PETSc.options.installDir took 0.006476 seconds ============================================================================================= TESTING: setExternalPackagesDir from PETSc.options.externalpackagesdir(/home/SEJONG/petsc-3.18.1/config/PETSc/options/externalpackagesdir.py:15) Set location where external packages will be downloaded to ============================================================================================= TESTING: cleanExternalpackagesDir from PETSc.options.externalpackagesdir(/home/SEJONG/petsc-3.18.1/config/PETSc/options/externalpackagesdir.py:23) Remove all downloaded external packages, from --with-clean child PETSc.options.externalpackagesdir took 0.000990 seconds 
============================================================================================= TESTING: configureCLanguage from PETSc.options.languages(/home/SEJONG/petsc-3.18.1/config/PETSc/options/languages.py:28) Choose whether to compile the PETSc library using a C or C++ compiler C language is C Defined "CLANGUAGE_C" to "1" Defined make macro "CLANGUAGE" to "C" child PETSc.options.languages took 0.003172 seconds ============================================================================================= TESTING: resetEnvCompilers from config.setCompilers(/home/SEJONG/petsc-3.18.1/config/BuildSystem/config/setCompilers.py:2652) Remove compilers from the shell environment so they do not interfer with testing ============================================================================================= TESTING: checkEnvCompilers from config.setCompilers(/home/SEJONG/petsc-3.18.1/config/BuildSystem/config/setCompilers.py:2669) Set configure compilers from the environment, from -with-environment-variables ============================================================================================= TESTING: checkMPICompilerOverride from config.setCompilers(/home/SEJONG/petsc-3.18.1/config/BuildSystem/config/setCompilers.py:2622) Check if --with-mpi-dir is used along with CC CXX or FC compiler options. This usually prevents mpi compilers from being used - so issue a warning ============================================================================================= TESTING: requireMpiLdPath from config.setCompilers(/home/SEJONG/petsc-3.18.1/config/BuildSystem/config/setCompilers.py:2643) OpenMPI wrappers require LD_LIBRARY_PATH set ============================================================================================= TESTING: checkInitialFlags from config.setCompilers(/home/SEJONG/petsc-3.18.1/config/BuildSystem/config/setCompilers.py:723) Initialize the compiler and linker flags Initialized CFLAGS to Initialized CFLAGS to Initialized LDFLAGS to Initialized CUDAFLAGS to Initialized CUDAFLAGS to Initialized LDFLAGS to Initialized HIPFLAGS to Initialized HIPFLAGS to Initialized LDFLAGS to Initialized SYCLFLAGS to Initialized SYCLFLAGS to Initialized LDFLAGS to Initialized CXXFLAGS to Initialized CXX_CXXFLAGS to Initialized LDFLAGS to Initialized FFLAGS to Initialized FFLAGS to Initialized LDFLAGS to Initialized CPPFLAGS to Initialized FPPFLAGS to Initialized CUDAPPFLAGS to -Wno-deprecated-gpu-targets Initialized CXXPPFLAGS to Initialized HIPPPFLAGS to Initialized SYCLPPFLAGS to Initialized CC_LINKER_FLAGS to [] Initialized CXX_LINKER_FLAGS to [] Initialized FC_LINKER_FLAGS to [] Initialized CUDAC_LINKER_FLAGS to [] Initialized HIPC_LINKER_FLAGS to [] Initialized SYCLC_LINKER_FLAGS to [] TESTING: checkCCompiler from config.setCompilers(/home/SEJONG/petsc-3.18.1/config/BuildSystem/config/setCompilers.py:1341) Locate a functional C compiler Checking for program /usr/local/bin/mpicc...not found Checking for program /usr/bin/mpicc...found Defined make macro "CC" to "mpicc" Executing: mpicc -c -o /tmp/petsc-uqt11yqc/config.setCompilers/conftest.o -I/tmp/petsc-uqt11yqc/config.setCompilers /tmp/petsc-uqt11yqc/config.setCompilers/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" int main() { ; return 0; } Executing: mpicc -c -o /tmp/petsc-uqt11yqc/config.setCompilers/conftest.o -I/tmp/petsc-uqt11yqc/config.setCompilers /tmp/petsc-uqt11yqc/config.setCompilers/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" int main() { ; return 0; 
} Executing: mpicc -o /tmp/petsc-uqt11yqc/config.setCompilers/conftest.exe /tmp/petsc-uqt11yqc/config.setCompilers/conftest.o Possible ERROR while running linker: exit code 1 stderr: /usr/lib/gcc/x86_64-pc-cygwin/11/../../../../x86_64-pc-cygwin/bin/ld: cannot find -lhwloc: No such file or directory /usr/lib/gcc/x86_64-pc-cygwin/11/../../../../x86_64-pc-cygwin/bin/ld: cannot find -levent_core: No such file or directory /usr/lib/gcc/x86_64-pc-cygwin/11/../../../../x86_64-pc-cygwin/bin/ld: cannot find -levent_pthreads: No such file or directory /usr/lib/gcc/x86_64-pc-cygwin/11/../../../../x86_64-pc-cygwin/bin/ld: cannot find -lz: No such file or directory collect2: error: ld returned 1 exit status Linker output before filtering: /usr/lib/gcc/x86_64-pc-cygwin/11/../../../../x86_64-pc-cygwin/bin/ld: cannot find -lhwloc: No such file or directory /usr/lib/gcc/x86_64-pc-cygwin/11/../../../../x86_64-pc-cygwin/bin/ld: cannot find -levent_core: No such file or directory /usr/lib/gcc/x86_64-pc-cygwin/11/../../../../x86_64-pc-cygwin/bin/ld: cannot find -levent_pthreads: No such file or directory /usr/lib/gcc/x86_64-pc-cygwin/11/../../../../x86_64-pc-cygwin/bin/ld: cannot find -lz: No such file or directory collect2: error: ld returned 1 exit status : Linker output after filtering: /usr/lib/gcc/x86_64-pc-cygwin/11/../../../../x86_64-pc-cygwin/bin/ld: cannot find -lhwloc: No such file or directory /usr/lib/gcc/x86_64-pc-cygwin/11/../../../../x86_64-pc-cygwin/bin/ld: cannot find -levent_core: No such file or directory /usr/lib/gcc/x86_64-pc-cygwin/11/../../../../x86_64-pc-cygwin/bin/ld: cannot find -levent_pthreads: No such file or directory /usr/lib/gcc/x86_64-pc-cygwin/11/../../../../x86_64-pc-cygwin/bin/ld: cannot find -lz: No such file or directory collect2: error: ld returned 1 exit status: Error testing C compiler: Cannot compile/link C with mpicc. MPI compiler wrapper mpicc failed to compile Executing: mpicc -show stdout: gcc -L/usr/lib -lmpi -lopen-rte -lopen-pal -lhwloc -levent_core -levent_pthreads -lz MPI compiler wrapper mpicc is likely incorrect. Use --with-mpi-dir to indicate an alternate MPI. Deleting "CC" ******************************************************************************* UNABLE to CONFIGURE with GIVEN OPTIONS (see configure.log for details): ------------------------------------------------------------------------------- C compiler you provided with -with-cc=mpicc cannot be found or does not work. Cannot compile/link C with mpicc. 
******************************************************************************* File "/home/SEJONG/petsc-3.18.1/config/configure.py", line 461, in petsc_configure framework.configure(out = sys.stdout) File "/home/SEJONG/petsc-3.18.1/config/BuildSystem/config/framework.py", line 1412, in configure self.processChildren() File "/home/SEJONG/petsc-3.18.1/config/BuildSystem/config/framework.py", line 1400, in processChildren self.serialEvaluation(self.childGraph) File "/home/SEJONG/petsc-3.18.1/config/BuildSystem/config/framework.py", line 1375, in serialEvaluation child.configure() File "/home/SEJONG/petsc-3.18.1/config/BuildSystem/config/setCompilers.py", line 2712, in configure self.executeTest(self.checkCCompiler) File "/home/SEJONG/petsc-3.18.1/config/BuildSystem/config/base.py", line 138, in executeTest ret = test(*args,**kargs) File "/home/SEJONG/petsc-3.18.1/config/BuildSystem/config/setCompilers.py", line 1346, in checkCCompiler for compiler in self.generateCCompilerGuesses(): File "/home/SEJONG/petsc-3.18.1/config/BuildSystem/config/setCompilers.py", line 1274, in generateCCompilerGuesses raise RuntimeError('C compiler you provided with -with-cc='+self.argDB['with-cc']+' cannot be found or does not work.'+'\n'+self.mesg) ================================================================================ Finishing configure run at Tue, 01 Nov 2022 13:06:09 +0900 -----Original Message----- From: Satish Balay Sent: Tuesday, November 1, 2022 11:36 AM To: Mohammad Ali Yaqteen Cc: petsc-users Subject: RE: [petsc-users] PETSc Windows Installation you'll have to send configure.log for this failure Satish On Tue, 1 Nov 2022, Mohammad Ali Yaqteen wrote: > I have checked the required Cygwin openmpi libraries and they are all installed. When I run ./configure --with-cc=mpicc --with-cxx=mpicxx --with-fc=mpif90, it returns: > > $ ./configure --with-cc=mpicc --with-cxx=mpicxx --with-fc=mpif90 > ============================================================================================= > Configuring PETSc to compile on your system > ====================================================================== > ======================= > TESTING: checkCCompiler from config.setCompilers(config/BuildSystem/config/setCompilers.py:1341)******************************************************************************* > UNABLE to CONFIGURE with GIVEN OPTIONS (see configure.log for details): > ---------------------------------------------------------------------- > --------- C compiler you provided with -with-cc=mpicc cannot be found > or does not work. > Cannot compile/link C with mpicc. > > As for the case of WSL2, I will try to install that on my PC. > Meanwhile, could you please look into this issue > > Thank you > > Ali > > -----Original Message----- > From: Satish Balay > Sent: Monday, October 31, 2022 10:56 PM > To: Satish Balay via petsc-users > Cc: Matthew Knepley ; Mohammad Ali Yaqteen > > Subject: Re: [petsc-users] PETSc Windows Installation > > BTW: If you have WSL2 on windows - it might be easier to build/use PETSc. 
> > Satish > > On Mon, 31 Oct 2022, Satish Balay via petsc-users wrote: > > > Make sure you have cygwin openmpi installed [and cywin blas/lapack] > > > > $ cygcheck -cd |grep openmpi > > libopenmpi-devel 4.1.2-1 > > libopenmpi40 4.1.2-1 > > libopenmpifh40 4.1.2-1 > > libopenmpiusef08_40 4.1.2-1 > > libopenmpiusetkr40 4.1.2-1 > > openmpi 4.1.2-1 > > $ cygcheck -cd |grep lapack > > liblapack-devel 3.10.1-1 > > liblapack0 3.10.1-1 > > > > > > > ./configure --with-cc=gcc --with-cxx=0 --with-fc=0 > > > --download-f2cblaslapack > > > > Should be: > > > > > > $ ./configure --download-scalapack --download-mumps > > > > i.e [default] --with-cc=mpicc --with-cxx=mpicxx --with-fc=mpif90 [an > > default cygwin blas/lapack] > > > > Satish > > > > > > On Mon, 31 Oct 2022, Matthew Knepley wrote: > > > > > On Mon, Oct 31, 2022 at 1:56 AM Mohammad Ali Yaqteen > > > > > > wrote: > > > > > > > Dear Satish > > > > > > > > When I configure PETSc with (./configure --with-cc=gcc > > > > --with-cxx=0 > > > > --with-fc=0 --download-f2cblaslapack) it runs as I shared > > > > initially which you said is not an issue anymore. But when I add > > > > (--download-scalapack > > > > --download-mumps) or configure with these later, it gives the > > > > following > > > > error: > > > > > > > > $ ./configure --download-scalapack --download-mumps > > > > > > > > ============================================================================================= > > > > Configuring PETSc to compile on your > > > > system > > > > > > > > ================================================================ > > > > == > > > > =========================== > > > > TESTING: FortranMPICheck from > > > > config.packages.MPI(config/BuildSystem/config/packages/MPI.py:614)******************************************************************************* > > > > UNABLE to CONFIGURE with GIVEN OPTIONS (see configure.log for > > > > details): > > > > > > > > ---------------------------------------------------------------- > > > > -- > > > > ------------- Fortran error! mpi_init() could not be located! > > > > > > > > **************************************************************** > > > > ** > > > > ************* > > > > > > > > What could be the problem here? > > > > > > > > > > Without configure.log we cannot tell what went wrong. However, > > > from the error message, I would guess that your MPI was not built > > > with Fortran bindings. You need these for those packages. > > > > > > Thanks, > > > > > > Matt > > > > > > > > > > Your help is highly appreciated. > > > > > > > > Thank you > > > > Ali > > > > > > > > -----Original Message----- > > > > From: Satish Balay > > > > Sent: Saturday, October 29, 2022 2:11 PM > > > > To: Mohammad Ali Yaqteen > > > > Cc: Matthew Knepley ; petsc-users at mcs.anl.gov > > > > Subject: Re: [petsc-users] PETSc Windows Installation > > > > > > > > On Sat, 29 Oct 2022, Mohammad Ali Yaqteen wrote: > > > > > > > > > I haven?t accessed PETSC or given any command of my own. I was > > > > > just > > > > installing by following the instructions. I don?t know why it is > > > > attaching the debugger. Although it says ?Possible error running > > > > C/C++ > > > > src/snes/tutorials/ex19 with 1 MPI process? which I think is > > > > indicating of missing of MPI! > > > > > > > > The diff is not smart enough to detect the extra message from > > > > cygwin/OpenMPI - hence it assumes there is a potential problem - > > > > and prints the above message. > > > > > > > > But you can assume its installed properly - and use it. 
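> > > > As a quick double-check (a rough sketch - adjust PETSC_DIR and PETSC_ARCH to your build before running), ex19 can also be run by hand:
> > > >
> > > > $ cd $PETSC_DIR/src/snes/tutorials
> > > > $ make ex19
> > > > $ mpiexec -n 1 ./ex19 -snes_monitor
> > > >
> > > > If it runs and converges, the extra output flagged by 'make check' is just the OpenMPI start-up chatter and can be ignored.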
> > > > > > > > Satish > > > > > > > > > > From: Matthew Knepley > > > > > Sent: Friday, October 28, 2022 10:31 PM > > > > > To: Mohammad Ali Yaqteen > > > > > Cc: petsc-users at mcs.anl.gov > > > > > Subject: Re: [petsc-users] PETSc Windows Installation > > > > > > > > > > On Fri, Oct 28, 2022 at 9:11 AM Mohammad Ali Yaqteen < > > > > mhyaqteen at sju.ac.kr> wrote: > > > > > Dear Sir, > > > > > > > > > > During the Installation of PETSc in windows, I installed > > > > > Cygwin and the > > > > required libraries as mentioned on your website: > > > > > [cid:image001.png at 01D8EB93.7C17E410] > > > > > However, when I install PETSc using the configure commands > > > > > present on > > > > the petsc website: > > > > > > > > > > ./configure --with-cc=gcc --with-cxx=0 --with-fc=0 > > > > --download-f2cblaslapack --download-mpich > > > > > > > > > > it gives me the following error: > > > > > > > > > > [cid:image002.png at 01D8EB93.7C17E410] > > > > > > > > > > I already installed OpenMPI using Cygwin installer but it > > > > > still asks me > > > > to. When I configure without ??download-mpich? and run ?make check? > > > > command, it gives me the following errors: > > > > > > > > > > [cid:image003.png at 01D8EB93.7C17E410] > > > > > > > > > > Could you kindly look into this and help me with this? Your > > > > > prompt > > > > response will highly be appreciated. > > > > > > > > > > The runs look fine. > > > > > > > > > > The test should not try to attach the debugger. Do you have > > > > > that in the > > > > PETSC_OPTIONS env variable? > > > > > > > > > > Thanks, > > > > > > > > > > Matt > > > > > > > > > > Thank you! > > > > > Mohammad Ali > > > > > Researcher, Sejong University > > > > > > > > > > > > > > > -- > > > > > What most experimenters take for granted before they begin > > > > > their > > > > experiments is infinitely more interesting than any results to > > > > which their experiments lead. > > > > > -- Norbert Wiener > > > > > > > > > > https://www.cse.buffalo.edu/~knepley/< > > > > http://www.cse.buffalo.edu/~knepley/> > > > > > > > > > > > > > > > > > > > > > From knepley at gmail.com Mon Oct 31 23:26:04 2022 From: knepley at gmail.com (Matthew Knepley) Date: Tue, 1 Nov 2022 00:26:04 -0400 Subject: [petsc-users] PETSc Windows Installation In-Reply-To: References: <2db12320-25ab-7911-4bb6-ff0195f5ffdc@mcs.anl.gov> <461d2b54-173d-95fa-6ad5-9ce81849871e@mcs.anl.gov> Message-ID: On Tue, Nov 1, 2022 at 12:16 AM Mohammad Ali Yaqteen wrote: > I am unable to attach the configure.log file. Hence. I have copied the > following text after executing the command (less configure.log) in the > cygwin64 > You can see at the end of the file that your "mpicc" does not work. The link is broken, possibly because you moved directories after you installed it. 
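A quick way to reproduce this outside of configure (a rough sketch - 'hello.c' is just a throwaway file name, not something configure creates):

  /* hello.c: minimal MPI program, only used to exercise the wrapper's link line */
  #include <mpi.h>
  #include <stdio.h>
  int main(int argc, char **argv)
  {
    MPI_Init(&argc, &argv);
    printf("mpicc link ok\n");
    MPI_Finalize();
    return 0;
  }

  $ mpicc -show                       # shows the gcc line the wrapper runs, ending in -lhwloc -levent_core -levent_pthreads -lz
  $ mpicc hello.c -o hello && ./hello

If the link step fails with the same "cannot find -lhwloc / -levent_core / -levent_pthreads / -lz" errors that show up at the end of configure.log, the Cygwin OpenMPI wrapper itself is broken. Reinstalling the Cygwin OpenMPI packages (and the libraries its link line references), or pointing configure at a working MPI with --with-mpi-dir as configure itself suggests, would be the next thing to try.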
Thanks, Matt > Executing: uname -s > stdout: CYGWIN_NT-10.0-19044 > > ============================================================================================= > Configuring PETSc to compile on your system > > ============================================================================================= > > > ================================================================================ > > ================================================================================ > Starting configure run at Tue, 01 Nov 2022 13:06:06 +0900 > Configure Options: --configModules=PETSc.Configure > --optionsModule=config.compilerOptions --with-cc=mpicc --with-cxx=mpicxx > --with-fc=mpif90 > Working directory: /home/SEJONG/petsc-3.18.1 > Machine platform: > uname_result(system='CYGWIN_NT-10.0-19044', node='DESKTOP-R1C768B', > release='3.3.6-341.x86_64', version='2022-09-05 11:15 UTC', > machine='x86_64') > Python version: > 3.9.10 (main, Jan 20 2022, 21:37:52) > [GCC 11.2.0] > > ================================================================================ > Environmental variables > USERDOMAIN=DESKTOP-R1C768B > OS=Windows_NT > COMMONPROGRAMFILES=C:\Program Files\Common Files > PROCESSOR_LEVEL=6 > PSModulePath=C:\Users\SEJONG\Documents\WindowsPowerShell\Modules;C:\Program > Files\WindowsPowerShell\Modules;C:\Windows\system32\WindowsPowerShell\v1.0\Modules > CommonProgramW6432=C:\Program Files\Common Files > CommonProgramFiles(x86)=C:\Program Files (x86)\Common Files > LANG=en_US.UTF-8 > TZ=Asia/Seoul > HOSTNAME=DESKTOP-R1C768B > PUBLIC=C:\Users\Public > OLDPWD=/home/SEJONG > USERNAME=SEJONG > LOGONSERVER=\\DESKTOP-R1C768B > PROCESSOR_ARCHITECTURE=AMD64 > LOCALAPPDATA=C:\Users\SEJONG\AppData\Local > COMPUTERNAME=DESKTOP-R1C768B > USER=SEJONG > !::=::\ > SYSTEMDRIVE=C: > USERPROFILE=C:\Users\SEJONG > PATHEXT=.COM;.EXE;.BAT;.CMD;.VBS;.VBE;.JS;.JSE;.WSF;.WSH;.MSC;.CPL > SYSTEMROOT=C:\Windows > USERDOMAIN_ROAMINGPROFILE=DESKTOP-R1C768B > OneDriveCommercial=C:\Users\SEJONG\OneDrive - Sejong University > PROCESSOR_IDENTIFIER=Intel64 Family 6 Model 165 Stepping 5, GenuineIntel > GNUPLOT_LIB=C:\Program Files\gnuplot\demo;C:\Program > Files\gnuplot\demo\games;C:\Program Files\gnuplot\share > PWD=/home/SEJONG/petsc-3.18.1 > MSMPI_BIN=C:\Program Files\Microsoft MPI\Bin\ > HOME=/home/SEJONG > TMP=/tmp > OneDrive=C:\Users\SEJONG\OneDrive - Sejong University > ZES_ENABLE_SYSMAN=1 > !C:=C:\cygwin64\bin > PROCESSOR_REVISION=a505 > PROFILEREAD=true > PROMPT=$P$G > NUMBER_OF_PROCESSORS=16 > ProgramW6432=C:\Program Files > COMSPEC=C:\Windows\system32\cmd.exe > APPDATA=C:\Users\SEJONG\AppData\Roaming > SHELL=/bin/bash > TERM=xterm-256color > WINDIR=C:\Windows > ProgramData=C:\ProgramData > SHLVL=1 > PRINTER=\\210.107.220.119\HP Color LaserJet Pro MFP M377 PCL 6 > PROGRAMFILES=C:\Program Files > ALLUSERSPROFILE=C:\ProgramData > TEMP=/tmp > DriverData=C:\Windows\System32\Drivers\DriverData > SESSIONNAME=Console > ProgramFiles(x86)=C:\Program Files (x86) > PATH=/usr/local/bin:/usr/bin:/cygdrive/c/SIMULIA/Commands:/cygdrive/c/Program > Files/Microsoft > MPI/Bin:/cygdrive/c/Windows/system32:/cygdrive/c/Windows:/cygdrive/c/Windows/System32/Wbem:/cygdrive/c/Windows/System32/WindowsPowerShell/v1.0:/cygdrive/c/Windows/System32/OpenSSH:/cygdrive/c/Program > Files/MATLAB/R2020b/bin:/cygdrive/c/Program Files/Microsoft SQL > Server/130/Tools/Binn:/cygdrive/c/Program Files/Microsoft SQL Server/Client > SDK/ODBC/170/Tools/Binn:/cygdrive/c/Program > 
Files/Git/cmd:/cygdrive/c/msys64/mingw64/bin:/cygdrive/c/msys64/usr/bin:/cygdrive/c/Program > Files (x86)/Microsoft Visual > Studio/2019/Community/VC/Tools/MSVC/14.29.30133/bin/Hostx64/x64:/cygdrive/c/Program > Files/dotnet:/:/cygdrive/c/Users/SEJONG/AppData/Local/Microsoft/WindowsApps:/cygdrive/c/Users/SEJONG/AppData/Local/Programs/Microsoft > VS Code/bin:/cygdrive/c/Program Files (x86)/Microsoft Visual > Studio/2019/Community/VC/Tools/MSVC/14.29.30133/bin/Hostx64/x64:/cygdrive/c/Users/SEJONG/.dotnet/tools:/usr/lib/lapack > PS1=\[\e]0;\w\a\]\n\[\e[32m\]\u@\h \[\e[33m\]\w\[\e[0m\]\n\$ > HOMEDRIVE=C: > INFOPATH=/usr/local/info:/usr/share/info:/usr/info > HOMEPATH=\Users\SEJONG > ORIGINAL_PATH=/cygdrive/c/SIMULIA/Commands:/cygdrive/c/Program > Files/Microsoft > MPI/Bin:/cygdrive/c/Windows/system32:/cygdrive/c/Windows:/cygdrive/c/Windows/System32/Wbem:/cygdrive/c/Windows/System32/WindowsPowerShell/v1.0:/cygdrive/c/Windows/System32/OpenSSH:/cygdrive/c/Program > Files/MATLAB/R2020b/bin:/cygdrive/c/Program Files/Microsoft SQL > Server/130/Tools/Binn:/cygdrive/c/Program Files/Microsoft SQL Server/Client > SDK/ODBC/170/Tools/Binn:/cygdrive/c/Program > Files/Git/cmd:/cygdrive/c/msys64/mingw64/bin:/cygdrive/c/msys64/usr/bin:/cygdrive/c/Program > Files (x86)/Microsoft Visual > Studio/2019/Community/VC/Tools/MSVC/14.29.30133/bin/Hostx64/x64:/cygdrive/c/Program > Files/dotnet:/:/cygdrive/c/Users/SEJONG/AppData/Local/Microsoft/WindowsApps:/cygdrive/c/Users/SEJONG/AppData/Local/Programs/Microsoft > VS Code/bin:/cygdrive/c/Program Files (x86)/Microsoft Visual > Studio/2019/Community/VC/Tools/MSVC/14.29.30133/bin/Hostx64/x64:/cygdrive/c/Users/SEJONG/.dotnet/tools > EXECIGNORE=*.dll > _=./configure > Files in path provided by default path > /usr/local/bin: > /usr/bin: addftinfo.exe addr2line.exe apropos ar.exe arch.exe as.exe > ash.exe awk b2sum.exe base32.exe base64.exe basename.exe basenc.exe > bash.exe bashbug bomtool.exe bunzip2.exe bzcat.exe bzcmp bzdiff bzegrep > bzfgrep bzgrep bzip2.exe bzip2recover.exe bzless bzmore c++.exe c++filt.exe > c89 c99 ca-legacy cal.exe captoinfo cat.exe catman.exe cc ccmake.exe > chattr.exe chcon.exe chgrp.exe chmod.exe chown.exe chroot.exe chrt.exe > cksum.exe clear.exe cmake.exe cmp.exe col.exe colcrt.exe colrm.exe > column.exe comm.exe cp.exe cpack.exe cpp.exe csplit.exe ctest.exe cut.exe > cygarchive-13.dll cygargp-0.dll cygatomic-1.dll cygattr-1.dll > cygblkid-1.dll cygbrotlicommon-1.dll cygbrotlidec-1.dll cygbz2-1.dll > cygcheck.exe cygcom_err-2.dll cygcrypt-2.dll cygcrypto-1.1.dll > cygcurl-4.dll cygdb-5.3.dll cygdb_cxx-5.3.dll cygdb_sql-5.3.dll > cygedit-0.dll cygevent-2-1-7.dll cygevent_core-2-1-7.dll > cygevent_extra-2-1-7.dll cygevent_openssl-2-1-7.dll > cygevent_pthreads-2-1-7.dll cygexpat-1.dll cygfdisk-1.dll cygffi-6.dll > cygfido2-1.dll cygformw-10.dll cyggc-1.dll cyggcc_s-seh-1.dll cyggdbm-6.dll > cyggdbm_compat-4.dll cyggfortran-5.dll cyggmp-10.dll cyggomp-1.dll > cyggsasl-7.dll cyggssapi_krb5-2.dll cygguile-2.2-1.dll cyghistory7.dll > cyghwloc-15.dll cygiconv-2.dll cygidn-12.dll cygidn2-0.dll cygintl-8.dll > cygisl-23.dll cygjsoncpp-25.dll cygk5crypto-3.dll cygkrb5-3.dll > cygkrb5support-0.dll cyglber-2-4-2.dll cyglber-2.dll cygldap-2-4-2.dll > cygldap-2.dll cygldap_r-2-4-2.dll cygltdl-7.dll cyglz4-1.dll cyglzma-5.dll > cyglzo2-2.dll cygmagic-1.dll cygman-2-11-0.dll cygmandb-2-11-0.dll > cygmenuw-10.dll cygmpc-3.dll cygmpfr-6.dll cygmpi-40.dll > cygmpi_mpifh-40.dll cygmpi_usempif08-40.dll cygmpi_usempi_ignore_tkr-40.dll > cygncursesw-10.dll 
cygnghttp2-14.dll cygntlm-0.dll cygopen-pal-40.dll > cygopen-rte-40.dll cygp11-kit-0.dll cygpanelw-10.dll cygpath.exe > cygpcre2-8-0.dll cygperl5_32.dll cygpipeline-1.dll cygpkgconf-4.dll > cygpopt-0.dll cygpsl-5.dll cygquadmath-0.dll cygreadline7.dll > cygrhash-0.dll cygrunsrv.exe cygsasl2-3.dll cygserver-config > cygsigsegv-2.dll cygsmartcols-1.dll cygsqlite3-0.dll cygssh2-1.dll > cygssl-1.1.dll cygstart.exe cygstdc++-6.dll cygtasn1-6.dll cygticw-10.dll > cygunistring-2.dll cyguuid-1.dll cyguv-1.dll cygwin-console-helper.exe > cygwin1.dll cygxml2-2.dll cygxxhash-0.dll cygz.dll cygzstd-1.dll dash.exe > date.exe dd.exe df.exe diff.exe diff3.exe dir.exe dircolors.exe dirname.exe > dlltool.exe dllwrap.exe dnsdomainname domainname du.exe dumper.exe echo.exe > editrights.exe egrep elfedit.exe env.exe eqn.exe eqn2graph ex expand.exe > expr.exe f95 factor.exe false.exe fgrep fido2-assert.exe fido2-cred.exe > fido2-token.exe file.exe find.exe flock.exe fmt.exe fold.exe g++.exe > gawk-5.1.1.exe gawk.exe gcc-ar.exe gcc-nm.exe gcc-ranlib.exe gcc.exe > gcov-dump.exe gcov-tool.exe gcov.exe gdiffmk gencat.exe getconf.exe > getent.exe getfacl.exe getopt.exe gfortran.exe git-receive-pack.exe > git-shell.exe git-upload-archive.exe git-upload-pack.exe git.exe gkill.exe > gmondump.exe gprof.exe grap2graph grep.exe grn.exe grodvi.exe groff.exe > grolbp.exe grolj4.exe grops.exe grotty.exe groups.exe gunzip gzexe gzip.exe > head.exe hexdump.exe hostid.exe hostname.exe hpftodit.exe > i686-w64-mingw32-pkg-config id.exe indxbib.exe info.exe infocmp.exe > infotocap install-info.exe install.exe ipcmk.exe ipcrm.exe ipcs.exe > isosize.exe join.exe kill.exe lastlog.exe ld.bfd.exe ld.exe ldd.exe ldh.exe > less.exe lessecho.exe lesskey.exe lexgrog.exe libpython3.9.dll > link-cygin.exe lkbib.exe ln.exe locale.exe locate.exe logger.exe login.exe > logname.exe look.exe lookbib.exe ls.exe lsattr.exe lto-dump.exe lzcat lzcmp > lzdiff lzegrep lzfgrep lzgrep lzless lzma lzmadec.exe lzmainfo.exe lzmore > make-dummy-cert make.exe man-recode.exe man.exe mandb.exe manpath.exe > mcookie.exe md5sum.exe minidumper.exe mintheme mintty.exe mkdir.exe > mkfifo.exe mkgroup.exe mknod.exe mkpasswd.exe mkshortcut.exe mktemp.exe > more.exe mount.exe mpic++ mpicc mpicxx mpiexec mpif77 mpif90 mpifort mpirun > mv.exe namei.exe neqn nice.exe nl.exe nm.exe nohup.exe nproc.exe nroff > numfmt.exe objcopy.exe objdump.exe od.exe ompi-clean ompi-server > ompi_info.exe opal_wrapper.exe openssl.exe orte-clean.exe orte-info.exe > orte-server.exe ortecc orted.exe orterun.exe p11-kit.exe passwd.exe > paste.exe pathchk.exe pdfroff peflags.exe peflagsall perl.exe > perl5.32.1.exe pfbtops.exe pg.exe pic.exe pic2graph pinky.exe pip3 pip3.9 > pkg-config pkgconf.exe pldd.exe post-grohtml.exe pr.exe pre-grohtml.exe > preconv.exe printenv.exe printf.exe profiler.exe ps.exe ptx.exe pwd.exe > pydoc3 pydoc3.9 python python3 python3.9.exe pzstd.exe ranlib.exe > readelf.exe readlink.exe readshortcut.exe realpath.exe rebase-trigger > rebase.exe rebaseall rebaselst refer.exe regtool.exe rename.exe > renew-dummy-cert renice.exe reset rev.exe rm.exe rmdir.exe rsync-ssl > rsync.exe run.exe runcon.exe rvi rview scalar.exe scp.exe script.exe > scriptreplay.exe sdiff.exe sed.exe seq.exe setfacl.exe setmetamode.exe > setsid.exe sftp.exe sh.exe sha1sum.exe sha224sum.exe sha256sum.exe > sha384sum.exe sha512sum.exe shred.exe shuf.exe size.exe sleep.exe slogin > soelim.exe sort.exe split.exe ssh-add.exe ssh-agent.exe ssh-copy-id > ssh-host-config ssh-keygen.exe ssh-keyscan.exe 
ssh-user-config ssh.exe > ssp.exe stat.exe stdbuf.exe strace.exe strings.exe strip.exe stty.exe > sum.exe sync.exe tabs.exe tac.exe tail.exe tar.exe taskset.exe tbl.exe > tee.exe test.exe tfmtodit.exe tic.exe timeout.exe toe.exe touch.exe > tput.exe tr.exe troff.exe true.exe truncate.exe trust.exe tset.exe > tsort.exe tty.exe tzselect tzset.exe ul.exe umount.exe uname.exe > unexpand.exe uniq.exe unlink.exe unlzma unxz unzstd update-ca-trust > update-crypto-policies updatedb users.exe uuidgen.exe uuidparse.exe > vdir.exe vi.exe view wc.exe whatis.exe whereis.exe which.exe who.exe > whoami.exe windmc.exe windres.exe x86_64-pc-cygwin-c++.exe > x86_64-pc-cygwin-g++.exe x86_64-pc-cygwin-gcc-11.exe > x86_64-pc-cygwin-gcc-ar.exe x86_64-pc-cygwin-gcc-nm.exe > x86_64-pc-cygwin-gcc-ranlib.exe x86_64-pc-cygwin-gcc.exe > x86_64-pc-cygwin-gfortran.exe x86_64-pc-cygwin-pkg-config > x86_64-w64-mingw32-pkg-config xargs.exe xmlcatalog.exe xmllint.exe xz.exe > xzcat xzcmp xzdec.exe xzdiff xzegrep xzfgrep xzgrep xzless xzmore yes.exe > zcat zcmp zdiff zdump.exe zegrep zfgrep zforce zgrep zless zmore znew > zstd.exe zstdcat zstdgrep zstdless zstdmt [.exe > /cygdrive/c/SIMULIA/Commands: abaqus.bat abq2018.bat > abq_cae_open.bat abq_odb_open.bat > /cygdrive/c/Program Files/Microsoft MPI/Bin: mpiexec.exe > mpitrace.man smpd.exe > provthrd.dll provtool.exe ProximityCommon.dll ProximityCommonPal.dll > ProximityRtapiPal.dll ProximityService.dll ProximityServicePal.dll > ProximityToast ProximityUxHost.exe prproc.exe prvdmofcomp.dll psapi.dll > pscript.sep PSHED.DLL psisdecd.dll psisrndr.ax PSModuleDis > coveryProvider.dll psmodulediscoveryprovider.mof PsmServiceExtHost.dll > psmsrv.dll psr.exe pstask.dll pstorec.dll pt-BR pt-PT ptpprov.dll > puiapi.dll puiobj.dll PushToInstall.dll pwlauncher.dll pwlauncher.exe > pwrshplugin.dll pwsso.dll qappsrv.exe qasf.dll qcap.dll qdv. 
> dll qdvd.dll qedit.dll qedwipes.dll qmgr.dll qprocess.exe > QualityUpdateAssistant.dll quartz.dll Query.dll query.exe > QuickActionsDataModel.dll quickassist.exe QuietHours.dll quser.exe > qwave.dll qwinsta.exe RacEngn.dll racpldlg.dll radardt.dll radarrs.dll > RADCUI.dll ra > s rasadhlp.dll rasapi32.dll rasauto.dll rasautou.exe raschap.dll > raschapext.dll rasctrnm.h rasctrs.dll rascustom.dll rasdiag.dll rasdial.exe > rasdlg.dll raserver.exe rasgcw.dll rasman.dll rasmans.dll rasmbmgr.dll > RasMediaManager.dll RASMM.dll rasmontr.dll rasphone.exe > rasplap.dll rasppp.dll rastapi.dll rastls.dll rastlsext.dll RasToast > rdbui.dll rdpbase.dll rdpcfgex.dll rdpclip.exe rdpcore.dll rdpcorets.dll > rdpcredentialprovider.dll rdpencom.dll rdpendp.dll rdpinit.exe rdpinput.exe > rdpnano.dll RdpRelayTransport.dll RdpSa.exe RdpS > aProxy.exe RdpSaPs.dll RdpSaUacHelper.exe rdpserverbase.dll > rdpsharercom.dll rdpshell.exe rdpsign.exe rdpudd.dll rdpviewerax.dll > rdrleakdiag.exe RDSAppXHelper.dll rdsdwmdr.dll rdsxvmaudio.dll > rdvvmtransport.dll RDXService.dll RDXTaskFactory.dll ReAgent.dll ReAgentc.e > xe ReAgentTask.dll recdisc.exe recover.exe Recovery recovery.dll > RecoveryDrive.exe refsutil.exe reg.exe regapi.dll RegCtrl.dll regedt32.exe > regidle.dll regini.exe Register-CimProvider.exe regsvc.dll regsvr32.exe > reguwpapi.dll ReInfo.dll rekeywiz.exe relog.exe RelPost > .exe RemoteAppLifetimeManager.exe RemoteAppLifetimeManagerProxyStub.dll > remoteaudioendpoint.dll remotepg.dll RemotePosWorker.exe remotesp.tsp > RemoteSystemToastIcon.contrast-white.png RemoteSystemToastIcon.png > RemoteWipeCSP.dll RemovableMediaProvisioningPlugin.dll Rem > oveDeviceContextHandler.dll RemoveDeviceElevated.dll rendezvousSession.tlb > repair-bde.exe replace.exe ReportingCSP.dll RESAMPLEDMO.DLL ResBParser.dll > reset.exe reseteng.dll ResetEngine.dll ResetEngine.exe ResetEngOnline.dll > resmon.exe ResourceMapper.dll ResourcePolic > yClient.dll ResourcePolicyServer.dll ResPriHMImageList > ResPriHMImageListLowCost ResPriImageList ResPriImageListLowCost > RestartManager.mof RestartManagerUninstall.mof > RestartNowPower_80.contrast-black.png RestartNowPower_80.contrast-white.png > RestartNowPower_80.png Re > startTonight_80.png RestartTonight_80_contrast-black.png > RestartTonight_80_contrast-white.png restore resutils.dll rgb9rast.dll > Ribbons.scr riched20.dll riched32.dll rilproxy.dll RjvMDMConfig.dll > RMActivate.exe RMActivate_isv.exe RMActivate_ssp.exe RMActivate_ssp_isv > .exe RMapi.dll rmclient.dll RmClient.exe RMSRoamingSecurity.dll > rmttpmvscmgrsvr.exe rnr20.dll ro-RO RoamingSecurity.dll Robocopy.exe > rometadata.dll RotMgr.dll ROUTE.EXE RpcEpMap.dll rpchttp.dll RpcNs4.dll > rpcnsh.dll RpcPing.exe rpcrt4.dll RpcRtRemote.dll rpcss.dll rr > installer.exe rsaenh.dll rshx32.dll rsop.msc RstMwEventLogMsg.dll > RstrtMgr.dll rstrui.exe RtCOM64.dll RtDataProc64.dll rtffilt.dll > RtkApi64U.dll RtkAudUService64.exe RtkCfg64.dll rtm.dll rtmcodecs.dll > RTMediaFrame.dll rtmmvrortc.dll rtmpal.dll rtmpltfm.dll rtutils.dl > l RTWorkQ.dll ru-RU RuleBasedDS.dll runas.exe rundll32.exe > runexehelper.exe RunLegacyCPLElevated.exe runonce.exe RuntimeBroker.exe > rwinsta.exe samcli.dll samlib.dll samsrv.dll Samsung sas.dll sbe.dll > sbeio.dll sberes.dll sbservicetrigger.dll sc.exe ScanPlugin.dll sca > nsetting.dll SCardBi.dll SCardDlg.dll SCardSvr.dll ScavengeSpace.xml > scavengeui.dll ScDeviceEnum.dll scecli.dll scesrv.dll schannel.dll > schedcli.dll schedsvc.dll ScheduleTime_80.contrast-black.png > ScheduleTime_80.contrast-white.png 
ScheduleTime_80.png schtasks.exe sc > ksp.dll scripto.dll ScriptRunner.exe scrnsave.scr scrobj.dll scrptadm.dll > scrrun.dll sdbinst.exe sdchange.exe sdclt.exe sdcpl.dll SDDS.dll > sdengin2.dll SDFHost.dll sdhcinst.dll sdiageng.dll sdiagnhost.exe > sdiagprv.dll sdiagschd.dll sdohlp.dll sdrsvc.dll sdshext.dll S > earch.ProtocolHandler.MAPI2.dll SearchFilterHost.exe SearchFolder.dll > SearchIndexer.exe SearchProtocolHost.exe SebBackgroundManagerPolicy.dll > SecConfig.efi SecEdit.exe sechost.dll secinit.exe seclogon.dll secpol.msc > secproc.dll secproc_isv.dll secproc_ssp.dll secproc > _ssp_isv.dll secur32.dll SecureAssessmentHandlers.dll SecureBootUpdates > securekernel.exe SecureTimeAggregator.dll security.dll > SecurityAndMaintenance.png SecurityAndMaintenance_Alert.png > SecurityAndMaintenance_Error.png SecurityCenterBroker.dll > SecurityCenterBrokerPS > .dll SecurityHealthAgent.dll SecurityHealthHost.exe > SecurityHealthProxyStub.dll SecurityHealthService.exe SecurityHealthSSO.dll > SecurityHealthSystray.exe sedplugins.dll SEMgrPS.dll SEMgrSvc.dll > sendmail.dll Sens.dll SensApi.dll SensorDataService.exe SensorPerformance > Events.dll SensorsApi.dll SensorsClassExtension.dll SensorsCpl.dll > SensorService.dll SensorsNativeApi.dll SensorsNativeApi.V2.dll > SensorsUtilsV2.dll sensrsvc.dll serialui.dll services.exe services.msc > ServicingUAPI.dll serwvdrv.dll SessEnv.dll sessionmsg.exe setbcdlo > cale.dll sethc.exe SetNetworkLocation.dll SetNetworkLocationFlyout.dll > SetProxyCredential.dll setspn.exe SettingMonitor.dll settings.dat > SettingsEnvironment.Desktop.dll SettingsExtensibilityHandlers.dll > SettingsHandlers_Accessibility.dll SettingsHandlers_AnalogShell. > dll SettingsHandlers_AppControl.dll SettingsHandlers_AppExecutionAlias.dll > SettingsHandlers_AssignedAccess.dll SettingsHandlers_Authentication.dll > SettingsHandlers_BackgroundApps.dll SettingsHandlers_BatteryUsage.dll > SettingsHandlers_BrowserDeclutter.dll SettingsHand > lers_CapabilityAccess.dll SettingsHandlers_Clipboard.dll > SettingsHandlers_ClosedCaptioning.dll > SettingsHandlers_ContentDeliveryManager.dll SettingsHandlers_Cortana.dll > SettingsHandlers_Devices.dll SettingsHandlers_Display.dll > SettingsHandlers_Flights.dll SettingsHand > lers_Fonts.dll SettingsHandlers_ForceSync.dll SettingsHandlers_Gaming.dll > SettingsHandlers_Geolocation.dll SettingsHandlers_Gpu.dll > SettingsHandlers_HoloLens_Environment.dll SettingsHandlers_IME.dll > SettingsHandlers_InkingTypingPrivacy.dll SettingsHandlers_InputPerso > nalization.dll SettingsHandlers_Language.dll > SettingsHandlers_ManagePhone.dll SettingsHandlers_Maps.dll > SettingsHandlers_Mouse.dll SettingsHandlers_Notifications.dll > SettingsHandlers_nt.dll SettingsHandlers_OneCore_BatterySaver.dll > SettingsHandlers_OneCore_PowerAndSl > eep.dll SettingsHandlers_OneDriveBackup.dll > SettingsHandlers_OptionalFeatures.dll SettingsHandlers_PCDisplay.dll > SettingsHandlers_Pen.dll SettingsHandlers_QuickActions.dll > SettingsHandlers_Region.dll SettingsHandlers_SharedExperiences_Rome.dll > SettingsHandlers_SIUF.d > ll SettingsHandlers_SpeechPrivacy.dll SettingsHandlers_Startup.dll > SettingsHandlers_StorageSense.dll SettingsHandlers_Troubleshoot.dll > SettingsHandlers_User.dll SettingsHandlers_UserAccount.dll > SettingsHandlers_UserExperience.dll SettingsHandlers_WorkAccess.dll Setti > ngSync.dll SettingSyncCore.dll SettingSyncDownloadHelper.dll > SettingSyncHost.exe setup setupapi.dll setupcl.dll setupcl.exe setupcln.dll > setupetw.dll setupugc.exe setx.exe sfc.dll sfc.exe 
sfc_os.dll Sgrm > SgrmBroker.exe SgrmEnclave.dll SgrmEnclave_secure.dll SgrmLpac. > exe shacct.dll shacctprofile.dll SharedPCCSP.dll SharedRealitySvc.dll > ShareHost.dll sharemediacpl.dll SHCore.dll shdocvw.dll shell32.dll > ShellAppRuntime.exe ShellCommonCommonProxyStub.dll ShellExperiences > shellstyle.dll shfolder.dll shgina.dll ShiftJIS.uce shimeng.dl > l shimgvw.dll shlwapi.dll shpafact.dll shrpubw.exe shsetup.dll shsvcs.dll > shunimpl.dll shutdown.exe shutdownext.dll shutdownux.dll shwebsvc.dll si-lk > signdrv.dll sigverif.exe SIHClient.exe sihost.exe SimAuth.dll SimCfg.dll > simpdata.tlb sk-SK skci.dll sl-SI slc.dll sl > cext.dll SleepStudy SlideToShutDown.exe slmgr slmgr.vbs slui.exe slwga.dll > SmallRoom.bin SmartCardBackgroundPolicy.dll SmartcardCredentialProvider.dll > SmartCardSimulator.dll smartscreen.exe smartscreenps.dll SMBHelperClass.dll > smbwmiv2.dll SMI SmiEngine.dll smphost.d > ll SmsRouterSvc.dll smss.exe SndVol.exe SndVolSSO.dll SnippingTool.exe > snmpapi.dll snmptrap.exe Snooze_80.contrast-black.png > Snooze_80.contrast-white.png Snooze_80.png socialapis.dll softkbd.dll > softpub.dll sort.exe SortServer2003Compat.dll SortWindows61.dll SortWind > ows62.dll SortWindows64.dll SortWindows6Compat.dll SpaceAgent.exe > spacebridge.dll SpaceControl.dll spaceman.exe SpatialAudioLicenseSrv.exe > SpatializerApo.dll SpatialStore.dll spbcd.dll > SpeakersSystemToastIcon.contrast-white.png SpeakersSystemToastIcon.png > Spectrum.ex > e SpectrumSyncClient.dll Speech SpeechPal.dll Speech_OneCore spfileq.dll > spinf.dll spmpm.dll spnet.dll spool spoolss.dll spoolsv.exe spopk.dll spp > spp.dll sppc.dll sppcext.dll sppcomapi.dll sppcommdlg.dll SppExtComObj.Exe > sppinst.dll sppnp.dll sppobjs.dll sppsvc.exe > sppui sppwinob.dll sppwmi.dll spwinsat.dll spwizeng.dll spwizimg.dll > spwizres.dll spwmp.dll SqlServerSpatial130.dll SqlServerSpatial150.dll > sqlsrv32.dll sqlsrv32.rll sqmapi.dll sr-Latn-RS srchadmin.dll srclient.dll > srcore.dll srdelayed.exe SrEvents.dll SRH.dll srhelp > er.dll srm.dll srmclient.dll srmlib.dll srms-apr-v.dat srms-apr.dat > srms.dat srmscan.dll srmshell.dll srmstormod.dll srmtrace.dll srm_ps.dll > srpapi.dll SrpUxNativeSnapIn.dll srrstr.dll SrTasks.exe sru srumapi.dll > srumsvc.dll srvcli.dll srvsvc.dll srwmi.dll sscore.dll > sscoreext.dll ssdm.dll ssdpapi.dll ssdpsrv.dll sspicli.dll sspisrv.dll > SSShim.dll ssText3d.scr sstpsvc.dll StartTileData.dll Startupscan.dll > StateRepository.Core.dll stclient.dll stdole2.tlb stdole32.tlb sti.dll > sti_ci.dll stobject.dll StorageContextHandler.dll Stor > ageUsage.dll storagewmi.dll storagewmi_passthru.dll stordiag.exe > storewuauth.dll Storprop.dll StorSvc.dll streamci.dll > StringFeedbackEngine.dll StructuredQuery.dll SubRange.uce subst.exe sud.dll > sv-SE SvBannerBackground.png svchost.exe svf.dll svsvc.dll SwitcherDataM > odel.dll swprv.dll sxproxy.dll sxs.dll sxshared.dll sxssrv.dll > sxsstore.dll sxstrace.exe SyncAppvPublishingServer.exe > SyncAppvPublishingServer.vbs SyncCenter.dll SyncController.dll SyncHost.exe > SyncHostps.dll SyncInfrastructure.dll SyncInfrastructureps.dll SyncProxy. 
> dll Syncreg.dll SyncRes.dll SyncSettings.dll syncutil.dll sysclass.dll > sysdm.cpl SysFxUI.dll sysmain.dll sysmon.ocx sysntfy.dll Sysprep > sysprint.sep sysprtj.sep SysResetErr.exe syssetup.dll systemcpl.dll > SystemEventsBrokerClient.dll SystemEventsBrokerServer.dll syste > minfo.exe SystemPropertiesAdvanced.exe SystemPropertiesComputerName.exe > SystemPropertiesDataExecutionPrevention.exe SystemPropertiesHardware.exe > SystemPropertiesPerformance.exe SystemPropertiesProtection.exe > SystemPropertiesRemote.exe systemreset.exe SystemResetPlatf > orm SystemSettings.DataModel.dll > SystemSettings.DeviceEncryptionHandlers.dll SystemSettings.Handlers.dll > SystemSettings.SettingsExtensibility.dll > SystemSettings.UserAccountsHandlers.dll SystemSettingsAdminFlows.exe > SystemSettingsBroker.exe SystemSettingsRemoveDevice. > exe SystemSettingsThresholdAdminFlowUI.dll SystemSupportInfo.dll > SystemUWPLauncher.exe systray.exe t2embed.dll ta-in ta-lk Tabbtn.dll > TabbtnEx.dll tabcal.exe TabletPC.cpl TabSvc.dll takeown.exe tapi3.dll > tapi32.dll tapilua.dll TapiMigPlugin.dll tapiperf.dll tapisrv.d > ll TapiSysprep.dll tapiui.dll TapiUnattend.exe tar.exe TaskApis.dll > taskbarcpl.dll taskcomp.dll TaskFlowDataEngine.dll taskhostw.exe > taskkill.exe tasklist.exe Taskmgr.exe Tasks taskschd.dll taskschd.msc > TaskSchdPS.dll tbauth.dll tbs.dll tcblaunch.exe tcbloader.dll tc > msetup.exe tcpbidi.xml tcpipcfg.dll tcpmib.dll tcpmon.dll tcpmon.ini > tcpmonui.dll TCPSVCS.EXE tdc.ocx tdh.dll TDLMigration.dll > TEEManagement64.dll telephon.cpl TelephonyInteractiveUser.dll > TelephonyInteractiveUserRes.dll tellib.dll > TempSignedLicenseExchangeTask.dll T > enantRestrictionsPlugin.dll termmgr.dll termsrv.dll tetheringclient.dll > tetheringconfigsp.dll TetheringIeProvider.dll TetheringMgr.dll > tetheringservice.dll TetheringStation.dll TextInputFramework.dll > TextInputMethodFormatter.dll TextShaping.dll th-TH themecpl.dll The > mes.SsfDownload.ScheduledTask.dll themeservice.dll themeui.dll > ThirdPartyNoticesBySHS.txt threadpoolwinrt.dll thumbcache.dll > ThumbnailExtractionHost.exe ti-et tier2punctuations.dll > TieringEngineProxy.dll TieringEngineService.exe TileDataRepository.dll > TimeBrokerClien > t.dll TimeBrokerServer.dll timedate.cpl TimeDateMUICallback.dll > timeout.exe timesync.dll TimeSyncTask.dll TKCtrl2k64.sys TKFsAv64.sys > TKFsFt64.sys TKFWFV.inf TKFWFV64.cat TKFWFV64.sys tkfwvt64.sys > TKIdsVt64.sys TKPcFtCb64.sys TKPcFtCb64.sys_ TKPcFtHk64.sys TKRgAc2k64 > .sys TKRgFtXp64.sys TKTool2k.sys TKTool2k64.sys tlscsp.dll > tokenbinding.dll TokenBroker.dll TokenBrokerCookies.exe TokenBrokerUI.dll > tpm.msc TpmCertResources.dll tpmcompc.dll TpmCoreProvisioning.dll > TpmInit.exe TpmTasks.dll TpmTool.exe tpmvsc.dll tpmvscmgr.exe tpmvsc > mgrsvr.exe tquery.dll tr-TR tracerpt.exe TRACERT.EXE traffic.dll > TransformPPSToWlan.xslt TransformPPSToWlanCredentials.xslt > TransliterationRanker.dll TransportDSA.dll tree.com trie.dll trkwks.dll > TrustedSignalCredProv.dll tsbyuv.dll tscfgwmi.dll tscon.exe tsdiscon.ex > e TSErrRedir.dll tsf3gip.dll tsgqec.dll tskill.exe tsmf.dll TSpkg.dll > tspubwmi.dll TSSessionUX.dll tssrvlic.dll TSTheme.exe > TsUsbGDCoInstaller.dll TsUsbRedirectionGroupPolicyExtension.dll > TSWbPrxy.exe TSWorkspace.dll TsWpfWrp.exe ttdinject.exe ttdloader.dll > ttdplm.dl > l ttdrecord.dll ttdrecordcpu.dll TtlsAuth.dll TtlsCfg.dll TtlsExt.dll > tttracer.exe tvratings.dll twext.dll twinapi.appcore.dll twinapi.dll > twinui.appcore.dll twinui.dll twinui.pcshell.dll txflog.dll txfw32.dll > typeperf.exe 
tzautoupdate.dll tzres.dll tzsync.exe tzsync > res.dll tzutil.exe ubpm.dll ucmhc.dll ucrtbase.dll ucrtbased.dll > ucrtbase_clr0400.dll ucrtbase_enclave.dll ucsvc.exe udhisapi.dll uDWM.dll > UefiCsp.dll UevAgentPolicyGenerator.exe UevAppMonitor.exe > UevAppMonitor.exe.config UevCustomActionTypes.tlb UevTemplateBaselineG > enerator.exe UevTemplateConfigItemGenerator.exe uexfat.dll ufat.dll > UiaManager.dll UIAnimation.dll UIAutomationCore.dll uicom.dll > UIManagerBrokerps.dll UIMgrBroker.exe uireng.dll UIRibbon.dll > UIRibbonRes.dll uk-UA ulib.dll umb.dll umdmxfrm.dll umpdc.dll umpnpmgr.dll > umpo-overrides.dll umpo.dll umpoext.dll umpowmi.dll umrdp.dll unattend.dll > unenrollhook.dll unimdm.tsp unimdmat.dll uniplat.dll Unistore.dll > unlodctr.exe UNP unregmp2.exe untfs.dll UpdateAgent.dll updatecsp.dll > UpdateDeploymentProvider.dll UpdateHeartbeat.dll updatep > olicy.dll upfc.exe UpgradeResultsUI.exe upnp.dll upnpcont.exe upnphost.dll > UPPrinterInstaller.exe UPPrinterInstallsCSP.dll upshared.dll uReFS.dll > uReFSv1.dll ureg.dll url.dll urlmon.dll UsbCApi.dll usbceip.dll usbmon.dll > usbperf.dll UsbPmApi.dll UsbSettingsHandlers.d > ll UsbTask.dll usbui.dll user32.dll UserAccountBroker.exe > UserAccountControlSettings.dll UserAccountControlSettings.exe > useractivitybroker.dll usercpl.dll UserDataAccessRes.dll > UserDataAccountApis.dll UserDataLanguageUtil.dll > UserDataPlatformHelperUtil.dll UserDataSe > rvice.dll UserDataTimeUtil.dll UserDataTypeHelperUtil.dll > UserDeviceRegistration.dll UserDeviceRegistration.Ngc.dll userenv.dll > userinit.exe userinitext.dll UserLanguageProfileCallback.dll usermgr.dll > usermgrcli.dll UserMgrProxy.dll usk.rs usoapi.dll UsoClient.exe us > ocoreps.dll usocoreworker.exe usosvc.dll usp10.dll ustprov.dll > UtcDecoderHost.exe UtcManaged.dll utcutil.dll utildll.dll Utilman.exe > uudf.dll UvcModel.dll uwfcfgmgmt.dll uwfcsp.dll uwfservicingapi.dll > UXInit.dll uxlib.dll uxlibres.dll uxtheme.dll vac.dll VAN.dll Vaul > t.dll VaultCDS.dll vaultcli.dll VaultCmd.exe VaultRoaming.dll vaultsvc.dll > VBICodec.ax vbisurf.ax vbsapi.dll vbscript.dll vbssysprep.dll > vcamp120.dll vcamp140.dll vcamp140d.dll VCardParser.dll vccorlib110.dll > vccorlib120.dll vccorlib140.dll vccorlib140d.dll vcomp100. 
> dll vcomp110.dll vcomp120.dll vcomp140.dll vcomp140d.dll vcruntime140.dll > vcruntime140d.dll vcruntime140_1.dll vcruntime140_1d.dll > vcruntime140_clr0400.dll vds.exe vdsbas.dll vdsdyn.dll vdsldr.exe > vdsutil.dll vdsvd.dll vds_ps.dll verclsid.exe verifier.dll verifier.ex > e verifiergui.exe version.dll vertdll.dll vfbasics.dll vfcompat.dll > vfcuzz.dll vfluapriv.dll vfnet.dll vfntlmless.dll vfnws.dll vfprint.dll > vfprintpthelper.dll vfrdvcompat.dll vfuprov.dll vfwwdm32.dll VhfUm.dll > vid.dll vidcap.ax VideoHandlers.dll VIDRESZR.DLL virtdis > k.dll VirtualMonitorManager.dll VmApplicationHealthMonitorProxy.dll > vmbuspipe.dll vmdevicehost.dll vmictimeprovider.dll vmrdvcore.dll > VocabRoamingHandler.dll VoiceActivationManager.dll VoipRT.dll vpnike.dll > vpnikeapi.dll VpnSohDesktop.dll VPNv2CSP.dll vrfcore.dll Vsc > MgrPS.dll vscover160.dll VSD3DWARPDebug.dll VsGraphicsCapture.dll > VsGraphicsDesktopEngine.exe VsGraphicsExperiment.dll VsGraphicsHelper.dll > VsGraphicsProxyStub.dll VsGraphicsRemoteEngine.exe vsjitdebugger.exe > VSPerf160.dll vssadmin.exe vssapi.dll vsstrace.dll VSSVC.e > xe vss_ps.dll vulkan-1-999-0-0-0.dll vulkan-1.dll > vulkaninfo-1-999-0-0-0.exe vulkaninfo.exe w32time.dll w32tm.exe w32topl.dll > WaaSAssessment.dll WaaSMedicAgent.exe WaaSMedicCapsule.dll WaaSMedicPS.dll > WaaSMedicSvc.dll WABSyncProvider.dll waitfor.exe WalletBackgroundS > erviceProxy.dll WalletProxy.dll WalletService.dll WallpaperHost.exe > wavemsp.dll wbadmin.exe wbem wbemcomn.dll wbengine.exe wbiosrvc.dll wci.dll > wcimage.dll wcmapi.dll wcmcsp.dll wcmsvc.dll WCN WcnApi.dll wcncsvc.dll > WcnEapAuthProxy.dll WcnEapPeerProxy.dll WcnNetsh.dl > l wcnwiz.dll wc_storage.dll wdc.dll WDI wdi.dll wdigest.dll wdmaud.drv > wdscore.dll WdsUnattendTemplate.xml WEB.rs webauthn.dll WebcamUi.dll > webcheck.dll WebClnt.dll webio.dll webplatstorageserver.dll > WebRuntimeManager.dll webservices.dll Websocket.dll wecapi.dll wecs > vc.dll wecutil.exe wephostsvc.dll wer.dll werconcpl.dll wercplsupport.dll > werdiagcontroller.dll WerEnc.dll weretw.dll WerFault.exe WerFaultSecure.exe > wermgr.exe wersvc.dll werui.dll wevtapi.dll wevtfwd.dll wevtsvc.dll > wevtutil.exe wextract.exe WF.msc wfapigp.dll wfdp > rov.dll WFDSConMgr.dll WFDSConMgrSvc.dll WfHC.dll WFS.exe WFSR.dll > whealogr.dll where.exe whhelper.dll whoami.exe wiaacmgr.exe wiaaut.dll > wiadefui.dll wiadss.dll WiaExtensionHost64.dll wiarpc.dll > wiascanprofiles.dll wiaservc.dll wiashext.dll wiatrace.dll wiawow64.exe > WiFiCloudStore.dll WiFiConfigSP.dll wifidatacapabilityhandler.dll > WiFiDisplay.dll wifinetworkmanager.dll wifitask.exe WimBootCompress.ini > wimgapi.dll wimserv.exe win32appinventorycsp.dll > Win32AppSettingsProvider.dll Win32CompatibilityAppraiserCSP.dll win32k.sys > win3 > 2kbase.sys win32kfull.sys win32kns.sys win32spl.dll win32u.dll > Win32_DeviceGuard.dll winbio.dll WinBioDatabase WinBioDataModel.dll > WinBioDataModelOOBE.exe winbioext.dll WinBioPlugIns winbrand.dll > wincorlib.dll wincredprovider.dll wincredui.dll WindowManagement.dll Wi > ndowManagementAPI.dll Windows.AccountsControl.dll > Windows.AI.MachineLearning.dll Windows.AI.MachineLearning.Preview.dll > Windows.ApplicationModel.Background.SystemEventsBroker.dll > Windows.ApplicationModel.Background.TimeBroker.dll > Windows.ApplicationModel.Conversation > alAgent.dll > windows.applicationmodel.conversationalagent.internal.proxystub.dll > windows.applicationmodel.conversationalagent.proxystub.dll > Windows.ApplicationModel.Core.dll windows.applicationmodel.datatransfer.dll > 
Windows.ApplicationModel.dll Windows.ApplicationMode > l.LockScreen.dll Windows.ApplicationModel.Store.dll > Windows.ApplicationModel.Store.Preview.DOSettings.dll > Windows.ApplicationModel.Store.TestingFramework.dll > Windows.ApplicationModel.Wallet.dll Windows.CloudStore.dll > Windows.CloudStore.Schema.DesktopShell.dll Windows > .CloudStore.Schema.Shell.dll Windows.Cortana.Desktop.dll > Windows.Cortana.OneCore.dll Windows.Cortana.ProxyStub.dll > Windows.Data.Activities.dll Windows.Data.Pdf.dll > Windows.Devices.AllJoyn.dll Windows.Devices.Background.dll > Windows.Devices.Background.ps.dll Windows.De > vices.Bluetooth.dll Windows.Devices.Custom.dll > Windows.Devices.Custom.ps.dll Windows.Devices.Enumeration.dll > Windows.Devices.Haptics.dll Windows.Devices.HumanInterfaceDevice.dll > Windows.Devices.Lights.dll Windows.Devices.LowLevel.dll > Windows.Devices.Midi.dll Windows. > Devices.Perception.dll Windows.Devices.Picker.dll > Windows.Devices.PointOfService.dll Windows.Devices.Portable.dll > Windows.Devices.Printers.dll Windows.Devices.Printers.Extensions.dll > Windows.Devices.Radios.dll Windows.Devices.Scanners.dll > Windows.Devices.Sensors.dll > Windows.Devices.SerialCommunication.dll Windows.Devices.SmartCards.dll > Windows.Devices.SmartCards.Phone.dll Windows.Devices.Usb.dll > Windows.Devices.WiFi.dll Windows.Devices.WiFiDirect.dll Windows.Energy.dll > Windows.FileExplorer.Common.dll Windows.Gaming.Input.dll Win > dows.Gaming.Preview.dll Windows.Gaming.UI.GameBar.dll > Windows.Gaming.XboxLive.Storage.dll Windows.Globalization.dll > Windows.Globalization.Fontgroups.dll > Windows.Globalization.PhoneNumberFormatting.dll > Windows.Graphics.Display.BrightnessOverride.dll Windows.Graphics.D > isplay.DisplayEnhancementOverride.dll Windows.Graphics.dll > Windows.Graphics.Printing.3D.dll Windows.Graphics.Printing.dll > Windows.Graphics.Printing.Workflow.dll > Windows.Graphics.Printing.Workflow.Native.dll Windows.Help.Runtime.dll > windows.immersiveshell.serviceprovi > der.dll Windows.Internal.AdaptiveCards.XamlCardRenderer.dll > Windows.Internal.Bluetooth.dll Windows.Internal.CapturePicker.Desktop.dll > Windows.Internal.CapturePicker.dll Windows.Internal.Devices.Sensors.dll > Windows.Internal.Feedback.Analog.dll Windows.Internal.Feedbac > k.Analog.ProxyStub.dll > Windows.Internal.Graphics.Display.DisplayColorManagement.dll > Windows.Internal.Graphics.Display.DisplayEnhancementManagement.dll > Windows.Internal.Management.dll > Windows.Internal.Management.SecureAssessment.dll > Windows.Internal.PlatformExtension. 
> DevicePickerExperience.dll > Windows.Internal.PlatformExtension.MiracastBannerExperience.dll > Windows.Internal.PredictionUnit.dll > Windows.Internal.Security.Attestation.DeviceAttestation.dll > Windows.Internal.SecurityMitigationsBroker.dll > Windows.Internal.Shell.Broker.dll > windows.internal.shellcommon.AccountsControlExperience.dll > windows.internal.shellcommon.AppResolverModal.dll > Windows.Internal.ShellCommon.Broker.dll > windows.internal.shellcommon.FilePickerExperienceMEM.dll > Windows.Internal.ShellCommon.PrintExperience.dll windows.int > ernal.shellcommon.shareexperience.dll > windows.internal.shellcommon.TokenBrokerModal.dll > Windows.Internal.Signals.dll Windows.Internal.System.UserProfile.dll > Windows.Internal.Taskbar.dll > Windows.Internal.UI.BioEnrollment.ProxyStub.dll > Windows.Internal.UI.Logon.ProxySt > ub.dll Windows.Internal.UI.Shell.WindowTabManager.dll > Windows.Management.EnrollmentStatusTracking.ConfigProvider.dll > Windows.Management.InprocObjects.dll > Windows.Management.ModernDeployment.ConfigProviders.dll > Windows.Management.Provisioning.ProxyStub.dll Windows.Man > agement.SecureAssessment.CfgProvider.dll > Windows.Management.SecureAssessment.Diagnostics.dll > Windows.Management.Service.dll Windows.Management.Workplace.dll > Windows.Management.Workplace.WorkplaceSettings.dll Windows.Media.Audio.dll > Windows.Media.BackgroundMediaPlayba > ck.dll Windows.Media.BackgroundPlayback.exe Windows.Media.Devices.dll > Windows.Media.dll Windows.Media.Editing.dll Windows.Media.FaceAnalysis.dll > Windows.Media.Import.dll Windows.Media.MediaControl.dll > Windows.Media.MixedRealityCapture.dll Windows.Media.Ocr.dll Window > s.Media.Playback.BackgroundMediaPlayer.dll > Windows.Media.Playback.MediaPlayer.dll Windows.Media.Playback.ProxyStub.dll > Windows.Media.Protection.PlayReady.dll Windows.Media.Renewal.dll > Windows.Media.Speech.dll Windows.Media.Speech.UXRes.dll > Windows.Media.Streaming.dll > Windows.Media.Streaming.ps.dll Windows.Mirage.dll > Windows.Mirage.Internal.Capture.Pipeline.ProxyStub.dll > Windows.Mirage.Internal.dll > Windows.Networking.BackgroundTransfer.BackgroundManagerPolicy.dll > Windows.Networking.BackgroundTransfer.ContentPrefetchTask.dll Windo > ws.Networking.BackgroundTransfer.dll Windows.Networking.Connectivity.dll > Windows.Networking.dll Windows.Networking.HostName.dll > Windows.Networking.NetworkOperators.ESim.dll > Windows.Networking.NetworkOperators.HotspotAuthentication.dll > Windows.Networking.Proximity.dll > Windows.Networking.ServiceDiscovery.Dnssd.dll > Windows.Networking.Sockets.PushEnabledApplication.dll > Windows.Networking.UX.EapRequestHandler.dll Windows.Networking.Vpn.dll > Windows.Networking.XboxLive.ProxyStub.dll Windows.Payments.dll > Windows.Perception.Stub.dll Wind > ows.Security.Authentication.Identity.Provider.dll > Windows.Security.Authentication.OnlineId.dll > Windows.Security.Authentication.Web.Core.dll > Windows.Security.Credentials.UI.CredentialPicker.dll > Windows.Security.Credentials.UI.UserConsentVerifier.dll Windows.Security.I > ntegrity.dll Windows.Services.TargetedContent.dll > Windows.SharedPC.AccountManager.dll Windows.SharedPC.CredentialProvider.dll > Windows.Shell.BlueLightReduction.dll Windows.Shell.ServiceHostBuilder.dll > Windows.Shell.StartLayoutPopulationEvents.dll Windows.StateReposito > ry.dll Windows.StateRepositoryBroker.dll Windows.StateRepositoryClient.dll > Windows.StateRepositoryCore.dll Windows.StateRepositoryPS.dll > Windows.StateRepositoryUpgrade.dll Windows.Storage.ApplicationData.dll > 
Windows.Storage.Compression.dll windows.storage.dll Windows > .Storage.OneCore.dll Windows.Storage.Search.dll > Windows.System.Diagnostics.dll > Windows.System.Diagnostics.Telemetry.PlatformTelemetryClient.dll > Windows.System.Diagnostics.TraceReporting.PlatformDiagnosticActions.dll > Windows.System.Launcher.dll Windows.System.Profile. > HardwareId.dll > Windows.System.Profile.PlatformDiagnosticsAndUsageDataSettings.dll > Windows.System.Profile.RetailInfo.dll Windows.System.Profile.SystemId.dll > Windows.System.Profile.SystemManufacturers.dll > Windows.System.RemoteDesktop.dll Windows.System.SystemManagement > .dll Windows.System.UserDeviceAssociation.dll > Windows.System.UserProfile.DiagnosticsSettings.dll > Windows.UI.Accessibility.dll Windows.UI.AppDefaults.dll > Windows.UI.BioFeedback.dll Windows.UI.BlockedShutdown.dll > Windows.UI.Core.TextInput.dll Windows.UI.Cred.dll Window > s.UI.CredDialogController.dll Windows.UI.dll Windows.UI.FileExplorer.dll > Windows.UI.Immersive.dll Windows.UI.Input.Inking.Analysis.dll > Windows.UI.Input.Inking.dll Windows.UI.Internal.Input.ExpressiveInput.dll > Windows.UI.Internal.Input.ExpressiveInput.Resource.dll Win > dows.UI.Logon.dll Windows.UI.NetworkUXController.dll > Windows.UI.PicturePassword.dll Windows.UI.Search.dll Windows.UI.Shell.dll > Windows.UI.Shell.Internal.AdaptiveCards.dll Windows.UI.Storage.dll > Windows.UI.Xaml.Controls.dll Windows.UI.Xaml.dll Windows.UI.Xaml.InkContr > ols.dll Windows.UI.Xaml.Maps.dll Windows.UI.Xaml.Phone.dll > Windows.UI.Xaml.Resources.19h1.dll Windows.UI.Xaml.Resources.Common.dll > Windows.UI.Xaml.Resources.rs1.dll Windows.UI.Xaml.Resources.rs2.dll > Windows.UI.Xaml.Resources.rs3.dll Windows.UI.Xaml.Resources.rs4.dll > Windows.UI.Xaml.Resources.rs5.dll Windows.UI.Xaml.Resources.th.dll > Windows.UI.Xaml.Resources.win81.dll Windows.UI.Xaml.Resources.win8rtm.dll > Windows.UI.XamlHost.dll Windows.WARP.JITService.dll > Windows.WARP.JITService.exe Windows.Web.Diagnostics.dll Windows.Web.dll Wi > ndows.Web.Http.dll WindowsActionDialog.exe WindowsCodecs.dll > WindowsCodecsExt.dll WindowsCodecsRaw.dll WindowsCodecsRaw.txt > WindowsDefaultHeatProcessor.dll windowsdefenderapplicationguardcsp.dll > WindowsInternal.ComposableShell.ComposerFramework.dll WindowsInternal.Co > mposableShell.DesktopHosting.dll > WindowsInternal.Shell.CompUiActivation.dll WindowsIoTCsp.dll > windowslivelogin.dll WindowsManagementServiceWinRt.ProxyStub.dll > windowsperformancerecordercontrol.dll WindowsPowerShell > WindowsSecurityIcon.png windowsudk.shellcommon.dll W > indowsUpdateElevatedInstaller.exe winethc.dll winevt WinFax.dll > winhttp.dll winhttpcom.dll WinHvEmulation.dll WinHvPlatform.dll wininet.dll > wininetlui.dll wininit.exe wininitext.dll winipcfile.dll winipcsecproc.dll > winipsec.dll winjson.dll Winlangdb.dll winload.efi w > inload.exe winlogon.exe winlogonext.dll winmde.dll WinMetadata winml.dll > winmm.dll winmmbase.dll winmsipc.dll WinMsoIrmProtector.dll winnlsres.dll > winnsi.dll WinOpcIrmProtector.dll WinREAgent.dll winresume.efi > winresume.exe winrm winrm.cmd winrm.vbs winrnr.dll winrs. 
> exe winrscmd.dll winrshost.exe winrsmgr.dll winrssrv.dll > WinRTNetMUAHostServer.exe WinRtTracing.dll WinSAT.exe WinSATAPI.dll > WinSCard.dll WinSetupUI.dll winshfhc.dll winsku.dll winsockhc.dll > winspool.drv winsqlite3.dll WINSRPC.DLL winsrv.dll winsrvext.dll winsta.dll > WinSync.dll WinSyncMetastore.dll WinSyncProviders.dll wintrust.dll > WinTypes.dll winusb.dll winver.exe WiredNetworkCSP.dll wisp.dll > witnesswmiv2provider.dll wkscli.dll wkspbroker.exe wkspbrokerAx.dll > wksprt.exe wksprtPS.dll wkssvc.dll wlanapi.dll wlancfg.dll WLanConn. > dll wlandlg.dll wlanext.exe wlangpui.dll WLanHC.dll wlanhlp.dll > WlanMediaManager.dll WlanMM.dll wlanmsm.dll wlanpref.dll > WlanRadioManager.dll wlansec.dll wlansvc.dll wlansvcpal.dll wlanui.dll > wlanutil.dll Wldap32.dll wldp.dll wlgpclnt.dll wlidcli.dll wlidcredprov.dll > wlidfdp.dll wlidnsp.dll wlidprov.dll wlidres.dll wlidsvc.dll wlrmdr.exe > WMADMOD.DLL WMADMOE.DLL WMALFXGFXDSP.dll WMASF.DLL wmcodecdspps.dll > wmdmlog.dll wmdmps.dll wmdrmsdk.dll wmerror.dll wmi.dll wmiclnt.dll > wmicmiplugin.dll wmidcom.dll wmidx.dll WmiMgmt.msc wmiprop > .dll wmitomi.dll WMNetMgr.dll wmp.dll WMPDMC.exe WmpDui.dll wmpdxm.dll > wmpeffects.dll WMPhoto.dll wmploc.DLL wmpps.dll wmpshell.dll wmsgapi.dll > WMSPDMOD.DLL WMSPDMOE.DLL WMVCORE.DLL WMVDECOD.DLL wmvdspa.dll WMVENCOD.DLL > WMVSDECD.DLL WMVSENCD.DLL WMVXENCD.DLL WofTasks > .dll WofUtil.dll WordBreakers.dll WorkFolders.exe WorkfoldersControl.dll > WorkFoldersGPExt.dll WorkFoldersRes.dll WorkFoldersShell.dll > workfolderssvc.dll wosc.dll wow64.dll wow64cpu.dll wow64win.dll > wowreg32.exe WpAXHolder.dll wpbcreds.dll Wpc.dll WpcApi.dll wpcatltoa > st.png WpcDesktopMonSvc.dll WpcMon.exe wpcmon.png WpcProxyStubs.dll > WpcRefreshTask.dll WpcTok.exe WpcWebFilter.dll wpdbusenum.dll WpdMtp.dll > WpdMtpUS.dll wpdshext.dll WPDShextAutoplay.exe WPDShServiceObj.dll > WPDSp.dll wpd_ci.dll wpnapps.dll wpnclient.dll wpncore.dll > wpninprc.dll wpnpinst.exe wpnprv.dll wpnservice.dll wpnsruprov.dll > WpnUserService.dll WpPortingLibrary.dll WppRecorderUM.dll wpr.config.xml > wpr.exe WPTaskScheduler.dll wpx.dll write.exe ws2help.dll ws2_32.dll > wscadminui.exe wscapi.dll wscinterop.dll wscisvif.dll WSCl > ient.dll WSCollect.exe wscproxystub.dll wscript.exe wscsvc.dll wscui.cpl > WSDApi.dll wsdchngr.dll WSDPrintProxy.DLL WsdProviderUtil.dll > WSDScanProxy.dll wsecedit.dll wsepno.dll wshbth.dll wshcon.dll wshelper.dll > wshext.dll wshhyperv.dll wship6.dll wshom.ocx wshqos.dll > wshrm.dll WSHTCPIP.DLL wshunix.dll wsl.exe wslapi.dll WsmAgent.dll > wsmanconfig_schema.xml WSManHTTPConfig.exe WSManMigrationPlugin.dll > WsmAuto.dll wsmplpxy.dll wsmprovhost.exe WsmPty.xsl WsmRes.dll WsmSvc.dll > WsmTxt.xsl WsmWmiPl.dll wsnmp32.dll wsock32.dll wsplib.dl > l wsp_fs.dll wsp_health.dll wsp_sr.dll wsqmcons.exe WSReset.exe > WSTPager.ax wtsapi32.dll wuapi.dll wuapihost.exe wuauclt.exe wuaueng.dll > wuceffects.dll WUDFCoinstaller.dll WUDFCompanionHost.exe WUDFHost.exe > WUDFPlatform.dll WudfSMCClassExt.dll WUDFx.dll WUDFx02000.dl > l wudriver.dll wups.dll wups2.dll wusa.exe wuuhext.dll > wuuhosdeployment.dll wvc.dll WwaApi.dll WwaExt.dll WWAHost.exe WWanAPI.dll > wwancfg.dll wwanconn.dll WWanHC.dll wwanmm.dll Wwanpref.dll wwanprotdim.dll > WwanRadioManager.dll wwansvc.dll wwapi.dll XamlTileRender.dll XAudio2_8.dll > XAudio2_9.dll XblAuthManager.dll XblAuthManagerProxy.dll > XblAuthTokenBrokerExt.dll XblGameSave.dll XblGameSaveExt.dll > XblGameSaveProxy.dll XblGameSaveTask.exe XboxGipRadioManager.dll > xboxgipsvc.dll xboxgipsynthetic.dll 
XboxNetApiSvc.dll xcopy.exe > XInput1_4.dll XInput9_1_0.dll XInputUap.dll xmlfilter.dll xmllite.dll > xmlprovi.dll xolehlp.dll XpsDocumentTargetPrint.dll XpsGdiConverter.dll > XpsPrint.dll xpspushlayer.dll XpsRasterService.dll xpsservices.dll > XpsToPclmConverter.dll XpsToPwgrConverter.dll xwizard.dtd xwizard.exe > xwizards.dll xwreg.dll xwtpdui.dll xwtpw32.dll X_80.contrast-black.png > X_80.contrast-white.png X_80.png ze_loader.dll ze_tracing_layer.dll > ze_validation_layer.dll zh-CN zh-TW zipcontainer.dll zipfldr.dll > ztrace_maps.dll > /cygdrive/c/Windows: addins AhnInst.log appcompat Application Data > apppatch AppReadiness assembly bcastdvr bfsvc.exe > BitLockerDiscoveryVolumeContents Boot bootstat.dat Branding CbsTemp > Containers CSC Cursors debug diagnostics DiagTrack DigitalLocker Downloaded > Program Files DtcInstall.log ELAMBKUP en-US explorer.exe Fonts > GameBarPresenceWriter gethelp_audiotroubleshooter_latestpackage.zip > Globalization Help HelpPane.exe hh.exe hipiw.dll IdentityCRL > ImageSAFERSvc.exe IME IMGSF50Svc.exe ImmersiveControlPanel INF InputMethod > Installer ko-KR L2Schemas LanguageOverlayCache LiveKernelReports Logs > lsasetup.log Media mib.bin Microsoft.NET Migration ModemLogs notepad.exe > OCR Offline Web Pages Panther Performance PFRO.log PLA PolicyDefinitions > Prefetch PrintDialog Professional.xml Provisioning > regedit.exe Registration RemotePackages rescache Resources RtlExUpd.dll > SchCache schemas security ServiceProfiles ServiceState servicing Setup > setupact.log setuperr.log ShellComponents ShellExperiences SHELLNEW SKB > SoftwareDistribution Speech Speech_OneCore splwow64. > exe System system.ini System32 SystemApps SystemResources SystemTemp > SysWOW64 TAPI Tasks Temp TempInst tracing twain_32 twain_32.dll Vss WaaS > Web win.ini WindowsShell.Manifest WindowsUpdate.log winhlp32.exe WinSxS > WMSysPr9.prx write.exe > /cygdrive/c/Windows/System32/Wbem: aeinv.mof AgentWmi.mof > AgentWmiUninstall.mof appbackgroundtask.dll appbackgroundtask.mof > appbackgroundtask_uninstall.mof AuditRsop.mof authfwcfg.mof AutoRecover > bcd.mof BthMtpEnum.mof cimdmtf.mof cimwin32.dll cimwin32.mof CIWm > i.mof classlog.mof cli.mof cliegaliases.mof ddp.mof dimsjob.mof > dimsroam.mof DMWmiBridgeProv.dll DMWmiBridgeProv.mof DMWmiBridgeProv1.dll > DMWmiBridgeProv1.mof DMWmiBridgeProv1_Uninstall.mof > DMWmiBridgeProv_Uninstall.mof dnsclientcim.dll dnsclientcim.mof > dnsclientpspr > ovider.dll dnsclientpsprovider.mof dnsclientpsprovider_Uninstall.mof > drvinst.mof DscCore.mof DscCoreConfProv.mof dscproxy.mof Dscpspluginwkr.dll > DscTimer.mof dsprov.dll dsprov.mof eaimeapi.mof EmbeddedLockdownWmi.dll > embeddedlockdownwmi.mof embeddedlockdownwmi_Uninst > all.mof en en-US esscli.dll EventTracingManagement.dll > EventTracingManagement.mof fastprox.dll fdPHost.mof fdrespub.mof fdSSDP.mof > fdWNet.mof fdWSD.mof filetrace.mof firewallapi.mof > FolderRedirectionWMIProvider.mof FunDisc.mof fwcfg.mof hbaapi.mof > hnetcfg.mof IMAPIv2 > -Base.mof IMAPIv2-FileSystemSupport.mof IMAPIv2-LegacyShim.mof interop.mof > IpmiDTrc.mof ipmiprr.dll ipmiprv.dll ipmiprv.mof IpmiPTrc.mof ipsecsvc.mof > iscsidsc.mof iscsihba.mof iscsiprf.mof iscsirem.mof iscsiwmiv2.mof > iscsiwmiv2_uninstall.mof kerberos.mof ko ko-KR Krn > lProv.dll krnlprov.mof L2SecHC.mof lltdio.mof lltdsvc.mof Logs lsasrv.mof > mblctr.mof MDMAppProv.dll MDMAppProv.mof MDMAppProv_Uninstall.mof > MDMSettingsProv.dll MDMSettingsProv.mof MDMSettingsProv_Uninstall.mof > Microsoft-Windows-OfflineFiles.mof Microsoft-Windows-Remo > 
te-FileSystem.mof Microsoft.AppV.AppVClientWmi.dll > Microsoft.AppV.AppVClientWmi.mof Microsoft.Uev.AgentWmi.dll > Microsoft.Uev.ManagedAgentWmi.mof > Microsoft.Uev.ManagedAgentWmiUninstall.mof mispace.mof > mispace_uninstall.mof mmc.mof MMFUtil.dll MOF mofcomp.exe mofd.dll > mofinstall.dll mountmgr.mof mpeval.mof mpsdrv.mof mpssvc.mof msdtcwmi.dll > MsDtcWmi.mof msfeeds.mof msfeedsbs.mof msi.mof msiprov.dll msiscsi.mof > MsNetImPlatform.mof mstsc.mof mstscax.mof msv1_0.mof mswmdm.mof NCProv.dll > ncprov.mof ncsi.mof ndisimplatcim.dll ndistrace > .mof NetAdapterCim.dll NetAdapterCim.mof NetAdapterCimTrace.mof > NetAdapterCimTraceUninstall.mof NetAdapterCim_uninstall.mof netdacim.dll > netdacim.mof netdacim_uninstall.mof NetEventPacketCapture.dll > NetEventPacketCapture.mof NetEventPacketCapture_uninstall.mof netncc > im.dll netnccim.mof netnccim_uninstall.mof NetPeerDistCim.dll > NetPeerDistCim.mof NetPeerDistCim_uninstall.mof netprofm.mof > NetSwitchTeam.mof netswitchteamcim.dll NetTCPIP.dll NetTCPIP.mof > NetTCPIP_Uninstall.mof netttcim.dll netttcim.mof netttcim_uninstall.mof > network > itemfactory.mof newdev.mof nlasvc.mof nlmcim.dll nlmcim.mof > nlmcim_uninstall.mof nlsvc.mof npivwmi.mof nshipsec.mof ntevt.dll ntevt.mof > ntfs.mof OfflineFilesConfigurationWmiProvider.mof > OfflineFilesConfigurationWmiProvider_Uninstall.mof > OfflineFilesWmiProvider.mof Of > flineFilesWmiProvider_Uninstall.mof p2p-mesh.mof p2p-pnrp.mof > pcsvDevice.mof pcsvDevice_Uninstall.mof Performance PNPXAssoc.mof > PolicMan.dll PolicMan.mof polproc.mof polprocl.mof polprou.mof polstore.mof > portabledeviceapi.mof portabledeviceclassextension.mof portable > deviceconnectapi.mof portabledevicetypes.mof portabledevicewiacompat.mof > powermeterprovider.mof PowerPolicyProvider.mof ppcRsopCompSchema.mof > ppcRsopUserSchema.mof PrintFilterPipelineSvc.mof > PrintManagementProvider.dll PrintManagementProvider.mof > PrintManagementProvider_Uninstall.mof profileassociationprovider.mof > PS_MMAgent.mof qmgr.mof qoswmi.dll qoswmi.mof qoswmitrc.mof > qoswmitrc_uninstall.mof qoswmi_uninstall.mof RacWmiProv.dll RacWmiProv.mof > rawxml.xsl rdpendp.mof rdpinit.mof rdpshell.mof refs.mof refsv1.mof > regevent.mof Remove.Microsoft.AppV.AppvClientWmi.mof repdrvfs.dll > Repository rsop.mof rspndr.mof samsrv.mof scersop.mof schannel.mof > schedprov.dll SchedProv.mof scm.mof scrcons.exe scrcons.mof sdbus.mof > secrcw32.mof SensorsClassExtension.mof ServDeps.dll ServiceModel.mof > ServiceModel.mof.uninstall ServiceModel35.mof ServiceModel35.mof.uninstall > services.mof setupapi.mof SmbWitnessWmiv2Provider.mof smbwmiv2.mof > SMTPCons.dll smtpcons.mof sppwmi.mof sr.mof sstpsvc.mof stdprov.dll > storagewmi.mof storagewmi_passthru.mof storagewmi_passthru_uninstall.mof > storagewmi_uninstall.mof stortrace.mof subscrpt.mof system.mof tcpip.mof > texttable.xsl textvaluelist.xsl tmf tsallow.mof tscfgwmi.mof tsmf.mof > tspkg.mof umb.mof umbus.mof umpass.mof umpnpmgr.mof unsecapp.exe > UserProfileConfigurationWmiProvider.mof UserProfileWmiProvider.mof > UserStateWMIProvider.mof vds.mof vdswmi.dll viewprov.dll > vpnclientpsprovider.dll vpnclientpsprovider.mof > vpnclientpsprovider_Uninstall.mof vss.mof vsswmi.dll wbemcntl.dll > wbemcons.dll WBEMCons.mof wbemcore.dll wbemdisp.dll wbemdisp.tlb > wbemess.dll wbemprox.dll wbemsvc.dll wbemtest.exe wcncsvc.mof > WdacEtwProv.mof WdacWmiProv.dll WdacWmiProv.mof WdacWmiProv_Uninstall.mof > Wdf01000.mof Wdf01000Uninstall.mof wdigest.mof WFAPIGP.mof wfascim.dll > wfascim.mof wfascim_uninstall.mof WFP.MOF 
wfs.mof whqlprov.mof > Win32_DeviceGuard.mof Win32_EncryptableVolume.dll > win32_encryptablevolume.mof Win32_EncryptableVolumeUninstall.mof > win32_printer.mof Win32_Tpm.dll Win32_Tpm.mof wininit.mof winipsec.mof > winlogon.mof WinMgmt.exe WinMgmtR.dll Winsat.mof WinsatUninstall.mof > wlan.mof WLanHC.mof wmi.mof WMIADAP.exe WmiApRes.dll WmiApRpl.dll > WmiApSrv.exe WMIC.exe WMICOOKR.dll WmiDcPrv.dll wmipcima.dll wmipcima.mof > wmipdfs.dll wmipdfs.mof wmipdskq.dll wmipdskq.mof WmiPerfClass.dll > WmiPerfClass.mof WmiPerfInst.dll WmiPerfInst.mof WMIPICMP.dll wmipicmp.mof > WMIPIPRT.dll wmipiprt.mof WMIPJOBJ.dll wmipjobj.mof wmiprov.dll > WmiPrvSD.dll WmiPrvSE.exe WMIPSESS.dll wmipsess.mof WMIsvc.dll wmitimep.dll > wmitimep.mof wmiutils.dll WMI_Tracing.mof wmp.mof wmpnetwk.mof > wpdbusenum.mof wpdcomp.mof wpdfs.mof wpdmtp.mof wpdshext.mof > WPDShServiceObj.mof wpdsp.mof wpd_ci.mof wscenter.mof WsmAgent.mof > WsmAgentUninstall.mof WsmAuto.mof wsp_fs.mof wsp_fs_uninstall.mof > wsp_health.mof wsp_health_uninstall.mof wsp_sr.mof wsp_sr_uninstall.mof > WUDFx.mof Wudfx02000.mof Wudfx02000Uninstall.mof WUDFxUninstall.mof xml > xsl-mappings.xml xwizards.mof > /cygdrive/c/Windows/System32/WindowsPowerShell/v1.0: > Certificate.format.ps1xml Diagnostics.Format.ps1xml > DotNetTypes.format.ps1xml en en-US Event.Format.ps1xml Examples > FileSystem.format.ps1xml getevent.types.ps1xml Help.format.ps1xml > HelpV3.format.ps1xml ko ko-KR Modules powershell.exe powershell.exe.config > PowerShellCore.format.ps1xml PowerShellTrace.format.ps1xml > powershell_ise.exe powershell_ise.exe.config PSEvents.dll pspluginwkr.dll > pwrshmsg.dll pwrshsip.dll Registry.format.ps1xml Schemas SessionConfig > types.ps1xml typesv3.ps1xml WSMan.Format.ps1xml > /cygdrive/c/Windows/System32/OpenSSH: scp.exe sftp.exe ssh-add.exe > ssh-agent.exe ssh-keygen.exe ssh-keyscan.exe ssh.exe > /cygdrive/c/Program Files/MATLAB/R2020b/bin: crash_analyzer.cfg > icutzdata lcdata.xml lcdata.xsd lcdata_utf8.xml m3iregistry matlab.exe > mex.bat mexext.bat util win32 win64 > /cygdrive/c/Program Files/Microsoft SQL Server/130/Tools/Binn: > Resources SqlLocalDB.exe > /cygdrive/c/Program Files/Microsoft SQL Server/Client > SDK/ODBC/170/Tools/Binn: batchparser.dll bcp.exe Resources SQLCMD.EXE > xmlrw.dll > /cygdrive/c/Program Files/Git/cmd: git-gui.exe git-lfs.exe git.exe > gitk.exe start-ssh-agent.cmd start-ssh-pageant.cmd > Warning accessing /cygdrive/c/msys64/mingw64/bin gives errors: > [Errno 2] No such file or directory: '/cygdrive/c/msys64/mingw64/bin' > Warning accessing /cygdrive/c/msys64/usr/bin gives errors: [Errno 2] > No such file or directory: '/cygdrive/c/msys64/usr/bin' > /cygdrive/c/Program Files (x86)/Microsoft Visual > Studio/2019/Community/VC/Tools/MSVC/14.29.30133/bin/Hostx64/x64: 1033 > asan_blacklist.txt atlprov.dll bscmake.exe c1.dll c1xx.dll c2.dll > cfgpersist.dll cl.exe cl.exe.config clang_rt.asan_dbg_dynamic-x86_64.dll > clang_rt.asan_dynamic-x86_64.dll ConcurrencyCheck.dll CppBuildInsights.dll > CppBuildInsightsEtw.xml CppCoreCheck.dll cvtres.exe d3dcompiler_47.dll > dpcmi.dll dumpbin.exe editbin.exe EnumIndex.dll EspXEngine.dll > HResultCheck.dll KernelTraceControl.dll lib.exe link.exe link.exe.config > llvm-symbolizer.exe LocalESPC.dll > Microsoft.Diagnostics.Tracing.EventSource.dll > Microsoft.VisualStudio.RemoteControl.dll > Microsoft.VisualStudio.Telemetry.dll > Microsoft.VisualStudio.Utilities.Internal.dll ml64.exe msobj140.dll > mspdb140.dll mspdbcmf.exe mspdbcore.dll mspdbsrv.exe mspdbst.dll > mspft140.dll msvcdis140.dll 
msvcp140.dll msvcp140_1.dll msvcp140_2.dll > msvcp140_atomic_wait.dll msvcp140_codecvt_ids.dll Newtonsoft.Json.dll > nmake.exe onecore perf_msvcbuildinsights.dll pgocvt.exe pgodb140.dll > pgodriver.sys pgomgr.exe pgort140.dll pgosweep.exe > System.Runtime.CompilerServices.Unsafe.dll tbbmalloc.dll undname.exe > VariantClear.dll vcmeta.dll vcperf.exe vcruntime140.dll vcruntime140_1.dll > vctip.exe xdcmake.exe xdcmake.exe.config > /cygdrive/c/Program Files/dotnet: dotnet.exe host LICENSE.txt packs > sdk shared templates ThirdPartyNotices.txt > /: bin Cygwin-Terminal.ico Cygwin.bat Cygwin.ico dev etc home lib > mpich-4.0.2 mpich-4.0.2.tar.gz sbin tmp usr var proc cygdrive > /cygdrive/c/Users/SEJONG/AppData/Local/Microsoft/WindowsApps: Backup > GameBarElevatedFT_Alias.exe Microsoft.DesktopAppInstaller_8wekyb3d8bbwe > Microsoft.MicrosoftEdge_8wekyb3d8bbwe Microsoft.SkypeApp_kzf8qxf38zg5c > Microsoft.XboxGamingOverlay_8wekyb3d8bbwe MicrosoftEdge.exe python.exe > python3.exe Skype.exe winget.exe > /cygdrive/c/Users/SEJONG/AppData/Local/Programs/Microsoft VS > Code/bin: code code.cmd > /cygdrive/c/Program Files (x86)/Microsoft Visual > Studio/2019/Community/VC/Tools/MSVC/14.29.30133/bin/Hostx64/x64: 1033 > asan_blacklist.txt atlprov.dll bscmake.exe c1.dll c1xx.dll c2.dll > cfgpersist.dll cl.exe cl.exe.config clang_rt.asan_dbg_dynamic-x86_64.dll > clang_rt.asan_dynamic-x86_64.dll ConcurrencyCheck.dll CppBuildInsights.dll > CppBuildInsightsEtw.xml CppCoreCheck.dll cvtres.exe d3dcompiler_47.dll > dpcmi.dll dumpbin.exe editbin.exe EnumIndex.dll EspXEngine.dll > HResultCheck.dll KernelTraceControl.dll lib.exe link.exe link.exe.config > llvm-symbolizer.exe LocalESPC.dll > Microsoft.Diagnostics.Tracing.EventSource.dll > Microsoft.VisualStudio.RemoteControl.dll > Microsoft.VisualStudio.Telemetry.dll > Microsoft.VisualStudio.Utilities.Internal.dll ml64.exe msobj140.dll > mspdb140.dll mspdbcmf.exe mspdbcore.dll mspdbsrv.exe mspdbst.dll > mspft140.dll msvcdis140.dll msvcp140.dll msvcp140_1.dll msvcp140_2.dll > msvcp140_atomic_wait.dll msvcp140_codecvt_ids.dll Newtonsoft.Json.dll > nmake.exe onecore perf_msvcbuildinsights.dll pgocvt.exe pgodb140.dll > pgodriver.sys pgomgr.exe pgort140.dll pgosweep.exe > System.Runtime.CompilerServices.Unsafe.dll tbbmalloc.dll undname.exe > VariantClear.dll vcmeta.dll vcperf.exe vcruntime140.dll vcruntime140_1.dll > vctip.exe xdcmake.exe xdcmake.exe.config > Warning accessing /cygdrive/c/Users/SEJONG/.dotnet/tools gives > errors: [Errno 2] No such file or directory: > '/cygdrive/c/Users/SEJONG/.dotnet/tools' > /usr/lib/lapack: cygblas-0.dll cyglapack-0.dll > > ============================================================================================= > TESTING: configureExternalPackagesDir from > config.framework(/home/SEJONG/petsc-3.18.1/config/BuildSystem/config/framework.py:1045) > Set alternative directory external packages are built in > serialEvaluation: initial cxxDialectRanges ('c++11', 'c++17') > serialEvaluation: new cxxDialectRanges ('c++11', 'c++17') > child config.utilities.macosFirewall took 0.000005 seconds > > ============================================================================================= > TESTING: configureDebuggers from > config.utilities.debuggers(/home/SEJONG/petsc-3.18.1/config/BuildSystem/config/utilities/debuggers.py:20) > Find a default debugger and determine its arguments > Checking for program /usr/local/bin/gdb...not found > Checking for program /usr/bin/gdb...not found > Checking for program /cygdrive/c/SIMULIA/Commands/gdb...not 
found > Checking for program /cygdrive/c/Program Files/Microsoft > MPI/Bin/gdb...not found > Checking for program /cygdrive/c/Windows/system32/gdb...not found > Checking for program /cygdrive/c/Windows/gdb...not found > Checking for program /cygdrive/c/Windows/System32/Wbem/gdb...not found > Checking for program > /cygdrive/c/Windows/System32/WindowsPowerShell/v1.0/gdb...not found > Checking for program /cygdrive/c/Windows/System32/OpenSSH/gdb...not > found > Checking for program /cygdrive/c/Program > Files/MATLAB/R2020b/bin/gdb...not found > Checking for program /cygdrive/c/Program Files/Microsoft SQL > Server/130/Tools/Binn/gdb...not found > Checking for program /cygdrive/c/Program Files/Microsoft SQL > Server/Client SDK/ODBC/170/Tools/Binn/gdb...not found > Checking for program /cygdrive/c/Program Files/Git/cmd/gdb...not found > Checking for program /cygdrive/c/msys64/mingw64/bin/gdb...not found > Checking for program /cygdrive/c/msys64/usr/bin/gdb...not found > Checking for program /cygdrive/c/Program Files (x86)/Microsoft Visual > Studio/2019/Community/VC/Tools/MSVC/14.29.30133/bin/Hostx64/x64/gdb...not > found > Checking for program /cygdrive/c/Program Files/dotnet/gdb...not found > Checking for program /gdb...not found > Checking for program > /cygdrive/c/Users/SEJONG/AppData/Local/Microsoft/WindowsApps/gdb...not found > Checking for program > /cygdrive/c/Users/SEJONG/AppData/Local/Programs/Microsoft VS > Code/bin/gdb...not found > Checking for program /cygdrive/c/Program Files (x86)/Microsoft Visual > Studio/2019/Community/VC/Tools/MSVC/14.29.30133/bin/Hostx64/x64/gdb...not > found > Checking for program /cygdrive/c/Users/SEJONG/.dotnet/tools/gdb...not > found > Checking for program /usr/lib/lapack/gdb...not found > Checking for program /usr/local/bin/dbx...not found > Checking for program /usr/bin/dbx...not found > Checking for program /cygdrive/c/SIMULIA/Commands/dbx...not found > Checking for program /cygdrive/c/Program Files/Microsoft > MPI/Bin/dbx...not found > Checking for program /cygdrive/c/Windows/system32/dbx...not found > Checking for program /cygdrive/c/Windows/dbx...not found > Checking for program /cygdrive/c/Windows/System32/Wbem/dbx...not found > Checking for program > /cygdrive/c/Windows/System32/WindowsPowerShell/v1.0/dbx...not found > Checking for program /cygdrive/c/Windows/System32/OpenSSH/dbx...not > found > Checking for program /cygdrive/c/Program > Files/MATLAB/R2020b/bin/dbx...not found > Checking for program /cygdrive/c/Program Files/Microsoft SQL > Server/130/Tools/Binn/dbx...not found > Checking for program /cygdrive/c/Program Files/Microsoft SQL > Server/Client SDK/ODBC/170/Tools/Binn/dbx...not found > Checking for program /cygdrive/c/Program Files/Git/cmd/dbx...not found > Checking for program /cygdrive/c/msys64/mingw64/bin/dbx...not found > Checking for program /cygdrive/c/msys64/usr/bin/dbx...not found > Checking for program /cygdrive/c/Program Files (x86)/Microsoft Visual > Studio/2019/Community/VC/Tools/MSVC/14.29.30133/bin/Hostx64/x64/dbx...not > found > Checking for program /cygdrive/c/Program Files/dotnet/dbx...not found > Checking for program /dbx...not found > Checking for program > /cygdrive/c/Users/SEJONG/AppData/Local/Microsoft/WindowsApps/dbx...not found > Checking for program > /cygdrive/c/Users/SEJONG/AppData/Local/Programs/Microsoft VS > Code/bin/dbx...not found > Checking for program /cygdrive/c/Program Files (x86)/Microsoft Visual > Studio/2019/Community/VC/Tools/MSVC/14.29.30133/bin/Hostx64/x64/dbx...not > found > Checking for 
program /cygdrive/c/Users/SEJONG/.dotnet/tools/dbx...not > found > Checking for program /usr/lib/lapack/dbx...not found > Defined make macro "DSYMUTIL" to "true" > child config.utilities.debuggers took 0.014310 seconds > > ============================================================================================= > TESTING: configureDirectories from > PETSc.options.petscdir(/home/SEJONG/petsc-3.18.1/config/PETSc/options/petscdir.py:22) > Checks PETSC_DIR and sets if not set > PETSC_VERSION_RELEASE of 1 indicates the code is from a release > branch or a branch created from a release branch. > Version Information: > #define PETSC_VERSION_RELEASE 1 > #define PETSC_VERSION_MAJOR 3 > #define PETSC_VERSION_MINOR 18 > #define PETSC_VERSION_SUBMINOR 1 > #define PETSC_VERSION_DATE "Oct 26, 2022" > #define PETSC_VERSION_GIT "v3.18.1" > #define PETSC_VERSION_DATE_GIT "2022-10-26 07:57:29 -0500" > #define PETSC_VERSION_EQ(MAJOR,MINOR,SUBMINOR) \ > #define PETSC_VERSION_ PETSC_VERSION_EQ > #define PETSC_VERSION_LT(MAJOR,MINOR,SUBMINOR) \ > #define PETSC_VERSION_LE(MAJOR,MINOR,SUBMINOR) \ > #define PETSC_VERSION_GT(MAJOR,MINOR,SUBMINOR) \ > #define PETSC_VERSION_GE(MAJOR,MINOR,SUBMINOR) \ > child PETSc.options.petscdir took 0.015510 seconds > > ============================================================================================= > TESTING: getDatafilespath from > PETSc.options.dataFilesPath(/home/SEJONG/petsc-3.18.1/config/PETSc/options/dataFilesPath.py:29) > Checks what DATAFILESPATH should be > child PETSc.options.dataFilesPath took 0.002462 seconds > > ============================================================================================= > TESTING: configureGit from > config.sourceControl(/home/SEJONG/petsc-3.18.1/config/BuildSystem/config/sourceControl.py:24) > Find the Git executable > Checking for program /usr/local/bin/git...not found > Checking for program /usr/bin/git...found > Defined make macro "GIT" to "git" > Executing: git --version > stdout: git version 2.38.1 > > ============================================================================================= > TESTING: configureMercurial from > config.sourceControl(/home/SEJONG/petsc-3.18.1/config/BuildSystem/config/sourceControl.py:35) > Find the Mercurial executable > Checking for program /usr/local/bin/hg...not found > Checking for program /usr/bin/hg...not found > Checking for program /cygdrive/c/SIMULIA/Commands/hg...not found > Checking for program /cygdrive/c/Program Files/Microsoft > MPI/Bin/hg...not found > Checking for program /cygdrive/c/Windows/system32/hg...not found > Checking for program /cygdrive/c/Windows/hg...not found > Checking for program /cygdrive/c/Windows/System32/Wbem/hg...not found > Checking for program > /cygdrive/c/Windows/System32/WindowsPowerShell/v1.0/hg...not found > Checking for program /cygdrive/c/Windows/System32/OpenSSH/hg...not > found > Checking for program /cygdrive/c/Program > Files/MATLAB/R2020b/bin/hg...not found > Checking for program /cygdrive/c/Program Files/Microsoft SQL > Server/130/Tools/Binn/hg...not found > Checking for program /cygdrive/c/Program Files/Microsoft SQL > Server/Client SDK/ODBC/170/Tools/Binn/hg...not found > Checking for program /cygdrive/c/Program Files/Git/cmd/hg...not found > Checking for program /cygdrive/c/msys64/mingw64/bin/hg...not found > Checking for program /cygdrive/c/msys64/usr/bin/hg...not found > Checking for program /cygdrive/c/Program Files (x86)/Microsoft Visual > Studio/2019/Community/VC/Tools/MSVC/14.29.30133/bin/Hostx64/x64/hg...not > 
found > Checking for program /cygdrive/c/Program Files/dotnet/hg...not found > Checking for program /hg...not found > Checking for program > /cygdrive/c/Users/SEJONG/AppData/Local/Microsoft/WindowsApps/hg...not found > Checking for program > /cygdrive/c/Users/SEJONG/AppData/Local/Programs/Microsoft VS > Code/bin/hg...not found > Checking for program /cygdrive/c/Program Files (x86)/Microsoft Visual > Studio/2019/Community/VC/Tools/MSVC/14.29.30133/bin/Hostx64/x64/hg...not > found > Checking for program /cygdrive/c/Users/SEJONG/.dotnet/tools/hg...not > found > Checking for program /usr/lib/lapack/hg...not found > Checking for program > /home/SEJONG/petsc-3.18.1/lib/petsc/bin/win32fe/hg...not found > child config.sourceControl took 0.121914 seconds > > ============================================================================================= > TESTING: configureInstallationMethod from > PETSc.options.petscclone(/home/SEJONG/petsc-3.18.1/config/PETSc/options/petscclone.py:20) > Determine if PETSc was obtained via git or a tarball > This is a tarball installation > child PETSc.options.petscclone took 0.003125 seconds > > ============================================================================================= > TESTING: setNativeArchitecture from > PETSc.options.arch(/home/SEJONG/petsc-3.18.1/config/PETSc/options/arch.py:29) > Forms the arch as GNU's configure would form it > > ============================================================================================= > TESTING: configureArchitecture from > PETSc.options.arch(/home/SEJONG/petsc-3.18.1/config/PETSc/options/arch.py:42) > Checks if PETSC_ARCH is set and sets it if not set > No previous hashfile found > Setting hashfile: > arch-mswin-c-debug/lib/petsc/conf/configure-hash > Deleting configure hash file: > arch-mswin-c-debug/lib/petsc/conf/configure-hash > Unable to delete configure hash file: > arch-mswin-c-debug/lib/petsc/conf/configure-hash > child PETSc.options.arch took 0.149094 seconds > > ============================================================================================= > TESTING: setInstallDir from > PETSc.options.installDir(/home/SEJONG/petsc-3.18.1/config/PETSc/options/installDir.py:31) > Set installDir to either prefix or if that is not set to > PETSC_DIR/PETSC_ARCH > Defined make macro "PREFIXDIR" to > "/home/SEJONG/petsc-3.18.1/arch-mswin-c-debug" > > ============================================================================================= > TESTING: saveReconfigure from > PETSc.options.installDir(/home/SEJONG/petsc-3.18.1/config/PETSc/options/installDir.py:76) > Save the configure options in a script in PETSC_ARCH/lib/petsc/conf so > the same configure may be easily re-run > > ============================================================================================= > TESTING: cleanConfDir from > PETSc.options.installDir(/home/SEJONG/petsc-3.18.1/config/PETSc/options/installDir.py:68) > Remove all the files from configuration directory for this PETSC_ARCH, > from --with-clean option > > ============================================================================================= > TESTING: configureInstallDir from > PETSc.options.installDir(/home/SEJONG/petsc-3.18.1/config/PETSc/options/installDir.py:52) > Makes installDir subdirectories if it does not exist for both prefix > install location and PETSc work install location > Changed persistence directory to > /home/SEJONG/petsc-3.18.1/arch-mswin-c-debug/lib/petsc/conf > > TESTING: restoreReconfigure from > 
PETSc.options.installDir(/home/SEJONG/petsc-3.18.1/config/PETSc/options/installDir.py:90) > If --with-clean was requested but restoring the reconfigure file was > requested then restore it > child PETSc.options.installDir took 0.006476 seconds > > ============================================================================================= > TESTING: setExternalPackagesDir from > PETSc.options.externalpackagesdir(/home/SEJONG/petsc-3.18.1/config/PETSc/options/externalpackagesdir.py:15) > Set location where external packages will be downloaded to > > ============================================================================================= > TESTING: cleanExternalpackagesDir from > PETSc.options.externalpackagesdir(/home/SEJONG/petsc-3.18.1/config/PETSc/options/externalpackagesdir.py:23) > Remove all downloaded external packages, from --with-clean > child PETSc.options.externalpackagesdir took 0.000990 seconds > > ============================================================================================= > TESTING: configureCLanguage from > PETSc.options.languages(/home/SEJONG/petsc-3.18.1/config/PETSc/options/languages.py:28) > Choose whether to compile the PETSc library using a C or C++ compiler > C language is C > Defined "CLANGUAGE_C" to "1" > Defined make macro "CLANGUAGE" to "C" > child PETSc.options.languages took 0.003172 seconds > > ============================================================================================= > TESTING: resetEnvCompilers from > config.setCompilers(/home/SEJONG/petsc-3.18.1/config/BuildSystem/config/setCompilers.py:2652) > Remove compilers from the shell environment so they do not interfer with > testing > > ============================================================================================= > TESTING: checkEnvCompilers from > config.setCompilers(/home/SEJONG/petsc-3.18.1/config/BuildSystem/config/setCompilers.py:2669) > Set configure compilers from the environment, from > -with-environment-variables > > ============================================================================================= > TESTING: checkMPICompilerOverride from > config.setCompilers(/home/SEJONG/petsc-3.18.1/config/BuildSystem/config/setCompilers.py:2622) > Check if --with-mpi-dir is used along with CC CXX or FC compiler options. 
> This usually prevents mpi compilers from being used - so issue a > warning > > ============================================================================================= > TESTING: requireMpiLdPath from > config.setCompilers(/home/SEJONG/petsc-3.18.1/config/BuildSystem/config/setCompilers.py:2643) > OpenMPI wrappers require LD_LIBRARY_PATH set > > ============================================================================================= > TESTING: checkInitialFlags from > config.setCompilers(/home/SEJONG/petsc-3.18.1/config/BuildSystem/config/setCompilers.py:723) > Initialize the compiler and linker flags > Initialized CFLAGS to > Initialized CFLAGS to > Initialized LDFLAGS to > Initialized CUDAFLAGS to > Initialized CUDAFLAGS to > Initialized LDFLAGS to > Initialized HIPFLAGS to > Initialized HIPFLAGS to > Initialized LDFLAGS to > Initialized SYCLFLAGS to > Initialized SYCLFLAGS to > Initialized LDFLAGS to > Initialized CXXFLAGS to > Initialized CXX_CXXFLAGS to > Initialized LDFLAGS to > Initialized FFLAGS to > Initialized FFLAGS to > Initialized LDFLAGS to > Initialized CPPFLAGS to > Initialized FPPFLAGS to > Initialized CUDAPPFLAGS to -Wno-deprecated-gpu-targets > Initialized CXXPPFLAGS to > Initialized HIPPPFLAGS to > Initialized SYCLPPFLAGS to > Initialized CC_LINKER_FLAGS to [] > Initialized CXX_LINKER_FLAGS to [] > Initialized FC_LINKER_FLAGS to [] > Initialized CUDAC_LINKER_FLAGS to [] > Initialized HIPC_LINKER_FLAGS to [] > Initialized SYCLC_LINKER_FLAGS to [] > > TESTING: checkCCompiler from > config.setCompilers(/home/SEJONG/petsc-3.18.1/config/BuildSystem/config/setCompilers.py:1341) > Locate a functional C compiler > Checking for program /usr/local/bin/mpicc...not found > Checking for program /usr/bin/mpicc...found > Defined make macro "CC" to "mpicc" > Executing: mpicc -c -o /tmp/petsc-uqt11yqc/config.setCompilers/conftest.o > -I/tmp/petsc-uqt11yqc/config.setCompilers > /tmp/petsc-uqt11yqc/config.setCompilers/conftest.c > Successful compile: > Source: > #include "confdefs.h" > #include "conffix.h" > > int main() { > ; > return 0; > } > > Executing: mpicc -c -o /tmp/petsc-uqt11yqc/config.setCompilers/conftest.o > -I/tmp/petsc-uqt11yqc/config.setCompilers > /tmp/petsc-uqt11yqc/config.setCompilers/conftest.c > Successful compile: > Source: > #include "confdefs.h" > #include "conffix.h" > > int main() { > ; > return 0; > } > > Executing: mpicc -o /tmp/petsc-uqt11yqc/config.setCompilers/conftest.exe > /tmp/petsc-uqt11yqc/config.setCompilers/conftest.o > Possible ERROR while running linker: exit code 1 > stderr: > /usr/lib/gcc/x86_64-pc-cygwin/11/../../../../x86_64-pc-cygwin/bin/ld: > cannot find -lhwloc: No such file or directory > /usr/lib/gcc/x86_64-pc-cygwin/11/../../../../x86_64-pc-cygwin/bin/ld: > cannot find -levent_core: No such file or directory > /usr/lib/gcc/x86_64-pc-cygwin/11/../../../../x86_64-pc-cygwin/bin/ld: > cannot find -levent_pthreads: No such file or directory > /usr/lib/gcc/x86_64-pc-cygwin/11/../../../../x86_64-pc-cygwin/bin/ld: > cannot find -lz: No such file or directory > collect2: error: ld returned 1 exit status > Linker output before filtering: > > /usr/lib/gcc/x86_64-pc-cygwin/11/../../../../x86_64-pc-cygwin/bin/ld: > cannot find -lhwloc: No such file or directory > /usr/lib/gcc/x86_64-pc-cygwin/11/../../../../x86_64-pc-cygwin/bin/ld: > cannot find -levent_core: No such file or directory > /usr/lib/gcc/x86_64-pc-cygwin/11/../../../../x86_64-pc-cygwin/bin/ld: > cannot find -levent_pthreads: No such file or directory > 
/usr/lib/gcc/x86_64-pc-cygwin/11/../../../../x86_64-pc-cygwin/bin/ld: > cannot find -lz: No such file or directory > collect2: error: ld returned 1 exit status > : > Linker output after filtering: > /usr/lib/gcc/x86_64-pc-cygwin/11/../../../../x86_64-pc-cygwin/bin/ld: > cannot find -lhwloc: No such file or directory > /usr/lib/gcc/x86_64-pc-cygwin/11/../../../../x86_64-pc-cygwin/bin/ld: > cannot find -levent_core: No such file or directory > /usr/lib/gcc/x86_64-pc-cygwin/11/../../../../x86_64-pc-cygwin/bin/ld: > cannot find -levent_pthreads: No such file or directory > /usr/lib/gcc/x86_64-pc-cygwin/11/../../../../x86_64-pc-cygwin/bin/ld: > cannot find -lz: No such file or directory > collect2: error: ld returned 1 exit status: > Error testing C compiler: Cannot compile/link C with mpicc. > MPI compiler wrapper mpicc failed to compile > Executing: mpicc -show > stdout: gcc -L/usr/lib -lmpi -lopen-rte -lopen-pal -lhwloc -levent_core > -levent_pthreads -lz > MPI compiler wrapper mpicc is likely incorrect. > Use --with-mpi-dir to indicate an alternate MPI. > Deleting "CC" > > ******************************************************************************* > UNABLE to CONFIGURE with GIVEN OPTIONS (see configure.log for > details): > > ------------------------------------------------------------------------------- > C compiler you provided with -with-cc=mpicc cannot be found or does not > work. > Cannot compile/link C with mpicc. > > ******************************************************************************* > File "/home/SEJONG/petsc-3.18.1/config/configure.py", line 461, in > petsc_configure > framework.configure(out = sys.stdout) > File "/home/SEJONG/petsc-3.18.1/config/BuildSystem/config/framework.py", > line 1412, in configure > self.processChildren() > File "/home/SEJONG/petsc-3.18.1/config/BuildSystem/config/framework.py", > line 1400, in processChildren > self.serialEvaluation(self.childGraph) > File "/home/SEJONG/petsc-3.18.1/config/BuildSystem/config/framework.py", > line 1375, in serialEvaluation > child.configure() > File > "/home/SEJONG/petsc-3.18.1/config/BuildSystem/config/setCompilers.py", line > 2712, in configure > self.executeTest(self.checkCCompiler) > File "/home/SEJONG/petsc-3.18.1/config/BuildSystem/config/base.py", line > 138, in executeTest > ret = test(*args,**kargs) > File > "/home/SEJONG/petsc-3.18.1/config/BuildSystem/config/setCompilers.py", line > 1346, in checkCCompiler > for compiler in self.generateCCompilerGuesses(): > File > "/home/SEJONG/petsc-3.18.1/config/BuildSystem/config/setCompilers.py", line > 1274, in generateCCompilerGuesses > raise RuntimeError('C compiler you provided with > -with-cc='+self.argDB['with-cc']+' cannot be found or does not > work.'+'\n'+self.mesg) > > ================================================================================ > Finishing configure run at Tue, 01 Nov 2022 13:06:09 +0900 > > -----Original Message----- > From: Satish Balay > Sent: Tuesday, November 1, 2022 11:36 AM > To: Mohammad Ali Yaqteen > Cc: petsc-users > Subject: RE: [petsc-users] PETSc Windows Installation > > you'll have to send configure.log for this failure > > Satish > > > On Tue, 1 Nov 2022, Mohammad Ali Yaqteen wrote: > > > I have checked the required Cygwin openmpi libraries and they are all > installed. 
> > When I run ./configure --with-cc=mpicc --with-cxx=mpicxx --with-fc=mpif90, it returns:
> >
> > $ ./configure --with-cc=mpicc --with-cxx=mpicxx --with-fc=mpif90
> > =============================================================================================
> >                       Configuring PETSc to compile on your system
> > =============================================================================================
> > TESTING: checkCCompiler from config.setCompilers(config/BuildSystem/config/setCompilers.py:1341)
> > *******************************************************************************
> >          UNABLE to CONFIGURE with GIVEN OPTIONS (see configure.log for details):
> > -------------------------------------------------------------------------------
> > C compiler you provided with -with-cc=mpicc cannot be found or does not work.
> > Cannot compile/link C with mpicc.
> >
> > As for the case of WSL2, I will try to install that on my PC.
> > Meanwhile, could you please look into this issue?
> >
> > Thank you
> >
> > Ali
> >
> > -----Original Message-----
> > From: Satish Balay
> > Sent: Monday, October 31, 2022 10:56 PM
> > To: Satish Balay via petsc-users
> > Cc: Matthew Knepley ; Mohammad Ali Yaqteen
> > Subject: Re: [petsc-users] PETSc Windows Installation
> >
> > BTW: If you have WSL2 on Windows - it might be easier to build/use PETSc.
> >
> > Satish
> >
> > On Mon, 31 Oct 2022, Satish Balay via petsc-users wrote:
> >
> > > Make sure you have cygwin openmpi installed [and cygwin blas/lapack]
> > >
> > > $ cygcheck -cd |grep openmpi
> > > libopenmpi-devel 4.1.2-1
> > > libopenmpi40 4.1.2-1
> > > libopenmpifh40 4.1.2-1
> > > libopenmpiusef08_40 4.1.2-1
> > > libopenmpiusetkr40 4.1.2-1
> > > openmpi 4.1.2-1
> > > $ cygcheck -cd |grep lapack
> > > liblapack-devel 3.10.1-1
> > > liblapack0 3.10.1-1
> > >
> > > > ./configure --with-cc=gcc --with-cxx=0 --with-fc=0 --download-f2cblaslapack
> > >
> > > Should be:
> > >
> > > $ ./configure --download-scalapack --download-mumps
> > >
> > > i.e. [default] --with-cc=mpicc --with-cxx=mpicxx --with-fc=mpif90 [and default cygwin blas/lapack]
> > >
> > > Satish
> > >
> > > On Mon, 31 Oct 2022, Matthew Knepley wrote:
> > >
> > > > On Mon, Oct 31, 2022 at 1:56 AM Mohammad Ali Yaqteen wrote:
> > > >
> > > > > Dear Satish
> > > > >
> > > > > When I configure PETSc with (./configure --with-cc=gcc --with-cxx=0
> > > > > --with-fc=0 --download-f2cblaslapack) it runs as I shared initially,
> > > > > which you said is not an issue anymore. But when I add
> > > > > (--download-scalapack --download-mumps) or configure with these
> > > > > later, it gives the following error:
> > > > >
> > > > > $ ./configure --download-scalapack --download-mumps
> > > > > =============================================================================================
> > > > >                       Configuring PETSc to compile on your system
> > > > > =============================================================================================
> > > > > TESTING: FortranMPICheck from config.packages.MPI(config/BuildSystem/config/packages/MPI.py:614)
> > > > > *******************************************************************************
> > > > >          UNABLE to CONFIGURE with GIVEN OPTIONS (see configure.log for details):
> > > > > -------------------------------------------------------------------------------
> > > > > Fortran error! mpi_init() could not be located!
> > > > > *******************************************************************************
> > > > >
> > > > > What could be the problem here?
> > > >
> > > > Without configure.log we cannot tell what went wrong. However, from
> > > > the error message, I would guess that your MPI was not built with
> > > > Fortran bindings. You need these for those packages.
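> > > >
> > > > For example, the Fortran side of Cygwin's OpenMPI ships in separate
> > > > packages, so one quick check - a sketch, using the package names from
> > > > the cygcheck listing above and assuming mpif90 accepts the same -show
> > > > flag that mpicc does - would be:
> > > >
> > > >   $ cygcheck -cd | grep -E 'openmpifh|openmpiusef'   # Fortran binding packages installed?
> > > >   $ mpif90 -show                                     # should print the underlying gfortran link line
> > > >
> > > > If those packages are missing, that alone would explain mpi_init() not
> > > > being located.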
> > > > Thanks,
> > > >
> > > >      Matt
> > > >
> > > > > Your help is highly appreciated.
> > > > >
> > > > > Thank you
> > > > > Ali
> > > > >
> > > > > -----Original Message-----
> > > > > From: Satish Balay
> > > > > Sent: Saturday, October 29, 2022 2:11 PM
> > > > > To: Mohammad Ali Yaqteen
> > > > > Cc: Matthew Knepley ; petsc-users at mcs.anl.gov
> > > > > Subject: Re: [petsc-users] PETSc Windows Installation
> > > > >
> > > > > On Sat, 29 Oct 2022, Mohammad Ali Yaqteen wrote:
> > > > >
> > > > > > I haven't accessed PETSc or given any command of my own. I was just
> > > > > > installing by following the instructions. I don't know why it is
> > > > > > attaching the debugger. Although it says "Possible error running C/C++
> > > > > > src/snes/tutorials/ex19 with 1 MPI process", which I think indicates
> > > > > > that MPI is missing!
> > > > >
> > > > > The diff is not smart enough to detect the extra message from
> > > > > cygwin/OpenMPI - hence it assumes there is a potential problem -
> > > > > and prints the above message.
> > > > >
> > > > > But you can assume it's installed properly - and use it.
> > > > >
> > > > > Satish
> > > > >
> > > > > > From: Matthew Knepley
> > > > > > Sent: Friday, October 28, 2022 10:31 PM
> > > > > > To: Mohammad Ali Yaqteen
> > > > > > Cc: petsc-users at mcs.anl.gov
> > > > > > Subject: Re: [petsc-users] PETSc Windows Installation
> > > > > >
> > > > > > On Fri, Oct 28, 2022 at 9:11 AM Mohammad Ali Yaqteen <mhyaqteen at sju.ac.kr> wrote:
> > > > > > Dear Sir,
> > > > > >
> > > > > > During the installation of PETSc in Windows, I installed Cygwin and the
> > > > > > required libraries as mentioned on your website:
> > > > > > [cid:image001.png at 01D8EB93.7C17E410]
> > > > > > However, when I install PETSc using the configure commands present on
> > > > > > the petsc website:
> > > > > >
> > > > > > ./configure --with-cc=gcc --with-cxx=0 --with-fc=0 --download-f2cblaslapack --download-mpich
> > > > > >
> > > > > > it gives me the following error:
> > > > > >
> > > > > > [cid:image002.png at 01D8EB93.7C17E410]
> > > > > >
> > > > > > I already installed OpenMPI using the Cygwin installer but it still asks
> > > > > > me to. When I configure without "--download-mpich" and run the "make check"
> > > > > > command, it gives me the following errors:
> > > > > >
> > > > > > [cid:image003.png at 01D8EB93.7C17E410]
> > > > > >
> > > > > > Could you kindly look into this and help me with it? Your prompt
> > > > > > response will be highly appreciated.
> > > > > >
> > > > > > The runs look fine.
> > > > > >
> > > > > > The test should not try to attach the debugger. Do you have that in the
> > > > > > PETSC_OPTIONS env variable?
> > > > > >
> > > > > > Thanks,
> > > > > >
> > > > > > Matt
> > > > > >
> > > > > > Thank you!
> > > > > > Mohammad Ali
> > > > > > Researcher, Sejong University
> > > > > >
> > > > > > --
> > > > > > What most experimenters take for granted before they begin their
> > > > > > experiments is infinitely more interesting than any results to which
> > > > > > their experiments lead.
> > > > > > -- Norbert Wiener
> > > > > >
> > > > > > https://www.cse.buffalo.edu/~knepley/ <http://www.cse.buffalo.edu/~knepley/>

--
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From mhyaqteen at sju.ac.kr  Mon Oct 31 23:36:03 2022
From: mhyaqteen at sju.ac.kr (Mohammad Ali Yaqteen)
Date: Tue, 1 Nov 2022 04:36:03 +0000
Subject: [petsc-users] PETSc Windows Installation
In-Reply-To: 
References: 
Message-ID: 

I installed the libraries, i.e. Cygwin OpenMPI, in its default folder. I didn't change anything. Now there is a folder C:\cygwin64\lib\openmpi\ which includes a file named "cygompi_dbg_msgq.dll".
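For reference, C:\cygwin64\lib\openmpi\ corresponds to /usr/lib/openmpi inside Cygwin. A quick way to confirm which package owns that file and that the OpenMPI packages are installed intact - a sketch, assuming cygcheck's -f and -c package-query options - would be:

  $ cygcheck -f /usr/lib/openmpi/cygompi_dbg_msgq.dll   # which Cygwin package provides this file
  $ cygcheck -c openmpi libopenmpi-devel                # check that the OpenMPI packages are installed and complete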
Thanks

Ali

From: Matthew Knepley
Sent: Tuesday, November 1, 2022 1:26 PM
To: Mohammad Ali Yaqteen
Cc: petsc-users
Subject: Re: [petsc-users] PETSc Windows Installation

On Tue, Nov 1, 2022 at 12:16 AM Mohammad Ali Yaqteen wrote:
I am unable to attach the configure.log file. Hence, I have copied the following text after executing the command (less configure.log) in the Cygwin64 terminal:

You can see at the end of the file that your "mpicc" does not work. The link is broken, possibly because you moved directories after you installed it.
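For example, a minimal way to see what the wrapper is trying to link and whether those libraries are actually present - a sketch, assuming Cygwin's OpenMPI wrapper and that the link libraries live under /usr/lib - would be:

  $ mpicc -show                                    # prints the underlying gcc line, here ending in -lhwloc -levent_core -levent_pthreads -lz
  $ ls /usr/lib/libhwloc* /usr/lib/libevent* /usr/lib/libz* 2>/dev/null
  $ cygcheck -cd | grep -E 'hwloc|libevent|zlib'   # are the corresponding Cygwin packages installed?

If the ls or the grep comes back empty, installing the matching Cygwin packages (or their -devel variants), or pointing configure at a different MPI with --with-mpi-dir, should get past the "cannot find -lhwloc / -levent_core / -levent_pthreads / -lz" errors shown in the log.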
Thanks, Matt Executing: uname -s stdout: CYGWIN_NT-10.0-19044 ============================================================================================= Configuring PETSc to compile on your system ============================================================================================= ================================================================================ ================================================================================ Starting configure run at Tue, 01 Nov 2022 13:06:06 +0900 Configure Options: --configModules=PETSc.Configure --optionsModule=config.compilerOptions --with-cc=mpicc --with-cxx=mpicxx --with-fc=mpif90 Working directory: /home/SEJONG/petsc-3.18.1 Machine platform: uname_result(system='CYGWIN_NT-10.0-19044', node='DESKTOP-R1C768B', release='3.3.6-341.x86_64', version='2022-09-05 11:15 UTC', machine='x86_64') Python version: 3.9.10 (main, Jan 20 2022, 21:37:52) [GCC 11.2.0] ================================================================================ Environmental variables USERDOMAIN=DESKTOP-R1C768B OS=Windows_NT COMMONPROGRAMFILES=C:\Program Files\Common Files PROCESSOR_LEVEL=6 PSModulePath=C:\Users\SEJONG\Documents\WindowsPowerShell\Modules;C:\Program Files\WindowsPowerShell\Modules;C:\Windows\system32\WindowsPowerShell\v1.0\Modules CommonProgramW6432=C:\Program Files\Common Files CommonProgramFiles(x86)=C:\Program Files (x86)\Common Files LANG=en_US.UTF-8 TZ=Asia/Seoul HOSTNAME=DESKTOP-R1C768B PUBLIC=C:\Users\Public OLDPWD=/home/SEJONG USERNAME=SEJONG LOGONSERVER=\\DESKTOP-R1C768B PROCESSOR_ARCHITECTURE=AMD64 LOCALAPPDATA=C:\Users\SEJONG\AppData\Local COMPUTERNAME=DESKTOP-R1C768B USER=SEJONG !::=::\ SYSTEMDRIVE=C: USERPROFILE=C:\Users\SEJONG PATHEXT=.COM;.EXE;.BAT;.CMD;.VBS;.VBE;.JS;.JSE;.WSF;.WSH;.MSC;.CPL SYSTEMROOT=C:\Windows USERDOMAIN_ROAMINGPROFILE=DESKTOP-R1C768B OneDriveCommercial=C:\Users\SEJONG\OneDrive - Sejong University PROCESSOR_IDENTIFIER=Intel64 Family 6 Model 165 Stepping 5, GenuineIntel GNUPLOT_LIB=C:\Program Files\gnuplot\demo;C:\Program Files\gnuplot\demo\games;C:\Program Files\gnuplot\share PWD=/home/SEJONG/petsc-3.18.1 MSMPI_BIN=C:\Program Files\Microsoft MPI\Bin\ HOME=/home/SEJONG TMP=/tmp OneDrive=C:\Users\SEJONG\OneDrive - Sejong University ZES_ENABLE_SYSMAN=1 !C:=C:\cygwin64\bin PROCESSOR_REVISION=a505 PROFILEREAD=true PROMPT=$P$G NUMBER_OF_PROCESSORS=16 ProgramW6432=C:\Program Files COMSPEC=C:\Windows\system32\cmd.exe APPDATA=C:\Users\SEJONG\AppData\Roaming SHELL=/bin/bash TERM=xterm-256color WINDIR=C:\Windows ProgramData=C:\ProgramData SHLVL=1 PRINTER=\\210.107.220.119\HP Color LaserJet Pro MFP M377 PCL 6 PROGRAMFILES=C:\Program Files ALLUSERSPROFILE=C:\ProgramData TEMP=/tmp DriverData=C:\Windows\System32\Drivers\DriverData SESSIONNAME=Console ProgramFiles(x86)=C:\Program Files (x86) PATH=/usr/local/bin:/usr/bin:/cygdrive/c/SIMULIA/Commands:/cygdrive/c/Program Files/Microsoft MPI/Bin:/cygdrive/c/Windows/system32:/cygdrive/c/Windows:/cygdrive/c/Windows/System32/Wbem:/cygdrive/c/Windows/System32/WindowsPowerShell/v1.0:/cygdrive/c/Windows/System32/OpenSSH:/cygdrive/c/Program Files/MATLAB/R2020b/bin:/cygdrive/c/Program Files/Microsoft SQL Server/130/Tools/Binn:/cygdrive/c/Program Files/Microsoft SQL Server/Client SDK/ODBC/170/Tools/Binn:/cygdrive/c/Program Files/Git/cmd:/cygdrive/c/msys64/mingw64/bin:/cygdrive/c/msys64/usr/bin:/cygdrive/c/Program Files (x86)/Microsoft Visual Studio/2019/Community/VC/Tools/MSVC/14.29.30133/bin/Hostx64/x64:/cygdrive/c/Program 
Files/dotnet:/:/cygdrive/c/Users/SEJONG/AppData/Local/Microsoft/WindowsApps:/cygdrive/c/Users/SEJONG/AppData/Local/Programs/Microsoft VS Code/bin:/cygdrive/c/Program Files (x86)/Microsoft Visual Studio/2019/Community/VC/Tools/MSVC/14.29.30133/bin/Hostx64/x64:/cygdrive/c/Users/SEJONG/.dotnet/tools:/usr/lib/lapack PS1=\[\e]0;\w\a\]\n\[\e[32m\]\u@\h \[\e[33m\]\w\[\e[0m\]\n\$ HOMEDRIVE=C: INFOPATH=/usr/local/info:/usr/share/info:/usr/info HOMEPATH=\Users\SEJONG ORIGINAL_PATH=/cygdrive/c/SIMULIA/Commands:/cygdrive/c/Program Files/Microsoft MPI/Bin:/cygdrive/c/Windows/system32:/cygdrive/c/Windows:/cygdrive/c/Windows/System32/Wbem:/cygdrive/c/Windows/System32/WindowsPowerShell/v1.0:/cygdrive/c/Windows/System32/OpenSSH:/cygdrive/c/Program Files/MATLAB/R2020b/bin:/cygdrive/c/Program Files/Microsoft SQL Server/130/Tools/Binn:/cygdrive/c/Program Files/Microsoft SQL Server/Client SDK/ODBC/170/Tools/Binn:/cygdrive/c/Program Files/Git/cmd:/cygdrive/c/msys64/mingw64/bin:/cygdrive/c/msys64/usr/bin:/cygdrive/c/Program Files (x86)/Microsoft Visual Studio/2019/Community/VC/Tools/MSVC/14.29.30133/bin/Hostx64/x64:/cygdrive/c/Program Files/dotnet:/:/cygdrive/c/Users/SEJONG/AppData/Local/Microsoft/WindowsApps:/cygdrive/c/Users/SEJONG/AppData/Local/Programs/Microsoft VS Code/bin:/cygdrive/c/Program Files (x86)/Microsoft Visual Studio/2019/Community/VC/Tools/MSVC/14.29.30133/bin/Hostx64/x64:/cygdrive/c/Users/SEJONG/.dotnet/tools EXECIGNORE=*.dll _=./configure Files in path provided by default path /usr/local/bin: /usr/bin: addftinfo.exe addr2line.exe apropos ar.exe arch.exe as.exe ash.exe awk b2sum.exe base32.exe base64.exe basename.exe basenc.exe bash.exe bashbug bomtool.exe bunzip2.exe bzcat.exe bzcmp bzdiff bzegrep bzfgrep bzgrep bzip2.exe bzip2recover.exe bzless bzmore c++.exe c++filt.exe c89 c99 ca-legacy cal.exe captoinfo cat.exe catman.exe cc ccmake.exe chattr.exe chcon.exe chgrp.exe chmod.exe chown.exe chroot.exe chrt.exe cksum.exe clear.exe cmake.exe cmp.exe col.exe colcrt.exe colrm.exe column.exe comm.exe cp.exe cpack.exe cpp.exe csplit.exe ctest.exe cut.exe cygarchive-13.dll cygargp-0.dll cygatomic-1.dll cygattr-1.dll cygblkid-1.dll cygbrotlicommon-1.dll cygbrotlidec-1.dll cygbz2-1.dll cygcheck.exe cygcom_err-2.dll cygcrypt-2.dll cygcrypto-1.1.dll cygcurl-4.dll cygdb-5.3.dll cygdb_cxx-5.3.dll cygdb_sql-5.3.dll cygedit-0.dll cygevent-2-1-7.dll cygevent_core-2-1-7.dll cygevent_extra-2-1-7.dll cygevent_openssl-2-1-7.dll cygevent_pthreads-2-1-7.dll cygexpat-1.dll cygfdisk-1.dll cygffi-6.dll cygfido2-1.dll cygformw-10.dll cyggc-1.dll cyggcc_s-seh-1.dll cyggdbm-6.dll cyggdbm_compat-4.dll cyggfortran-5.dll cyggmp-10.dll cyggomp-1.dll cyggsasl-7.dll cyggssapi_krb5-2.dll cygguile-2.2-1.dll cyghistory7.dll cyghwloc-15.dll cygiconv-2.dll cygidn-12.dll cygidn2-0.dll cygintl-8.dll cygisl-23.dll cygjsoncpp-25.dll cygk5crypto-3.dll cygkrb5-3.dll cygkrb5support-0.dll cyglber-2-4-2.dll cyglber-2.dll cygldap-2-4-2.dll cygldap-2.dll cygldap_r-2-4-2.dll cygltdl-7.dll cyglz4-1.dll cyglzma-5.dll cyglzo2-2.dll cygmagic-1.dll cygman-2-11-0.dll cygmandb-2-11-0.dll cygmenuw-10.dll cygmpc-3.dll cygmpfr-6.dll cygmpi-40.dll cygmpi_mpifh-40.dll cygmpi_usempif08-40.dll cygmpi_usempi_ignore_tkr-40.dll cygncursesw-10.dll cygnghttp2-14.dll cygntlm-0.dll cygopen-pal-40.dll cygopen-rte-40.dll cygp11-kit-0.dll cygpanelw-10.dll cygpath.exe cygpcre2-8-0.dll cygperl5_32.dll cygpipeline-1.dll cygpkgconf-4.dll cygpopt-0.dll cygpsl-5.dll cygquadmath-0.dll cygreadline7.dll cygrhash-0.dll cygrunsrv.exe cygsasl2-3.dll cygserver-config 
cygsigsegv-2.dll cygsmartcols-1.dll cygsqlite3-0.dll cygssh2-1.dll cygssl-1.1.dll cygstart.exe cygstdc++-6.dll cygtasn1-6.dll cygticw-10.dll cygunistring-2.dll cyguuid-1.dll cyguv-1.dll cygwin-console-helper.exe cygwin1.dll cygxml2-2.dll cygxxhash-0.dll cygz.dll cygzstd-1.dll dash.exe date.exe dd.exe df.exe diff.exe diff3.exe dir.exe dircolors.exe dirname.exe dlltool.exe dllwrap.exe dnsdomainname domainname du.exe dumper.exe echo.exe editrights.exe egrep elfedit.exe env.exe eqn.exe eqn2graph ex expand.exe expr.exe f95 factor.exe false.exe fgrep fido2-assert.exe fido2-cred.exe fido2-token.exe file.exe find.exe flock.exe fmt.exe fold.exe g++.exe gawk-5.1.1.exe gawk.exe gcc-ar.exe gcc-nm.exe gcc-ranlib.exe gcc.exe gcov-dump.exe gcov-tool.exe gcov.exe gdiffmk gencat.exe getconf.exe getent.exe getfacl.exe getopt.exe gfortran.exe git-receive-pack.exe git-shell.exe git-upload-archive.exe git-upload-pack.exe git.exe gkill.exe gmondump.exe gprof.exe grap2graph grep.exe grn.exe grodvi.exe groff.exe grolbp.exe grolj4.exe grops.exe grotty.exe groups.exe gunzip gzexe gzip.exe head.exe hexdump.exe hostid.exe hostname.exe hpftodit.exe i686-w64-mingw32-pkg-config id.exe indxbib.exe info.exe infocmp.exe infotocap install-info.exe install.exe ipcmk.exe ipcrm.exe ipcs.exe isosize.exe join.exe kill.exe lastlog.exe ld.bfd.exe ld.exe ldd.exe ldh.exe less.exe lessecho.exe lesskey.exe lexgrog.exe libpython3.9.dll link-cygin.exe lkbib.exe ln.exe locale.exe locate.exe logger.exe login.exe logname.exe look.exe lookbib.exe ls.exe lsattr.exe lto-dump.exe lzcat lzcmp lzdiff lzegrep lzfgrep lzgrep lzless lzma lzmadec.exe lzmainfo.exe lzmore make-dummy-cert make.exe man-recode.exe man.exe mandb.exe manpath.exe mcookie.exe md5sum.exe minidumper.exe mintheme mintty.exe mkdir.exe mkfifo.exe mkgroup.exe mknod.exe mkpasswd.exe mkshortcut.exe mktemp.exe more.exe mount.exe mpic++ mpicc mpicxx mpiexec mpif77 mpif90 mpifort mpirun mv.exe namei.exe neqn nice.exe nl.exe nm.exe nohup.exe nproc.exe nroff numfmt.exe objcopy.exe objdump.exe od.exe ompi-clean ompi-server ompi_info.exe opal_wrapper.exe openssl.exe orte-clean.exe orte-info.exe orte-server.exe ortecc orted.exe orterun.exe p11-kit.exe passwd.exe paste.exe pathchk.exe pdfroff peflags.exe peflagsall perl.exe perl5.32.1.exe pfbtops.exe pg.exe pic.exe pic2graph pinky.exe pip3 pip3.9 pkg-config pkgconf.exe pldd.exe post-grohtml.exe pr.exe pre-grohtml.exe preconv.exe printenv.exe printf.exe profiler.exe ps.exe ptx.exe pwd.exe pydoc3 pydoc3.9 python python3 python3.9.exe pzstd.exe ranlib.exe readelf.exe readlink.exe readshortcut.exe realpath.exe rebase-trigger rebase.exe rebaseall rebaselst refer.exe regtool.exe rename.exe renew-dummy-cert renice.exe reset rev.exe rm.exe rmdir.exe rsync-ssl rsync.exe run.exe runcon.exe rvi rview scalar.exe scp.exe script.exe scriptreplay.exe sdiff.exe sed.exe seq.exe setfacl.exe setmetamode.exe setsid.exe sftp.exe sh.exe sha1sum.exe sha224sum.exe sha256sum.exe sha384sum.exe sha512sum.exe shred.exe shuf.exe size.exe sleep.exe slogin soelim.exe sort.exe split.exe ssh-add.exe ssh-agent.exe ssh-copy-id ssh-host-config ssh-keygen.exe ssh-keyscan.exe ssh-user-config ssh.exe ssp.exe stat.exe stdbuf.exe strace.exe strings.exe strip.exe stty.exe sum.exe sync.exe tabs.exe tac.exe tail.exe tar.exe taskset.exe tbl.exe tee.exe test.exe tfmtodit.exe tic.exe timeout.exe toe.exe touch.exe tput.exe tr.exe troff.exe true.exe truncate.exe trust.exe tset.exe tsort.exe tty.exe tzselect tzset.exe ul.exe umount.exe uname.exe unexpand.exe uniq.exe unlink.exe unlzma unxz 
unzstd update-ca-trust update-crypto-policies updatedb users.exe uuidgen.exe uuidparse.exe vdir.exe vi.exe view wc.exe whatis.exe whereis.exe which.exe who.exe whoami.exe windmc.exe windres.exe x86_64-pc-cygwin-c++.exe x86_64-pc-cygwin-g++.exe x86_64-pc-cygwin-gcc-11.exe x86_64-pc-cygwin-gcc-ar.exe x86_64-pc-cygwin-gcc-nm.exe x86_64-pc-cygwin-gcc-ranlib.exe x86_64-pc-cygwin-gcc.exe x86_64-pc-cygwin-gfortran.exe x86_64-pc-cygwin-pkg-config x86_64-w64-mingw32-pkg-config xargs.exe xmlcatalog.exe xmllint.exe xz.exe xzcat xzcmp xzdec.exe xzdiff xzegrep xzfgrep xzgrep xzless xzmore yes.exe zcat zcmp zdiff zdump.exe zegrep zfgrep zforce zgrep zless zmore znew zstd.exe zstdcat zstdgrep zstdless zstdmt [.exe /cygdrive/c/SIMULIA/Commands: abaqus.bat abq2018.bat abq_cae_open.bat abq_odb_open.bat /cygdrive/c/Program Files/Microsoft MPI/Bin: mpiexec.exe mpitrace.man smpd.exe provthrd.dll provtool.exe ProximityCommon.dll ProximityCommonPal.dll ProximityRtapiPal.dll ProximityService.dll ProximityServicePal.dll ProximityToast ProximityUxHost.exe prproc.exe prvdmofcomp.dll psapi.dll pscript.sep PSHED.DLL psisdecd.dll psisrndr.ax PSModuleDis coveryProvider.dll psmodulediscoveryprovider.mof PsmServiceExtHost.dll psmsrv.dll psr.exe pstask.dll pstorec.dll pt-BR pt-PT ptpprov.dll puiapi.dll puiobj.dll PushToInstall.dll pwlauncher.dll pwlauncher.exe pwrshplugin.dll pwsso.dll qappsrv.exe qasf.dll qcap.dll qdv. dll qdvd.dll qedit.dll qedwipes.dll qmgr.dll qprocess.exe QualityUpdateAssistant.dll quartz.dll Query.dll query.exe QuickActionsDataModel.dll quickassist.exe QuietHours.dll quser.exe qwave.dll qwinsta.exe RacEngn.dll racpldlg.dll radardt.dll radarrs.dll RADCUI.dll ra s rasadhlp.dll rasapi32.dll rasauto.dll rasautou.exe raschap.dll raschapext.dll rasctrnm.h rasctrs.dll rascustom.dll rasdiag.dll rasdial.exe rasdlg.dll raserver.exe rasgcw.dll rasman.dll rasmans.dll rasmbmgr.dll RasMediaManager.dll RASMM.dll rasmontr.dll rasphone.exe rasplap.dll rasppp.dll rastapi.dll rastls.dll rastlsext.dll RasToast rdbui.dll rdpbase.dll rdpcfgex.dll rdpclip.exe rdpcore.dll rdpcorets.dll rdpcredentialprovider.dll rdpencom.dll rdpendp.dll rdpinit.exe rdpinput.exe rdpnano.dll RdpRelayTransport.dll RdpSa.exe RdpS aProxy.exe RdpSaPs.dll RdpSaUacHelper.exe rdpserverbase.dll rdpsharercom.dll rdpshell.exe rdpsign.exe rdpudd.dll rdpviewerax.dll rdrleakdiag.exe RDSAppXHelper.dll rdsdwmdr.dll rdsxvmaudio.dll rdvvmtransport.dll RDXService.dll RDXTaskFactory.dll ReAgent.dll ReAgentc.e xe ReAgentTask.dll recdisc.exe recover.exe Recovery recovery.dll RecoveryDrive.exe refsutil.exe reg.exe regapi.dll RegCtrl.dll regedt32.exe regidle.dll regini.exe Register-CimProvider.exe regsvc.dll regsvr32.exe reguwpapi.dll ReInfo.dll rekeywiz.exe relog.exe RelPost .exe RemoteAppLifetimeManager.exe RemoteAppLifetimeManagerProxyStub.dll remoteaudioendpoint.dll remotepg.dll RemotePosWorker.exe remotesp.tsp RemoteSystemToastIcon.contrast-white.png RemoteSystemToastIcon.png RemoteWipeCSP.dll RemovableMediaProvisioningPlugin.dll Rem oveDeviceContextHandler.dll RemoveDeviceElevated.dll rendezvousSession.tlb repair-bde.exe replace.exe ReportingCSP.dll RESAMPLEDMO.DLL ResBParser.dll reset.exe reseteng.dll ResetEngine.dll ResetEngine.exe ResetEngOnline.dll resmon.exe ResourceMapper.dll ResourcePolic yClient.dll ResourcePolicyServer.dll ResPriHMImageList ResPriHMImageListLowCost ResPriImageList ResPriImageListLowCost RestartManager.mof RestartManagerUninstall.mof RestartNowPower_80.contrast-black.png RestartNowPower_80.contrast-white.png RestartNowPower_80.png Re 
startTonight_80.png RestartTonight_80_contrast-black.png RestartTonight_80_contrast-white.png restore resutils.dll rgb9rast.dll Ribbons.scr riched20.dll riched32.dll rilproxy.dll RjvMDMConfig.dll RMActivate.exe RMActivate_isv.exe RMActivate_ssp.exe RMActivate_ssp_isv .exe RMapi.dll rmclient.dll RmClient.exe RMSRoamingSecurity.dll rmttpmvscmgrsvr.exe rnr20.dll ro-RO RoamingSecurity.dll Robocopy.exe rometadata.dll RotMgr.dll ROUTE.EXE RpcEpMap.dll rpchttp.dll RpcNs4.dll rpcnsh.dll RpcPing.exe rpcrt4.dll RpcRtRemote.dll rpcss.dll rr installer.exe rsaenh.dll rshx32.dll rsop.msc RstMwEventLogMsg.dll RstrtMgr.dll rstrui.exe RtCOM64.dll RtDataProc64.dll rtffilt.dll RtkApi64U.dll RtkAudUService64.exe RtkCfg64.dll rtm.dll rtmcodecs.dll RTMediaFrame.dll rtmmvrortc.dll rtmpal.dll rtmpltfm.dll rtutils.dl l RTWorkQ.dll ru-RU RuleBasedDS.dll runas.exe rundll32.exe runexehelper.exe RunLegacyCPLElevated.exe runonce.exe RuntimeBroker.exe rwinsta.exe samcli.dll samlib.dll samsrv.dll Samsung sas.dll sbe.dll sbeio.dll sberes.dll sbservicetrigger.dll sc.exe ScanPlugin.dll sca nsetting.dll SCardBi.dll SCardDlg.dll SCardSvr.dll ScavengeSpace.xml scavengeui.dll ScDeviceEnum.dll scecli.dll scesrv.dll schannel.dll schedcli.dll schedsvc.dll ScheduleTime_80.contrast-black.png ScheduleTime_80.contrast-white.png ScheduleTime_80.png schtasks.exe sc ksp.dll scripto.dll ScriptRunner.exe scrnsave.scr scrobj.dll scrptadm.dll scrrun.dll sdbinst.exe sdchange.exe sdclt.exe sdcpl.dll SDDS.dll sdengin2.dll SDFHost.dll sdhcinst.dll sdiageng.dll sdiagnhost.exe sdiagprv.dll sdiagschd.dll sdohlp.dll sdrsvc.dll sdshext.dll S earch.ProtocolHandler.MAPI2.dll SearchFilterHost.exe SearchFolder.dll SearchIndexer.exe SearchProtocolHost.exe SebBackgroundManagerPolicy.dll SecConfig.efi SecEdit.exe sechost.dll secinit.exe seclogon.dll secpol.msc secproc.dll secproc_isv.dll secproc_ssp.dll secproc _ssp_isv.dll secur32.dll SecureAssessmentHandlers.dll SecureBootUpdates securekernel.exe SecureTimeAggregator.dll security.dll SecurityAndMaintenance.png SecurityAndMaintenance_Alert.png SecurityAndMaintenance_Error.png SecurityCenterBroker.dll SecurityCenterBrokerPS .dll SecurityHealthAgent.dll SecurityHealthHost.exe SecurityHealthProxyStub.dll SecurityHealthService.exe SecurityHealthSSO.dll SecurityHealthSystray.exe sedplugins.dll SEMgrPS.dll SEMgrSvc.dll sendmail.dll Sens.dll SensApi.dll SensorDataService.exe SensorPerformance Events.dll SensorsApi.dll SensorsClassExtension.dll SensorsCpl.dll SensorService.dll SensorsNativeApi.dll SensorsNativeApi.V2.dll SensorsUtilsV2.dll sensrsvc.dll serialui.dll services.exe services.msc ServicingUAPI.dll serwvdrv.dll SessEnv.dll sessionmsg.exe setbcdlo cale.dll sethc.exe SetNetworkLocation.dll SetNetworkLocationFlyout.dll SetProxyCredential.dll setspn.exe SettingMonitor.dll settings.dat SettingsEnvironment.Desktop.dll SettingsExtensibilityHandlers.dll SettingsHandlers_Accessibility.dll SettingsHandlers_AnalogShell. 
dll SettingsHandlers_AppControl.dll SettingsHandlers_AppExecutionAlias.dll SettingsHandlers_AssignedAccess.dll SettingsHandlers_Authentication.dll SettingsHandlers_BackgroundApps.dll SettingsHandlers_BatteryUsage.dll SettingsHandlers_BrowserDeclutter.dll SettingsHand lers_CapabilityAccess.dll SettingsHandlers_Clipboard.dll SettingsHandlers_ClosedCaptioning.dll SettingsHandlers_ContentDeliveryManager.dll SettingsHandlers_Cortana.dll SettingsHandlers_Devices.dll SettingsHandlers_Display.dll SettingsHandlers_Flights.dll SettingsHand lers_Fonts.dll SettingsHandlers_ForceSync.dll SettingsHandlers_Gaming.dll SettingsHandlers_Geolocation.dll SettingsHandlers_Gpu.dll SettingsHandlers_HoloLens_Environment.dll SettingsHandlers_IME.dll SettingsHandlers_InkingTypingPrivacy.dll SettingsHandlers_InputPerso nalization.dll SettingsHandlers_Language.dll SettingsHandlers_ManagePhone.dll SettingsHandlers_Maps.dll SettingsHandlers_Mouse.dll SettingsHandlers_Notifications.dll SettingsHandlers_nt.dll SettingsHandlers_OneCore_BatterySaver.dll SettingsHandlers_OneCore_PowerAndSl eep.dll SettingsHandlers_OneDriveBackup.dll SettingsHandlers_OptionalFeatures.dll SettingsHandlers_PCDisplay.dll SettingsHandlers_Pen.dll SettingsHandlers_QuickActions.dll SettingsHandlers_Region.dll SettingsHandlers_SharedExperiences_Rome.dll SettingsHandlers_SIUF.d ll SettingsHandlers_SpeechPrivacy.dll SettingsHandlers_Startup.dll SettingsHandlers_StorageSense.dll SettingsHandlers_Troubleshoot.dll SettingsHandlers_User.dll SettingsHandlers_UserAccount.dll SettingsHandlers_UserExperience.dll SettingsHandlers_WorkAccess.dll Setti ngSync.dll SettingSyncCore.dll SettingSyncDownloadHelper.dll SettingSyncHost.exe setup setupapi.dll setupcl.dll setupcl.exe setupcln.dll setupetw.dll setupugc.exe setx.exe sfc.dll sfc.exe sfc_os.dll Sgrm SgrmBroker.exe SgrmEnclave.dll SgrmEnclave_secure.dll SgrmLpac. 
exe shacct.dll shacctprofile.dll SharedPCCSP.dll SharedRealitySvc.dll ShareHost.dll sharemediacpl.dll SHCore.dll shdocvw.dll shell32.dll ShellAppRuntime.exe ShellCommonCommonProxyStub.dll ShellExperiences shellstyle.dll shfolder.dll shgina.dll ShiftJIS.uce shimeng.dl l shimgvw.dll shlwapi.dll shpafact.dll shrpubw.exe shsetup.dll shsvcs.dll shunimpl.dll shutdown.exe shutdownext.dll shutdownux.dll shwebsvc.dll si-lk signdrv.dll sigverif.exe SIHClient.exe sihost.exe SimAuth.dll SimCfg.dll simpdata.tlb sk-SK skci.dll sl-SI slc.dll sl cext.dll SleepStudy SlideToShutDown.exe slmgr slmgr.vbs slui.exe slwga.dll SmallRoom.bin SmartCardBackgroundPolicy.dll SmartcardCredentialProvider.dll SmartCardSimulator.dll smartscreen.exe smartscreenps.dll SMBHelperClass.dll smbwmiv2.dll SMI SmiEngine.dll smphost.d ll SmsRouterSvc.dll smss.exe SndVol.exe SndVolSSO.dll SnippingTool.exe snmpapi.dll snmptrap.exe Snooze_80.contrast-black.png Snooze_80.contrast-white.png Snooze_80.png socialapis.dll softkbd.dll softpub.dll sort.exe SortServer2003Compat.dll SortWindows61.dll SortWind ows62.dll SortWindows64.dll SortWindows6Compat.dll SpaceAgent.exe spacebridge.dll SpaceControl.dll spaceman.exe SpatialAudioLicenseSrv.exe SpatializerApo.dll SpatialStore.dll spbcd.dll SpeakersSystemToastIcon.contrast-white.png SpeakersSystemToastIcon.png Spectrum.ex e SpectrumSyncClient.dll Speech SpeechPal.dll Speech_OneCore spfileq.dll spinf.dll spmpm.dll spnet.dll spool spoolss.dll spoolsv.exe spopk.dll spp spp.dll sppc.dll sppcext.dll sppcomapi.dll sppcommdlg.dll SppExtComObj.Exe sppinst.dll sppnp.dll sppobjs.dll sppsvc.exe sppui sppwinob.dll sppwmi.dll spwinsat.dll spwizeng.dll spwizimg.dll spwizres.dll spwmp.dll SqlServerSpatial130.dll SqlServerSpatial150.dll sqlsrv32.dll sqlsrv32.rll sqmapi.dll sr-Latn-RS srchadmin.dll srclient.dll srcore.dll srdelayed.exe SrEvents.dll SRH.dll srhelp er.dll srm.dll srmclient.dll srmlib.dll srms-apr-v.dat srms-apr.dat srms.dat srmscan.dll srmshell.dll srmstormod.dll srmtrace.dll srm_ps.dll srpapi.dll SrpUxNativeSnapIn.dll srrstr.dll SrTasks.exe sru srumapi.dll srumsvc.dll srvcli.dll srvsvc.dll srwmi.dll sscore.dll sscoreext.dll ssdm.dll ssdpapi.dll ssdpsrv.dll sspicli.dll sspisrv.dll SSShim.dll ssText3d.scr sstpsvc.dll StartTileData.dll Startupscan.dll StateRepository.Core.dll stclient.dll stdole2.tlb stdole32.tlb sti.dll sti_ci.dll stobject.dll StorageContextHandler.dll Stor ageUsage.dll storagewmi.dll storagewmi_passthru.dll stordiag.exe storewuauth.dll Storprop.dll StorSvc.dll streamci.dll StringFeedbackEngine.dll StructuredQuery.dll SubRange.uce subst.exe sud.dll sv-SE SvBannerBackground.png svchost.exe svf.dll svsvc.dll SwitcherDataM odel.dll swprv.dll sxproxy.dll sxs.dll sxshared.dll sxssrv.dll sxsstore.dll sxstrace.exe SyncAppvPublishingServer.exe SyncAppvPublishingServer.vbs SyncCenter.dll SyncController.dll SyncHost.exe SyncHostps.dll SyncInfrastructure.dll SyncInfrastructureps.dll SyncProxy. 
dll Syncreg.dll SyncRes.dll SyncSettings.dll syncutil.dll sysclass.dll sysdm.cpl SysFxUI.dll sysmain.dll sysmon.ocx sysntfy.dll Sysprep sysprint.sep sysprtj.sep SysResetErr.exe syssetup.dll systemcpl.dll SystemEventsBrokerClient.dll SystemEventsBrokerServer.dll syste minfo.exe SystemPropertiesAdvanced.exe SystemPropertiesComputerName.exe SystemPropertiesDataExecutionPrevention.exe SystemPropertiesHardware.exe SystemPropertiesPerformance.exe SystemPropertiesProtection.exe SystemPropertiesRemote.exe systemreset.exe SystemResetPlatf orm SystemSettings.DataModel.dll SystemSettings.DeviceEncryptionHandlers.dll SystemSettings.Handlers.dll SystemSettings.SettingsExtensibility.dll SystemSettings.UserAccountsHandlers.dll SystemSettingsAdminFlows.exe SystemSettingsBroker.exe SystemSettingsRemoveDevice. exe SystemSettingsThresholdAdminFlowUI.dll SystemSupportInfo.dll SystemUWPLauncher.exe systray.exe t2embed.dll ta-in ta-lk Tabbtn.dll TabbtnEx.dll tabcal.exe TabletPC.cpl TabSvc.dll takeown.exe tapi3.dll tapi32.dll tapilua.dll TapiMigPlugin.dll tapiperf.dll tapisrv.d ll TapiSysprep.dll tapiui.dll TapiUnattend.exe tar.exe TaskApis.dll taskbarcpl.dll taskcomp.dll TaskFlowDataEngine.dll taskhostw.exe taskkill.exe tasklist.exe Taskmgr.exe Tasks taskschd.dll taskschd.msc TaskSchdPS.dll tbauth.dll tbs.dll tcblaunch.exe tcbloader.dll tc msetup.exe tcpbidi.xml tcpipcfg.dll tcpmib.dll tcpmon.dll tcpmon.ini tcpmonui.dll TCPSVCS.EXE tdc.ocx tdh.dll TDLMigration.dll TEEManagement64.dll telephon.cpl TelephonyInteractiveUser.dll TelephonyInteractiveUserRes.dll tellib.dll TempSignedLicenseExchangeTask.dll T enantRestrictionsPlugin.dll termmgr.dll termsrv.dll tetheringclient.dll tetheringconfigsp.dll TetheringIeProvider.dll TetheringMgr.dll tetheringservice.dll TetheringStation.dll TextInputFramework.dll TextInputMethodFormatter.dll TextShaping.dll th-TH themecpl.dll The mes.SsfDownload.ScheduledTask.dll themeservice.dll themeui.dll ThirdPartyNoticesBySHS.txt threadpoolwinrt.dll thumbcache.dll ThumbnailExtractionHost.exe ti-et tier2punctuations.dll TieringEngineProxy.dll TieringEngineService.exe TileDataRepository.dll TimeBrokerClien t.dll TimeBrokerServer.dll timedate.cpl TimeDateMUICallback.dll timeout.exe timesync.dll TimeSyncTask.dll TKCtrl2k64.sys TKFsAv64.sys TKFsFt64.sys TKFWFV.inf TKFWFV64.cat TKFWFV64.sys tkfwvt64.sys TKIdsVt64.sys TKPcFtCb64.sys TKPcFtCb64.sys_ TKPcFtHk64.sys TKRgAc2k64 .sys TKRgFtXp64.sys TKTool2k.sys TKTool2k64.sys tlscsp.dll tokenbinding.dll TokenBroker.dll TokenBrokerCookies.exe TokenBrokerUI.dll tpm.msc TpmCertResources.dll tpmcompc.dll TpmCoreProvisioning.dll TpmInit.exe TpmTasks.dll TpmTool.exe tpmvsc.dll tpmvscmgr.exe tpmvsc mgrsvr.exe tquery.dll tr-TR tracerpt.exe TRACERT.EXE traffic.dll TransformPPSToWlan.xslt TransformPPSToWlanCredentials.xslt TransliterationRanker.dll TransportDSA.dll tree.com trie.dll trkwks.dll TrustedSignalCredProv.dll tsbyuv.dll tscfgwmi.dll tscon.exe tsdiscon.ex e TSErrRedir.dll tsf3gip.dll tsgqec.dll tskill.exe tsmf.dll TSpkg.dll tspubwmi.dll TSSessionUX.dll tssrvlic.dll TSTheme.exe TsUsbGDCoInstaller.dll TsUsbRedirectionGroupPolicyExtension.dll TSWbPrxy.exe TSWorkspace.dll TsWpfWrp.exe ttdinject.exe ttdloader.dll ttdplm.dl l ttdrecord.dll ttdrecordcpu.dll TtlsAuth.dll TtlsCfg.dll TtlsExt.dll tttracer.exe tvratings.dll twext.dll twinapi.appcore.dll twinapi.dll twinui.appcore.dll twinui.dll twinui.pcshell.dll txflog.dll txfw32.dll typeperf.exe tzautoupdate.dll tzres.dll tzsync.exe tzsync res.dll tzutil.exe ubpm.dll ucmhc.dll ucrtbase.dll ucrtbased.dll 
ucrtbase_clr0400.dll ucrtbase_enclave.dll ucsvc.exe udhisapi.dll uDWM.dll UefiCsp.dll UevAgentPolicyGenerator.exe UevAppMonitor.exe UevAppMonitor.exe.config UevCustomActionTypes.tlb UevTemplateBaselineG enerator.exe UevTemplateConfigItemGenerator.exe uexfat.dll ufat.dll UiaManager.dll UIAnimation.dll UIAutomationCore.dll uicom.dll UIManagerBrokerps.dll UIMgrBroker.exe uireng.dll UIRibbon.dll UIRibbonRes.dll uk-UA ulib.dll umb.dll umdmxfrm.dll umpdc.dll umpnpmgr.dll umpo-overrides.dll umpo.dll umpoext.dll umpowmi.dll umrdp.dll unattend.dll unenrollhook.dll unimdm.tsp unimdmat.dll uniplat.dll Unistore.dll unlodctr.exe UNP unregmp2.exe untfs.dll UpdateAgent.dll updatecsp.dll UpdateDeploymentProvider.dll UpdateHeartbeat.dll updatep olicy.dll upfc.exe UpgradeResultsUI.exe upnp.dll upnpcont.exe upnphost.dll UPPrinterInstaller.exe UPPrinterInstallsCSP.dll upshared.dll uReFS.dll uReFSv1.dll ureg.dll url.dll urlmon.dll UsbCApi.dll usbceip.dll usbmon.dll usbperf.dll UsbPmApi.dll UsbSettingsHandlers.d ll UsbTask.dll usbui.dll user32.dll UserAccountBroker.exe UserAccountControlSettings.dll UserAccountControlSettings.exe useractivitybroker.dll usercpl.dll UserDataAccessRes.dll UserDataAccountApis.dll UserDataLanguageUtil.dll UserDataPlatformHelperUtil.dll UserDataSe rvice.dll UserDataTimeUtil.dll UserDataTypeHelperUtil.dll UserDeviceRegistration.dll UserDeviceRegistration.Ngc.dll userenv.dll userinit.exe userinitext.dll UserLanguageProfileCallback.dll usermgr.dll usermgrcli.dll UserMgrProxy.dll usk.rs usoapi.dll UsoClient.exe us ocoreps.dll usocoreworker.exe usosvc.dll usp10.dll ustprov.dll UtcDecoderHost.exe UtcManaged.dll utcutil.dll utildll.dll Utilman.exe uudf.dll UvcModel.dll uwfcfgmgmt.dll uwfcsp.dll uwfservicingapi.dll UXInit.dll uxlib.dll uxlibres.dll uxtheme.dll vac.dll VAN.dll Vaul t.dll VaultCDS.dll vaultcli.dll VaultCmd.exe VaultRoaming.dll vaultsvc.dll VBICodec.ax vbisurf.ax vbsapi.dll vbscript.dll vbssysprep.dll vcamp120.dll vcamp140.dll vcamp140d.dll VCardParser.dll vccorlib110.dll vccorlib120.dll vccorlib140.dll vccorlib140d.dll vcomp100. 
dll vcomp110.dll vcomp120.dll vcomp140.dll vcomp140d.dll vcruntime140.dll vcruntime140d.dll vcruntime140_1.dll vcruntime140_1d.dll vcruntime140_clr0400.dll vds.exe vdsbas.dll vdsdyn.dll vdsldr.exe vdsutil.dll vdsvd.dll vds_ps.dll verclsid.exe verifier.dll verifier.ex e verifiergui.exe version.dll vertdll.dll vfbasics.dll vfcompat.dll vfcuzz.dll vfluapriv.dll vfnet.dll vfntlmless.dll vfnws.dll vfprint.dll vfprintpthelper.dll vfrdvcompat.dll vfuprov.dll vfwwdm32.dll VhfUm.dll vid.dll vidcap.ax VideoHandlers.dll VIDRESZR.DLL virtdis k.dll VirtualMonitorManager.dll VmApplicationHealthMonitorProxy.dll vmbuspipe.dll vmdevicehost.dll vmictimeprovider.dll vmrdvcore.dll VocabRoamingHandler.dll VoiceActivationManager.dll VoipRT.dll vpnike.dll vpnikeapi.dll VpnSohDesktop.dll VPNv2CSP.dll vrfcore.dll Vsc MgrPS.dll vscover160.dll VSD3DWARPDebug.dll VsGraphicsCapture.dll VsGraphicsDesktopEngine.exe VsGraphicsExperiment.dll VsGraphicsHelper.dll VsGraphicsProxyStub.dll VsGraphicsRemoteEngine.exe vsjitdebugger.exe VSPerf160.dll vssadmin.exe vssapi.dll vsstrace.dll VSSVC.e xe vss_ps.dll vulkan-1-999-0-0-0.dll vulkan-1.dll vulkaninfo-1-999-0-0-0.exe vulkaninfo.exe w32time.dll w32tm.exe w32topl.dll WaaSAssessment.dll WaaSMedicAgent.exe WaaSMedicCapsule.dll WaaSMedicPS.dll WaaSMedicSvc.dll WABSyncProvider.dll waitfor.exe WalletBackgroundS erviceProxy.dll WalletProxy.dll WalletService.dll WallpaperHost.exe wavemsp.dll wbadmin.exe wbem wbemcomn.dll wbengine.exe wbiosrvc.dll wci.dll wcimage.dll wcmapi.dll wcmcsp.dll wcmsvc.dll WCN WcnApi.dll wcncsvc.dll WcnEapAuthProxy.dll WcnEapPeerProxy.dll WcnNetsh.dl l wcnwiz.dll wc_storage.dll wdc.dll WDI wdi.dll wdigest.dll wdmaud.drv wdscore.dll WdsUnattendTemplate.xml WEB.rs webauthn.dll WebcamUi.dll webcheck.dll WebClnt.dll webio.dll webplatstorageserver.dll WebRuntimeManager.dll webservices.dll Websocket.dll wecapi.dll wecs vc.dll wecutil.exe wephostsvc.dll wer.dll werconcpl.dll wercplsupport.dll werdiagcontroller.dll WerEnc.dll weretw.dll WerFault.exe WerFaultSecure.exe wermgr.exe wersvc.dll werui.dll wevtapi.dll wevtfwd.dll wevtsvc.dll wevtutil.exe wextract.exe WF.msc wfapigp.dll wfdp rov.dll WFDSConMgr.dll WFDSConMgrSvc.dll WfHC.dll WFS.exe WFSR.dll whealogr.dll where.exe whhelper.dll whoami.exe wiaacmgr.exe wiaaut.dll wiadefui.dll wiadss.dll WiaExtensionHost64.dll wiarpc.dll wiascanprofiles.dll wiaservc.dll wiashext.dll wiatrace.dll wiawow64.exe WiFiCloudStore.dll WiFiConfigSP.dll wifidatacapabilityhandler.dll WiFiDisplay.dll wifinetworkmanager.dll wifitask.exe WimBootCompress.ini wimgapi.dll wimserv.exe win32appinventorycsp.dll Win32AppSettingsProvider.dll Win32CompatibilityAppraiserCSP.dll win32k.sys win3 2kbase.sys win32kfull.sys win32kns.sys win32spl.dll win32u.dll Win32_DeviceGuard.dll winbio.dll WinBioDatabase WinBioDataModel.dll WinBioDataModelOOBE.exe winbioext.dll WinBioPlugIns winbrand.dll wincorlib.dll wincredprovider.dll wincredui.dll WindowManagement.dll Wi ndowManagementAPI.dll Windows.AccountsControl.dll Windows.AI.MachineLearning.dll Windows.AI.MachineLearning.Preview.dll Windows.ApplicationModel.Background.SystemEventsBroker.dll Windows.ApplicationModel.Background.TimeBroker.dll Windows.ApplicationModel.Conversation alAgent.dll windows.applicationmodel.conversationalagent.internal.proxystub.dll windows.applicationmodel.conversationalagent.proxystub.dll Windows.ApplicationModel.Core.dll windows.applicationmodel.datatransfer.dll Windows.ApplicationModel.dll Windows.ApplicationMode l.LockScreen.dll Windows.ApplicationModel.Store.dll 
Windows.ApplicationModel.Store.Preview.DOSettings.dll Windows.ApplicationModel.Store.TestingFramework.dll Windows.ApplicationModel.Wallet.dll Windows.CloudStore.dll Windows.CloudStore.Schema.DesktopShell.dll Windows .CloudStore.Schema.Shell.dll Windows.Cortana.Desktop.dll Windows.Cortana.OneCore.dll Windows.Cortana.ProxyStub.dll Windows.Data.Activities.dll Windows.Data.Pdf.dll Windows.Devices.AllJoyn.dll Windows.Devices.Background.dll Windows.Devices.Background.ps.dll Windows.De vices.Bluetooth.dll Windows.Devices.Custom.dll Windows.Devices.Custom.ps.dll Windows.Devices.Enumeration.dll Windows.Devices.Haptics.dll Windows.Devices.HumanInterfaceDevice.dll Windows.Devices.Lights.dll Windows.Devices.LowLevel.dll Windows.Devices.Midi.dll Windows. Devices.Perception.dll Windows.Devices.Picker.dll Windows.Devices.PointOfService.dll Windows.Devices.Portable.dll Windows.Devices.Printers.dll Windows.Devices.Printers.Extensions.dll Windows.Devices.Radios.dll Windows.Devices.Scanners.dll Windows.Devices.Sensors.dll Windows.Devices.SerialCommunication.dll Windows.Devices.SmartCards.dll Windows.Devices.SmartCards.Phone.dll Windows.Devices.Usb.dll Windows.Devices.WiFi.dll Windows.Devices.WiFiDirect.dll Windows.Energy.dll Windows.FileExplorer.Common.dll Windows.Gaming.Input.dll Win dows.Gaming.Preview.dll Windows.Gaming.UI.GameBar.dll Windows.Gaming.XboxLive.Storage.dll Windows.Globalization.dll Windows.Globalization.Fontgroups.dll Windows.Globalization.PhoneNumberFormatting.dll Windows.Graphics.Display.BrightnessOverride.dll Windows.Graphics.D isplay.DisplayEnhancementOverride.dll Windows.Graphics.dll Windows.Graphics.Printing.3D.dll Windows.Graphics.Printing.dll Windows.Graphics.Printing.Workflow.dll Windows.Graphics.Printing.Workflow.Native.dll Windows.Help.Runtime.dll windows.immersiveshell.serviceprovi der.dll Windows.Internal.AdaptiveCards.XamlCardRenderer.dll Windows.Internal.Bluetooth.dll Windows.Internal.CapturePicker.Desktop.dll Windows.Internal.CapturePicker.dll Windows.Internal.Devices.Sensors.dll Windows.Internal.Feedback.Analog.dll Windows.Internal.Feedbac k.Analog.ProxyStub.dll Windows.Internal.Graphics.Display.DisplayColorManagement.dll Windows.Internal.Graphics.Display.DisplayEnhancementManagement.dll Windows.Internal.Management.dll Windows.Internal.Management.SecureAssessment.dll Windows.Internal.PlatformExtension. 
DevicePickerExperience.dll Windows.Internal.PlatformExtension.MiracastBannerExperience.dll Windows.Internal.PredictionUnit.dll Windows.Internal.Security.Attestation.DeviceAttestation.dll Windows.Internal.SecurityMitigationsBroker.dll Windows.Internal.Shell.Broker.dll windows.internal.shellcommon.AccountsControlExperience.dll windows.internal.shellcommon.AppResolverModal.dll Windows.Internal.ShellCommon.Broker.dll windows.internal.shellcommon.FilePickerExperienceMEM.dll Windows.Internal.ShellCommon.PrintExperience.dll windows.int ernal.shellcommon.shareexperience.dll windows.internal.shellcommon.TokenBrokerModal.dll Windows.Internal.Signals.dll Windows.Internal.System.UserProfile.dll Windows.Internal.Taskbar.dll Windows.Internal.UI.BioEnrollment.ProxyStub.dll Windows.Internal.UI.Logon.ProxySt ub.dll Windows.Internal.UI.Shell.WindowTabManager.dll Windows.Management.EnrollmentStatusTracking.ConfigProvider.dll Windows.Management.InprocObjects.dll Windows.Management.ModernDeployment.ConfigProviders.dll Windows.Management.Provisioning.ProxyStub.dll Windows.Man agement.SecureAssessment.CfgProvider.dll Windows.Management.SecureAssessment.Diagnostics.dll Windows.Management.Service.dll Windows.Management.Workplace.dll Windows.Management.Workplace.WorkplaceSettings.dll Windows.Media.Audio.dll Windows.Media.BackgroundMediaPlayba ck.dll Windows.Media.BackgroundPlayback.exe Windows.Media.Devices.dll Windows.Media.dll Windows.Media.Editing.dll Windows.Media.FaceAnalysis.dll Windows.Media.Import.dll Windows.Media.MediaControl.dll Windows.Media.MixedRealityCapture.dll Windows.Media.Ocr.dll Window s.Media.Playback.BackgroundMediaPlayer.dll Windows.Media.Playback.MediaPlayer.dll Windows.Media.Playback.ProxyStub.dll Windows.Media.Protection.PlayReady.dll Windows.Media.Renewal.dll Windows.Media.Speech.dll Windows.Media.Speech.UXRes.dll Windows.Media.Streaming.dll Windows.Media.Streaming.ps.dll Windows.Mirage.dll Windows.Mirage.Internal.Capture.Pipeline.ProxyStub.dll Windows.Mirage.Internal.dll Windows.Networking.BackgroundTransfer.BackgroundManagerPolicy.dll Windows.Networking.BackgroundTransfer.ContentPrefetchTask.dll Windo ws.Networking.BackgroundTransfer.dll Windows.Networking.Connectivity.dll Windows.Networking.dll Windows.Networking.HostName.dll Windows.Networking.NetworkOperators.ESim.dll Windows.Networking.NetworkOperators.HotspotAuthentication.dll Windows.Networking.Proximity.dll Windows.Networking.ServiceDiscovery.Dnssd.dll Windows.Networking.Sockets.PushEnabledApplication.dll Windows.Networking.UX.EapRequestHandler.dll Windows.Networking.Vpn.dll Windows.Networking.XboxLive.ProxyStub.dll Windows.Payments.dll Windows.Perception.Stub.dll Wind ows.Security.Authentication.Identity.Provider.dll Windows.Security.Authentication.OnlineId.dll Windows.Security.Authentication.Web.Core.dll Windows.Security.Credentials.UI.CredentialPicker.dll Windows.Security.Credentials.UI.UserConsentVerifier.dll Windows.Security.I ntegrity.dll Windows.Services.TargetedContent.dll Windows.SharedPC.AccountManager.dll Windows.SharedPC.CredentialProvider.dll Windows.Shell.BlueLightReduction.dll Windows.Shell.ServiceHostBuilder.dll Windows.Shell.StartLayoutPopulationEvents.dll Windows.StateReposito ry.dll Windows.StateRepositoryBroker.dll Windows.StateRepositoryClient.dll Windows.StateRepositoryCore.dll Windows.StateRepositoryPS.dll Windows.StateRepositoryUpgrade.dll Windows.Storage.ApplicationData.dll Windows.Storage.Compression.dll windows.storage.dll Windows .Storage.OneCore.dll Windows.Storage.Search.dll 
Windows.System.Diagnostics.dll Windows.System.Diagnostics.Telemetry.PlatformTelemetryClient.dll Windows.System.Diagnostics.TraceReporting.PlatformDiagnosticActions.dll Windows.System.Launcher.dll Windows.System.Profile. HardwareId.dll Windows.System.Profile.PlatformDiagnosticsAndUsageDataSettings.dll Windows.System.Profile.RetailInfo.dll Windows.System.Profile.SystemId.dll Windows.System.Profile.SystemManufacturers.dll Windows.System.RemoteDesktop.dll Windows.System.SystemManagement .dll Windows.System.UserDeviceAssociation.dll Windows.System.UserProfile.DiagnosticsSettings.dll Windows.UI.Accessibility.dll Windows.UI.AppDefaults.dll Windows.UI.BioFeedback.dll Windows.UI.BlockedShutdown.dll Windows.UI.Core.TextInput.dll Windows.UI.Cred.dll Window s.UI.CredDialogController.dll Windows.UI.dll Windows.UI.FileExplorer.dll Windows.UI.Immersive.dll Windows.UI.Input.Inking.Analysis.dll Windows.UI.Input.Inking.dll Windows.UI.Internal.Input.ExpressiveInput.dll Windows.UI.Internal.Input.ExpressiveInput.Resource.dll Win dows.UI.Logon.dll Windows.UI.NetworkUXController.dll Windows.UI.PicturePassword.dll Windows.UI.Search.dll Windows.UI.Shell.dll Windows.UI.Shell.Internal.AdaptiveCards.dll Windows.UI.Storage.dll Windows.UI.Xaml.Controls.dll Windows.UI.Xaml.dll Windows.UI.Xaml.InkContr ols.dll Windows.UI.Xaml.Maps.dll Windows.UI.Xaml.Phone.dll Windows.UI.Xaml.Resources.19h1.dll Windows.UI.Xaml.Resources.Common.dll Windows.UI.Xaml.Resources.rs1.dll Windows.UI.Xaml.Resources.rs2.dll Windows.UI.Xaml.Resources.rs3.dll Windows.UI.Xaml.Resources.rs4.dll Windows.UI.Xaml.Resources.rs5.dll Windows.UI.Xaml.Resources.th.dll Windows.UI.Xaml.Resources.win81.dll Windows.UI.Xaml.Resources.win8rtm.dll Windows.UI.XamlHost.dll Windows.WARP.JITService.dll Windows.WARP.JITService.exe Windows.Web.Diagnostics.dll Windows.Web.dll Wi ndows.Web.Http.dll WindowsActionDialog.exe WindowsCodecs.dll WindowsCodecsExt.dll WindowsCodecsRaw.dll WindowsCodecsRaw.txt WindowsDefaultHeatProcessor.dll windowsdefenderapplicationguardcsp.dll WindowsInternal.ComposableShell.ComposerFramework.dll WindowsInternal.Co mposableShell.DesktopHosting.dll WindowsInternal.Shell.CompUiActivation.dll WindowsIoTCsp.dll windowslivelogin.dll WindowsManagementServiceWinRt.ProxyStub.dll windowsperformancerecordercontrol.dll WindowsPowerShell WindowsSecurityIcon.png windowsudk.shellcommon.dll W indowsUpdateElevatedInstaller.exe winethc.dll winevt WinFax.dll winhttp.dll winhttpcom.dll WinHvEmulation.dll WinHvPlatform.dll wininet.dll wininetlui.dll wininit.exe wininitext.dll winipcfile.dll winipcsecproc.dll winipsec.dll winjson.dll Winlangdb.dll winload.efi w inload.exe winlogon.exe winlogonext.dll winmde.dll WinMetadata winml.dll winmm.dll winmmbase.dll winmsipc.dll WinMsoIrmProtector.dll winnlsres.dll winnsi.dll WinOpcIrmProtector.dll WinREAgent.dll winresume.efi winresume.exe winrm winrm.cmd winrm.vbs winrnr.dll winrs. exe winrscmd.dll winrshost.exe winrsmgr.dll winrssrv.dll WinRTNetMUAHostServer.exe WinRtTracing.dll WinSAT.exe WinSATAPI.dll WinSCard.dll WinSetupUI.dll winshfhc.dll winsku.dll winsockhc.dll winspool.drv winsqlite3.dll WINSRPC.DLL winsrv.dll winsrvext.dll winsta.dll WinSync.dll WinSyncMetastore.dll WinSyncProviders.dll wintrust.dll WinTypes.dll winusb.dll winver.exe WiredNetworkCSP.dll wisp.dll witnesswmiv2provider.dll wkscli.dll wkspbroker.exe wkspbrokerAx.dll wksprt.exe wksprtPS.dll wkssvc.dll wlanapi.dll wlancfg.dll WLanConn. 
dll wlandlg.dll wlanext.exe wlangpui.dll WLanHC.dll wlanhlp.dll WlanMediaManager.dll WlanMM.dll wlanmsm.dll wlanpref.dll WlanRadioManager.dll wlansec.dll wlansvc.dll wlansvcpal.dll wlanui.dll wlanutil.dll Wldap32.dll wldp.dll wlgpclnt.dll wlidcli.dll wlidcredprov.dll wlidfdp.dll wlidnsp.dll wlidprov.dll wlidres.dll wlidsvc.dll wlrmdr.exe WMADMOD.DLL WMADMOE.DLL WMALFXGFXDSP.dll WMASF.DLL wmcodecdspps.dll wmdmlog.dll wmdmps.dll wmdrmsdk.dll wmerror.dll wmi.dll wmiclnt.dll wmicmiplugin.dll wmidcom.dll wmidx.dll WmiMgmt.msc wmiprop .dll wmitomi.dll WMNetMgr.dll wmp.dll WMPDMC.exe WmpDui.dll wmpdxm.dll wmpeffects.dll WMPhoto.dll wmploc.DLL wmpps.dll wmpshell.dll wmsgapi.dll WMSPDMOD.DLL WMSPDMOE.DLL WMVCORE.DLL WMVDECOD.DLL wmvdspa.dll WMVENCOD.DLL WMVSDECD.DLL WMVSENCD.DLL WMVXENCD.DLL WofTasks .dll WofUtil.dll WordBreakers.dll WorkFolders.exe WorkfoldersControl.dll WorkFoldersGPExt.dll WorkFoldersRes.dll WorkFoldersShell.dll workfolderssvc.dll wosc.dll wow64.dll wow64cpu.dll wow64win.dll wowreg32.exe WpAXHolder.dll wpbcreds.dll Wpc.dll WpcApi.dll wpcatltoa st.png WpcDesktopMonSvc.dll WpcMon.exe wpcmon.png WpcProxyStubs.dll WpcRefreshTask.dll WpcTok.exe WpcWebFilter.dll wpdbusenum.dll WpdMtp.dll WpdMtpUS.dll wpdshext.dll WPDShextAutoplay.exe WPDShServiceObj.dll WPDSp.dll wpd_ci.dll wpnapps.dll wpnclient.dll wpncore.dll wpninprc.dll wpnpinst.exe wpnprv.dll wpnservice.dll wpnsruprov.dll WpnUserService.dll WpPortingLibrary.dll WppRecorderUM.dll wpr.config.xml wpr.exe WPTaskScheduler.dll wpx.dll write.exe ws2help.dll ws2_32.dll wscadminui.exe wscapi.dll wscinterop.dll wscisvif.dll WSCl ient.dll WSCollect.exe wscproxystub.dll wscript.exe wscsvc.dll wscui.cpl WSDApi.dll wsdchngr.dll WSDPrintProxy.DLL WsdProviderUtil.dll WSDScanProxy.dll wsecedit.dll wsepno.dll wshbth.dll wshcon.dll wshelper.dll wshext.dll wshhyperv.dll wship6.dll wshom.ocx wshqos.dll wshrm.dll WSHTCPIP.DLL wshunix.dll wsl.exe wslapi.dll WsmAgent.dll wsmanconfig_schema.xml WSManHTTPConfig.exe WSManMigrationPlugin.dll WsmAuto.dll wsmplpxy.dll wsmprovhost.exe WsmPty.xsl WsmRes.dll WsmSvc.dll WsmTxt.xsl WsmWmiPl.dll wsnmp32.dll wsock32.dll wsplib.dl l wsp_fs.dll wsp_health.dll wsp_sr.dll wsqmcons.exe WSReset.exe WSTPager.ax wtsapi32.dll wuapi.dll wuapihost.exe wuauclt.exe wuaueng.dll wuceffects.dll WUDFCoinstaller.dll WUDFCompanionHost.exe WUDFHost.exe WUDFPlatform.dll WudfSMCClassExt.dll WUDFx.dll WUDFx02000.dl l wudriver.dll wups.dll wups2.dll wusa.exe wuuhext.dll wuuhosdeployment.dll wvc.dll WwaApi.dll WwaExt.dll WWAHost.exe WWanAPI.dll wwancfg.dll wwanconn.dll WWanHC.dll wwanmm.dll Wwanpref.dll wwanprotdim.dll WwanRadioManager.dll wwansvc.dll wwapi.dll XamlTileRender.dll XAudio2_8.dll XAudio2_9.dll XblAuthManager.dll XblAuthManagerProxy.dll XblAuthTokenBrokerExt.dll XblGameSave.dll XblGameSaveExt.dll XblGameSaveProxy.dll XblGameSaveTask.exe XboxGipRadioManager.dll xboxgipsvc.dll xboxgipsynthetic.dll XboxNetApiSvc.dll xcopy.exe XInput1_4.dll XInput9_1_0.dll XInputUap.dll xmlfilter.dll xmllite.dll xmlprovi.dll xolehlp.dll XpsDocumentTargetPrint.dll XpsGdiConverter.dll XpsPrint.dll xpspushlayer.dll XpsRasterService.dll xpsservices.dll XpsToPclmConverter.dll XpsToPwgrConverter.dll xwizard.dtd xwizard.exe xwizards.dll xwreg.dll xwtpdui.dll xwtpw32.dll X_80.contrast-black.png X_80.contrast-white.png X_80.png ze_loader.dll ze_tracing_layer.dll ze_validation_layer.dll zh-CN zh-TW zipcontainer.dll zipfldr.dll ztrace_maps.dll /cygdrive/c/Windows: addins AhnInst.log appcompat Application Data apppatch AppReadiness assembly bcastdvr bfsvc.exe 
BitLockerDiscoveryVolumeContents Boot bootstat.dat Branding CbsTemp Containers CSC Cursors debug diagnostics DiagTrack DigitalLocker Downloaded Program Files DtcInstall.log ELAMBKUP en-US explorer.exe Fonts GameBarPresenceWriter gethelp_audiotroubleshooter_latestpackage.zip Globalization Help HelpPane.exe hh.exe hipiw.dll IdentityCRL ImageSAFERSvc.exe IME IMGSF50Svc.exe ImmersiveControlPanel INF InputMethod Installer ko-KR L2Schemas LanguageOverlayCache LiveKernelReports Logs lsasetup.log Media mib.bin Microsoft.NET Migration ModemLogs notepad.exe OCR Offline Web Pages Panther Performance PFRO.log PLA PolicyDefinitions Prefetch PrintDialog Professional.xml Provisioning regedit.exe Registration RemotePackages rescache Resources RtlExUpd.dll SchCache schemas security ServiceProfiles ServiceState servicing Setup setupact.log setuperr.log ShellComponents ShellExperiences SHELLNEW SKB SoftwareDistribution Speech Speech_OneCore splwow64. exe System system.ini System32 SystemApps SystemResources SystemTemp SysWOW64 TAPI Tasks Temp TempInst tracing twain_32 twain_32.dll Vss WaaS Web win.ini WindowsShell.Manifest WindowsUpdate.log winhlp32.exe WinSxS WMSysPr9.prx write.exe /cygdrive/c/Windows/System32/Wbem: aeinv.mof AgentWmi.mof AgentWmiUninstall.mof appbackgroundtask.dll appbackgroundtask.mof appbackgroundtask_uninstall.mof AuditRsop.mof authfwcfg.mof AutoRecover bcd.mof BthMtpEnum.mof cimdmtf.mof cimwin32.dll cimwin32.mof CIWm i.mof classlog.mof cli.mof cliegaliases.mof ddp.mof dimsjob.mof dimsroam.mof DMWmiBridgeProv.dll DMWmiBridgeProv.mof DMWmiBridgeProv1.dll DMWmiBridgeProv1.mof DMWmiBridgeProv1_Uninstall.mof DMWmiBridgeProv_Uninstall.mof dnsclientcim.dll dnsclientcim.mof dnsclientpspr ovider.dll dnsclientpsprovider.mof dnsclientpsprovider_Uninstall.mof drvinst.mof DscCore.mof DscCoreConfProv.mof dscproxy.mof Dscpspluginwkr.dll DscTimer.mof dsprov.dll dsprov.mof eaimeapi.mof EmbeddedLockdownWmi.dll embeddedlockdownwmi.mof embeddedlockdownwmi_Uninst all.mof en en-US esscli.dll EventTracingManagement.dll EventTracingManagement.mof fastprox.dll fdPHost.mof fdrespub.mof fdSSDP.mof fdWNet.mof fdWSD.mof filetrace.mof firewallapi.mof FolderRedirectionWMIProvider.mof FunDisc.mof fwcfg.mof hbaapi.mof hnetcfg.mof IMAPIv2 -Base.mof IMAPIv2-FileSystemSupport.mof IMAPIv2-LegacyShim.mof interop.mof IpmiDTrc.mof ipmiprr.dll ipmiprv.dll ipmiprv.mof IpmiPTrc.mof ipsecsvc.mof iscsidsc.mof iscsihba.mof iscsiprf.mof iscsirem.mof iscsiwmiv2.mof iscsiwmiv2_uninstall.mof kerberos.mof ko ko-KR Krn lProv.dll krnlprov.mof L2SecHC.mof lltdio.mof lltdsvc.mof Logs lsasrv.mof mblctr.mof MDMAppProv.dll MDMAppProv.mof MDMAppProv_Uninstall.mof MDMSettingsProv.dll MDMSettingsProv.mof MDMSettingsProv_Uninstall.mof Microsoft-Windows-OfflineFiles.mof Microsoft-Windows-Remo te-FileSystem.mof Microsoft.AppV.AppVClientWmi.dll Microsoft.AppV.AppVClientWmi.mof Microsoft.Uev.AgentWmi.dll Microsoft.Uev.ManagedAgentWmi.mof Microsoft.Uev.ManagedAgentWmiUninstall.mof mispace.mof mispace_uninstall.mof mmc.mof MMFUtil.dll MOF mofcomp.exe mofd.dll mofinstall.dll mountmgr.mof mpeval.mof mpsdrv.mof mpssvc.mof msdtcwmi.dll MsDtcWmi.mof msfeeds.mof msfeedsbs.mof msi.mof msiprov.dll msiscsi.mof MsNetImPlatform.mof mstsc.mof mstscax.mof msv1_0.mof mswmdm.mof NCProv.dll ncprov.mof ncsi.mof ndisimplatcim.dll ndistrace .mof NetAdapterCim.dll NetAdapterCim.mof NetAdapterCimTrace.mof NetAdapterCimTraceUninstall.mof NetAdapterCim_uninstall.mof netdacim.dll netdacim.mof netdacim_uninstall.mof NetEventPacketCapture.dll NetEventPacketCapture.mof 
NetEventPacketCapture_uninstall.mof netncc im.dll netnccim.mof netnccim_uninstall.mof NetPeerDistCim.dll NetPeerDistCim.mof NetPeerDistCim_uninstall.mof netprofm.mof NetSwitchTeam.mof netswitchteamcim.dll NetTCPIP.dll NetTCPIP.mof NetTCPIP_Uninstall.mof netttcim.dll netttcim.mof netttcim_uninstall.mof network itemfactory.mof newdev.mof nlasvc.mof nlmcim.dll nlmcim.mof nlmcim_uninstall.mof nlsvc.mof npivwmi.mof nshipsec.mof ntevt.dll ntevt.mof ntfs.mof OfflineFilesConfigurationWmiProvider.mof OfflineFilesConfigurationWmiProvider_Uninstall.mof OfflineFilesWmiProvider.mof Of flineFilesWmiProvider_Uninstall.mof p2p-mesh.mof p2p-pnrp.mof pcsvDevice.mof pcsvDevice_Uninstall.mof Performance PNPXAssoc.mof PolicMan.dll PolicMan.mof polproc.mof polprocl.mof polprou.mof polstore.mof portabledeviceapi.mof portabledeviceclassextension.mof portable deviceconnectapi.mof portabledevicetypes.mof portabledevicewiacompat.mof powermeterprovider.mof PowerPolicyProvider.mof ppcRsopCompSchema.mof ppcRsopUserSchema.mof PrintFilterPipelineSvc.mof PrintManagementProvider.dll PrintManagementProvider.mof PrintManagementProvider_Uninstall.mof profileassociationprovider.mof PS_MMAgent.mof qmgr.mof qoswmi.dll qoswmi.mof qoswmitrc.mof qoswmitrc_uninstall.mof qoswmi_uninstall.mof RacWmiProv.dll RacWmiProv.mof rawxml.xsl rdpendp.mof rdpinit.mof rdpshell.mof refs.mof refsv1.mof regevent.mof Remove.Microsoft.AppV.AppvClientWmi.mof repdrvfs.dll Repository rsop.mof rspndr.mof samsrv.mof scersop.mof schannel.mof schedprov.dll SchedProv.mof scm.mof scrcons.exe scrcons.mof sdbus.mof secrcw32.mof SensorsClassExtension.mof ServDeps.dll ServiceModel.mof ServiceModel.mof.uninstall ServiceModel35.mof ServiceModel35.mof.uninstall services.mof setupapi.mof SmbWitnessWmiv2Provider.mof smbwmiv2.mof SMTPCons.dll smtpcons.mof sppwmi.mof sr.mof sstpsvc.mof stdprov.dll storagewmi.mof storagewmi_passthru.mof storagewmi_passthru_uninstall.mof storagewmi_uninstall.mof stortrace.mof subscrpt.mof system.mof tcpip.mof texttable.xsl textvaluelist.xsl tmf tsallow.mof tscfgwmi.mof tsmf.mof tspkg.mof umb.mof umbus.mof umpass.mof umpnpmgr.mof unsecapp.exe UserProfileConfigurationWmiProvider.mof UserProfileWmiProvider.mof UserStateWMIProvider.mof vds.mof vdswmi.dll viewprov.dll vpnclientpsprovider.dll vpnclientpsprovider.mof vpnclientpsprovider_Uninstall.mof vss.mof vsswmi.dll wbemcntl.dll wbemcons.dll WBEMCons.mof wbemcore.dll wbemdisp.dll wbemdisp.tlb wbemess.dll wbemprox.dll wbemsvc.dll wbemtest.exe wcncsvc.mof WdacEtwProv.mof WdacWmiProv.dll WdacWmiProv.mof WdacWmiProv_Uninstall.mof Wdf01000.mof Wdf01000Uninstall.mof wdigest.mof WFAPIGP.mof wfascim.dll wfascim.mof wfascim_uninstall.mof WFP.MOF wfs.mof whqlprov.mof Win32_DeviceGuard.mof Win32_EncryptableVolume.dll win32_encryptablevolume.mof Win32_EncryptableVolumeUninstall.mof win32_printer.mof Win32_Tpm.dll Win32_Tpm.mof wininit.mof winipsec.mof winlogon.mof WinMgmt.exe WinMgmtR.dll Winsat.mof WinsatUninstall.mof wlan.mof WLanHC.mof wmi.mof WMIADAP.exe WmiApRes.dll WmiApRpl.dll WmiApSrv.exe WMIC.exe WMICOOKR.dll WmiDcPrv.dll wmipcima.dll wmipcima.mof wmipdfs.dll wmipdfs.mof wmipdskq.dll wmipdskq.mof WmiPerfClass.dll WmiPerfClass.mof WmiPerfInst.dll WmiPerfInst.mof WMIPICMP.dll wmipicmp.mof WMIPIPRT.dll wmipiprt.mof WMIPJOBJ.dll wmipjobj.mof wmiprov.dll WmiPrvSD.dll WmiPrvSE.exe WMIPSESS.dll wmipsess.mof WMIsvc.dll wmitimep.dll wmitimep.mof wmiutils.dll WMI_Tracing.mof wmp.mof wmpnetwk.mof wpdbusenum.mof wpdcomp.mof wpdfs.mof wpdmtp.mof wpdshext.mof WPDShServiceObj.mof wpdsp.mof wpd_ci.mof 
wscenter.mof WsmAgent.mof WsmAgentUninstall.mof WsmAuto.mof wsp_fs.mof wsp_fs_uninstall.mof wsp_health.mof wsp_health_uninstall.mof wsp_sr.mof wsp_sr_uninstall.mof WUDFx.mof Wudfx02000.mof Wudfx02000Uninstall.mof WUDFxUninstall.mof xml xsl-mappings.xml xwizards.mof /cygdrive/c/Windows/System32/WindowsPowerShell/v1.0: Certificate.format.ps1xml Diagnostics.Format.ps1xml DotNetTypes.format.ps1xml en en-US Event.Format.ps1xml Examples FileSystem.format.ps1xml getevent.types.ps1xml Help.format.ps1xml HelpV3.format.ps1xml ko ko-KR Modules powershell.exe powershell.exe.config PowerShellCore.format.ps1xml PowerShellTrace.format.ps1xml powershell_ise.exe powershell_ise.exe.config PSEvents.dll pspluginwkr.dll pwrshmsg.dll pwrshsip.dll Registry.format.ps1xml Schemas SessionConfig types.ps1xml typesv3.ps1xml WSMan.Format.ps1xml /cygdrive/c/Windows/System32/OpenSSH: scp.exe sftp.exe ssh-add.exe ssh-agent.exe ssh-keygen.exe ssh-keyscan.exe ssh.exe /cygdrive/c/Program Files/MATLAB/R2020b/bin: crash_analyzer.cfg icutzdata lcdata.xml lcdata.xsd lcdata_utf8.xml m3iregistry matlab.exe mex.bat mexext.bat util win32 win64 /cygdrive/c/Program Files/Microsoft SQL Server/130/Tools/Binn: Resources SqlLocalDB.exe /cygdrive/c/Program Files/Microsoft SQL Server/Client SDK/ODBC/170/Tools/Binn: batchparser.dll bcp.exe Resources SQLCMD.EXE xmlrw.dll /cygdrive/c/Program Files/Git/cmd: git-gui.exe git-lfs.exe git.exe gitk.exe start-ssh-agent.cmd start-ssh-pageant.cmd Warning accessing /cygdrive/c/msys64/mingw64/bin gives errors: [Errno 2] No such file or directory: '/cygdrive/c/msys64/mingw64/bin' Warning accessing /cygdrive/c/msys64/usr/bin gives errors: [Errno 2] No such file or directory: '/cygdrive/c/msys64/usr/bin' /cygdrive/c/Program Files (x86)/Microsoft Visual Studio/2019/Community/VC/Tools/MSVC/14.29.30133/bin/Hostx64/x64: 1033 asan_blacklist.txt atlprov.dll bscmake.exe c1.dll c1xx.dll c2.dll cfgpersist.dll cl.exe cl.exe.config clang_rt.asan_dbg_dynamic-x86_64.dll clang_rt.asan_dynamic-x86_64.dll ConcurrencyCheck.dll CppBuildInsights.dll CppBuildInsightsEtw.xml CppCoreCheck.dll cvtres.exe d3dcompiler_47.dll dpcmi.dll dumpbin.exe editbin.exe EnumIndex.dll EspXEngine.dll HResultCheck.dll KernelTraceControl.dll lib.exe link.exe link.exe.config llvm-symbolizer.exe LocalESPC.dll Microsoft.Diagnostics.Tracing.EventSource.dll Microsoft.VisualStudio.RemoteControl.dll Microsoft.VisualStudio.Telemetry.dll Microsoft.VisualStudio.Utilities.Internal.dll ml64.exe msobj140.dll mspdb140.dll mspdbcmf.exe mspdbcore.dll mspdbsrv.exe mspdbst.dll mspft140.dll msvcdis140.dll msvcp140.dll msvcp140_1.dll msvcp140_2.dll msvcp140_atomic_wait.dll msvcp140_codecvt_ids.dll Newtonsoft.Json.dll nmake.exe onecore perf_msvcbuildinsights.dll pgocvt.exe pgodb140.dll pgodriver.sys pgomgr.exe pgort140.dll pgosweep.exe System.Runtime.CompilerServices.Unsafe.dll tbbmalloc.dll undname.exe VariantClear.dll vcmeta.dll vcperf.exe vcruntime140.dll vcruntime140_1.dll vctip.exe xdcmake.exe xdcmake.exe.config /cygdrive/c/Program Files/dotnet: dotnet.exe host LICENSE.txt packs sdk shared templates ThirdPartyNotices.txt /: bin Cygwin-Terminal.ico Cygwin.bat Cygwin.ico dev etc home lib mpich-4.0.2 mpich-4.0.2.tar.gz sbin tmp usr var proc cygdrive /cygdrive/c/Users/SEJONG/AppData/Local/Microsoft/WindowsApps: Backup GameBarElevatedFT_Alias.exe Microsoft.DesktopAppInstaller_8wekyb3d8bbwe Microsoft.MicrosoftEdge_8wekyb3d8bbwe Microsoft.SkypeApp_kzf8qxf38zg5c Microsoft.XboxGamingOverlay_8wekyb3d8bbwe MicrosoftEdge.exe python.exe python3.exe Skype.exe winget.exe 
/cygdrive/c/Users/SEJONG/AppData/Local/Programs/Microsoft VS Code/bin: code code.cmd /cygdrive/c/Program Files (x86)/Microsoft Visual Studio/2019/Community/VC/Tools/MSVC/14.29.30133/bin/Hostx64/x64: 1033 asan_blacklist.txt atlprov.dll bscmake.exe c1.dll c1xx.dll c2.dll cfgpersist.dll cl.exe cl.exe.config clang_rt.asan_dbg_dynamic-x86_64.dll clang_rt.asan_dynamic-x86_64.dll ConcurrencyCheck.dll CppBuildInsights.dll CppBuildInsightsEtw.xml CppCoreCheck.dll cvtres.exe d3dcompiler_47.dll dpcmi.dll dumpbin.exe editbin.exe EnumIndex.dll EspXEngine.dll HResultCheck.dll KernelTraceControl.dll lib.exe link.exe link.exe.config llvm-symbolizer.exe LocalESPC.dll Microsoft.Diagnostics.Tracing.EventSource.dll Microsoft.VisualStudio.RemoteControl.dll Microsoft.VisualStudio.Telemetry.dll Microsoft.VisualStudio.Utilities.Internal.dll ml64.exe msobj140.dll mspdb140.dll mspdbcmf.exe mspdbcore.dll mspdbsrv.exe mspdbst.dll mspft140.dll msvcdis140.dll msvcp140.dll msvcp140_1.dll msvcp140_2.dll msvcp140_atomic_wait.dll msvcp140_codecvt_ids.dll Newtonsoft.Json.dll nmake.exe onecore perf_msvcbuildinsights.dll pgocvt.exe pgodb140.dll pgodriver.sys pgomgr.exe pgort140.dll pgosweep.exe System.Runtime.CompilerServices.Unsafe.dll tbbmalloc.dll undname.exe VariantClear.dll vcmeta.dll vcperf.exe vcruntime140.dll vcruntime140_1.dll vctip.exe xdcmake.exe xdcmake.exe.config Warning accessing /cygdrive/c/Users/SEJONG/.dotnet/tools gives errors: [Errno 2] No such file or directory: '/cygdrive/c/Users/SEJONG/.dotnet/tools' /usr/lib/lapack: cygblas-0.dll cyglapack-0.dll ============================================================================================= TESTING: configureExternalPackagesDir from config.framework(/home/SEJONG/petsc-3.18.1/config/BuildSystem/config/framework.py:1045) Set alternative directory external packages are built in serialEvaluation: initial cxxDialectRanges ('c++11', 'c++17') serialEvaluation: new cxxDialectRanges ('c++11', 'c++17') child config.utilities.macosFirewall took 0.000005 seconds ============================================================================================= TESTING: configureDebuggers from config.utilities.debuggers(/home/SEJONG/petsc-3.18.1/config/BuildSystem/config/utilities/debuggers.py:20) Find a default debugger and determine its arguments Checking for program /usr/local/bin/gdb...not found Checking for program /usr/bin/gdb...not found Checking for program /cygdrive/c/SIMULIA/Commands/gdb...not found Checking for program /cygdrive/c/Program Files/Microsoft MPI/Bin/gdb...not found Checking for program /cygdrive/c/Windows/system32/gdb...not found Checking for program /cygdrive/c/Windows/gdb...not found Checking for program /cygdrive/c/Windows/System32/Wbem/gdb...not found Checking for program /cygdrive/c/Windows/System32/WindowsPowerShell/v1.0/gdb...not found Checking for program /cygdrive/c/Windows/System32/OpenSSH/gdb...not found Checking for program /cygdrive/c/Program Files/MATLAB/R2020b/bin/gdb...not found Checking for program /cygdrive/c/Program Files/Microsoft SQL Server/130/Tools/Binn/gdb...not found Checking for program /cygdrive/c/Program Files/Microsoft SQL Server/Client SDK/ODBC/170/Tools/Binn/gdb...not found Checking for program /cygdrive/c/Program Files/Git/cmd/gdb...not found Checking for program /cygdrive/c/msys64/mingw64/bin/gdb...not found Checking for program /cygdrive/c/msys64/usr/bin/gdb...not found Checking for program /cygdrive/c/Program Files (x86)/Microsoft Visual Studio/2019/Community/VC/Tools/MSVC/14.29.30133/bin/Hostx64/x64/gdb...not found 
Checking for program /cygdrive/c/Program Files/dotnet/gdb...not found Checking for program /gdb...not found Checking for program /cygdrive/c/Users/SEJONG/AppData/Local/Microsoft/WindowsApps/gdb...not found Checking for program /cygdrive/c/Users/SEJONG/AppData/Local/Programs/Microsoft VS Code/bin/gdb...not found Checking for program /cygdrive/c/Program Files (x86)/Microsoft Visual Studio/2019/Community/VC/Tools/MSVC/14.29.30133/bin/Hostx64/x64/gdb...not found Checking for program /cygdrive/c/Users/SEJONG/.dotnet/tools/gdb...not found Checking for program /usr/lib/lapack/gdb...not found Checking for program /usr/local/bin/dbx...not found Checking for program /usr/bin/dbx...not found Checking for program /cygdrive/c/SIMULIA/Commands/dbx...not found Checking for program /cygdrive/c/Program Files/Microsoft MPI/Bin/dbx...not found Checking for program /cygdrive/c/Windows/system32/dbx...not found Checking for program /cygdrive/c/Windows/dbx...not found Checking for program /cygdrive/c/Windows/System32/Wbem/dbx...not found Checking for program /cygdrive/c/Windows/System32/WindowsPowerShell/v1.0/dbx...not found Checking for program /cygdrive/c/Windows/System32/OpenSSH/dbx...not found Checking for program /cygdrive/c/Program Files/MATLAB/R2020b/bin/dbx...not found Checking for program /cygdrive/c/Program Files/Microsoft SQL Server/130/Tools/Binn/dbx...not found Checking for program /cygdrive/c/Program Files/Microsoft SQL Server/Client SDK/ODBC/170/Tools/Binn/dbx...not found Checking for program /cygdrive/c/Program Files/Git/cmd/dbx...not found Checking for program /cygdrive/c/msys64/mingw64/bin/dbx...not found Checking for program /cygdrive/c/msys64/usr/bin/dbx...not found Checking for program /cygdrive/c/Program Files (x86)/Microsoft Visual Studio/2019/Community/VC/Tools/MSVC/14.29.30133/bin/Hostx64/x64/dbx...not found Checking for program /cygdrive/c/Program Files/dotnet/dbx...not found Checking for program /dbx...not found Checking for program /cygdrive/c/Users/SEJONG/AppData/Local/Microsoft/WindowsApps/dbx...not found Checking for program /cygdrive/c/Users/SEJONG/AppData/Local/Programs/Microsoft VS Code/bin/dbx...not found Checking for program /cygdrive/c/Program Files (x86)/Microsoft Visual Studio/2019/Community/VC/Tools/MSVC/14.29.30133/bin/Hostx64/x64/dbx...not found Checking for program /cygdrive/c/Users/SEJONG/.dotnet/tools/dbx...not found Checking for program /usr/lib/lapack/dbx...not found Defined make macro "DSYMUTIL" to "true" child config.utilities.debuggers took 0.014310 seconds ============================================================================================= TESTING: configureDirectories from PETSc.options.petscdir(/home/SEJONG/petsc-3.18.1/config/PETSc/options/petscdir.py:22) Checks PETSC_DIR and sets if not set PETSC_VERSION_RELEASE of 1 indicates the code is from a release branch or a branch created from a release branch. 
Version Information: #define PETSC_VERSION_RELEASE 1 #define PETSC_VERSION_MAJOR 3 #define PETSC_VERSION_MINOR 18 #define PETSC_VERSION_SUBMINOR 1 #define PETSC_VERSION_DATE "Oct 26, 2022" #define PETSC_VERSION_GIT "v3.18.1" #define PETSC_VERSION_DATE_GIT "2022-10-26 07:57:29 -0500" #define PETSC_VERSION_EQ(MAJOR,MINOR,SUBMINOR) \ #define PETSC_VERSION_ PETSC_VERSION_EQ #define PETSC_VERSION_LT(MAJOR,MINOR,SUBMINOR) \ #define PETSC_VERSION_LE(MAJOR,MINOR,SUBMINOR) \ #define PETSC_VERSION_GT(MAJOR,MINOR,SUBMINOR) \ #define PETSC_VERSION_GE(MAJOR,MINOR,SUBMINOR) \ child PETSc.options.petscdir took 0.015510 seconds ============================================================================================= TESTING: getDatafilespath from PETSc.options.dataFilesPath(/home/SEJONG/petsc-3.18.1/config/PETSc/options/dataFilesPath.py:29) Checks what DATAFILESPATH should be child PETSc.options.dataFilesPath took 0.002462 seconds ============================================================================================= TESTING: configureGit from config.sourceControl(/home/SEJONG/petsc-3.18.1/config/BuildSystem/config/sourceControl.py:24) Find the Git executable Checking for program /usr/local/bin/git...not found Checking for program /usr/bin/git...found Defined make macro "GIT" to "git" Executing: git --version stdout: git version 2.38.1 ============================================================================================= TESTING: configureMercurial from config.sourceControl(/home/SEJONG/petsc-3.18.1/config/BuildSystem/config/sourceControl.py:35) Find the Mercurial executable Checking for program /usr/local/bin/hg...not found Checking for program /usr/bin/hg...not found Checking for program /cygdrive/c/SIMULIA/Commands/hg...not found Checking for program /cygdrive/c/Program Files/Microsoft MPI/Bin/hg...not found Checking for program /cygdrive/c/Windows/system32/hg...not found Checking for program /cygdrive/c/Windows/hg...not found Checking for program /cygdrive/c/Windows/System32/Wbem/hg...not found Checking for program /cygdrive/c/Windows/System32/WindowsPowerShell/v1.0/hg...not found Checking for program /cygdrive/c/Windows/System32/OpenSSH/hg...not found Checking for program /cygdrive/c/Program Files/MATLAB/R2020b/bin/hg...not found Checking for program /cygdrive/c/Program Files/Microsoft SQL Server/130/Tools/Binn/hg...not found Checking for program /cygdrive/c/Program Files/Microsoft SQL Server/Client SDK/ODBC/170/Tools/Binn/hg...not found Checking for program /cygdrive/c/Program Files/Git/cmd/hg...not found Checking for program /cygdrive/c/msys64/mingw64/bin/hg...not found Checking for program /cygdrive/c/msys64/usr/bin/hg...not found Checking for program /cygdrive/c/Program Files (x86)/Microsoft Visual Studio/2019/Community/VC/Tools/MSVC/14.29.30133/bin/Hostx64/x64/hg...not found Checking for program /cygdrive/c/Program Files/dotnet/hg...not found Checking for program /hg...not found Checking for program /cygdrive/c/Users/SEJONG/AppData/Local/Microsoft/WindowsApps/hg...not found Checking for program /cygdrive/c/Users/SEJONG/AppData/Local/Programs/Microsoft VS Code/bin/hg...not found Checking for program /cygdrive/c/Program Files (x86)/Microsoft Visual Studio/2019/Community/VC/Tools/MSVC/14.29.30133/bin/Hostx64/x64/hg...not found Checking for program /cygdrive/c/Users/SEJONG/.dotnet/tools/hg...not found Checking for program /usr/lib/lapack/hg...not found Checking for program /home/SEJONG/petsc-3.18.1/lib/petsc/bin/win32fe/hg...not found child config.sourceControl took 0.121914 seconds 
============================================================================================= TESTING: configureInstallationMethod from PETSc.options.petscclone(/home/SEJONG/petsc-3.18.1/config/PETSc/options/petscclone.py:20) Determine if PETSc was obtained via git or a tarball This is a tarball installation child PETSc.options.petscclone took 0.003125 seconds ============================================================================================= TESTING: setNativeArchitecture from PETSc.options.arch(/home/SEJONG/petsc-3.18.1/config/PETSc/options/arch.py:29) Forms the arch as GNU's configure would form it ============================================================================================= TESTING: configureArchitecture from PETSc.options.arch(/home/SEJONG/petsc-3.18.1/config/PETSc/options/arch.py:42) Checks if PETSC_ARCH is set and sets it if not set No previous hashfile found Setting hashfile: arch-mswin-c-debug/lib/petsc/conf/configure-hash Deleting configure hash file: arch-mswin-c-debug/lib/petsc/conf/configure-hash Unable to delete configure hash file: arch-mswin-c-debug/lib/petsc/conf/configure-hash child PETSc.options.arch took 0.149094 seconds ============================================================================================= TESTING: setInstallDir from PETSc.options.installDir(/home/SEJONG/petsc-3.18.1/config/PETSc/options/installDir.py:31) Set installDir to either prefix or if that is not set to PETSC_DIR/PETSC_ARCH Defined make macro "PREFIXDIR" to "/home/SEJONG/petsc-3.18.1/arch-mswin-c-debug" ============================================================================================= TESTING: saveReconfigure from PETSc.options.installDir(/home/SEJONG/petsc-3.18.1/config/PETSc/options/installDir.py:76) Save the configure options in a script in PETSC_ARCH/lib/petsc/conf so the same configure may be easily re-run ============================================================================================= TESTING: cleanConfDir from PETSc.options.installDir(/home/SEJONG/petsc-3.18.1/config/PETSc/options/installDir.py:68) Remove all the files from configuration directory for this PETSC_ARCH, from --with-clean option ============================================================================================= TESTING: configureInstallDir from PETSc.options.installDir(/home/SEJONG/petsc-3.18.1/config/PETSc/options/installDir.py:52) Makes installDir subdirectories if it does not exist for both prefix install location and PETSc work install location Changed persistence directory to /home/SEJONG/petsc-3.18.1/arch-mswin-c-debug/lib/petsc/conf TESTING: restoreReconfigure from PETSc.options.installDir(/home/SEJONG/petsc-3.18.1/config/PETSc/options/installDir.py:90) If --with-clean was requested but restoring the reconfigure file was requested then restore it child PETSc.options.installDir took 0.006476 seconds ============================================================================================= TESTING: setExternalPackagesDir from PETSc.options.externalpackagesdir(/home/SEJONG/petsc-3.18.1/config/PETSc/options/externalpackagesdir.py:15) Set location where external packages will be downloaded to ============================================================================================= TESTING: cleanExternalpackagesDir from PETSc.options.externalpackagesdir(/home/SEJONG/petsc-3.18.1/config/PETSc/options/externalpackagesdir.py:23) Remove all downloaded external packages, from --with-clean child PETSc.options.externalpackagesdir took 0.000990 seconds 
============================================================================================= TESTING: configureCLanguage from PETSc.options.languages(/home/SEJONG/petsc-3.18.1/config/PETSc/options/languages.py:28) Choose whether to compile the PETSc library using a C or C++ compiler C language is C Defined "CLANGUAGE_C" to "1" Defined make macro "CLANGUAGE" to "C" child PETSc.options.languages took 0.003172 seconds ============================================================================================= TESTING: resetEnvCompilers from config.setCompilers(/home/SEJONG/petsc-3.18.1/config/BuildSystem/config/setCompilers.py:2652) Remove compilers from the shell environment so they do not interfer with testing ============================================================================================= TESTING: checkEnvCompilers from config.setCompilers(/home/SEJONG/petsc-3.18.1/config/BuildSystem/config/setCompilers.py:2669) Set configure compilers from the environment, from -with-environment-variables ============================================================================================= TESTING: checkMPICompilerOverride from config.setCompilers(/home/SEJONG/petsc-3.18.1/config/BuildSystem/config/setCompilers.py:2622) Check if --with-mpi-dir is used along with CC CXX or FC compiler options. This usually prevents mpi compilers from being used - so issue a warning ============================================================================================= TESTING: requireMpiLdPath from config.setCompilers(/home/SEJONG/petsc-3.18.1/config/BuildSystem/config/setCompilers.py:2643) OpenMPI wrappers require LD_LIBRARY_PATH set ============================================================================================= TESTING: checkInitialFlags from config.setCompilers(/home/SEJONG/petsc-3.18.1/config/BuildSystem/config/setCompilers.py:723) Initialize the compiler and linker flags Initialized CFLAGS to Initialized CFLAGS to Initialized LDFLAGS to Initialized CUDAFLAGS to Initialized CUDAFLAGS to Initialized LDFLAGS to Initialized HIPFLAGS to Initialized HIPFLAGS to Initialized LDFLAGS to Initialized SYCLFLAGS to Initialized SYCLFLAGS to Initialized LDFLAGS to Initialized CXXFLAGS to Initialized CXX_CXXFLAGS to Initialized LDFLAGS to Initialized FFLAGS to Initialized FFLAGS to Initialized LDFLAGS to Initialized CPPFLAGS to Initialized FPPFLAGS to Initialized CUDAPPFLAGS to -Wno-deprecated-gpu-targets Initialized CXXPPFLAGS to Initialized HIPPPFLAGS to Initialized SYCLPPFLAGS to Initialized CC_LINKER_FLAGS to [] Initialized CXX_LINKER_FLAGS to [] Initialized FC_LINKER_FLAGS to [] Initialized CUDAC_LINKER_FLAGS to [] Initialized HIPC_LINKER_FLAGS to [] Initialized SYCLC_LINKER_FLAGS to [] TESTING: checkCCompiler from config.setCompilers(/home/SEJONG/petsc-3.18.1/config/BuildSystem/config/setCompilers.py:1341) Locate a functional C compiler Checking for program /usr/local/bin/mpicc...not found Checking for program /usr/bin/mpicc...found Defined make macro "CC" to "mpicc" Executing: mpicc -c -o /tmp/petsc-uqt11yqc/config.setCompilers/conftest.o -I/tmp/petsc-uqt11yqc/config.setCompilers /tmp/petsc-uqt11yqc/config.setCompilers/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" int main() { ; return 0; } Executing: mpicc -c -o /tmp/petsc-uqt11yqc/config.setCompilers/conftest.o -I/tmp/petsc-uqt11yqc/config.setCompilers /tmp/petsc-uqt11yqc/config.setCompilers/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" int main() { ; return 0; 
} Executing: mpicc -o /tmp/petsc-uqt11yqc/config.setCompilers/conftest.exe /tmp/petsc-uqt11yqc/config.setCompilers/conftest.o Possible ERROR while running linker: exit code 1 stderr: /usr/lib/gcc/x86_64-pc-cygwin/11/../../../../x86_64-pc-cygwin/bin/ld: cannot find -lhwloc: No such file or directory /usr/lib/gcc/x86_64-pc-cygwin/11/../../../../x86_64-pc-cygwin/bin/ld: cannot find -levent_core: No such file or directory /usr/lib/gcc/x86_64-pc-cygwin/11/../../../../x86_64-pc-cygwin/bin/ld: cannot find -levent_pthreads: No such file or directory /usr/lib/gcc/x86_64-pc-cygwin/11/../../../../x86_64-pc-cygwin/bin/ld: cannot find -lz: No such file or directory collect2: error: ld returned 1 exit status Linker output before filtering: /usr/lib/gcc/x86_64-pc-cygwin/11/../../../../x86_64-pc-cygwin/bin/ld: cannot find -lhwloc: No such file or directory /usr/lib/gcc/x86_64-pc-cygwin/11/../../../../x86_64-pc-cygwin/bin/ld: cannot find -levent_core: No such file or directory /usr/lib/gcc/x86_64-pc-cygwin/11/../../../../x86_64-pc-cygwin/bin/ld: cannot find -levent_pthreads: No such file or directory /usr/lib/gcc/x86_64-pc-cygwin/11/../../../../x86_64-pc-cygwin/bin/ld: cannot find -lz: No such file or directory collect2: error: ld returned 1 exit status : Linker output after filtering: /usr/lib/gcc/x86_64-pc-cygwin/11/../../../../x86_64-pc-cygwin/bin/ld: cannot find -lhwloc: No such file or directory /usr/lib/gcc/x86_64-pc-cygwin/11/../../../../x86_64-pc-cygwin/bin/ld: cannot find -levent_core: No such file or directory /usr/lib/gcc/x86_64-pc-cygwin/11/../../../../x86_64-pc-cygwin/bin/ld: cannot find -levent_pthreads: No such file or directory /usr/lib/gcc/x86_64-pc-cygwin/11/../../../../x86_64-pc-cygwin/bin/ld: cannot find -lz: No such file or directory collect2: error: ld returned 1 exit status: Error testing C compiler: Cannot compile/link C with mpicc. MPI compiler wrapper mpicc failed to compile Executing: mpicc -show stdout: gcc -L/usr/lib -lmpi -lopen-rte -lopen-pal -lhwloc -levent_core -levent_pthreads -lz MPI compiler wrapper mpicc is likely incorrect. Use --with-mpi-dir to indicate an alternate MPI. Deleting "CC" ******************************************************************************* UNABLE to CONFIGURE with GIVEN OPTIONS (see configure.log for details): ------------------------------------------------------------------------------- C compiler you provided with -with-cc=mpicc cannot be found or does not work. Cannot compile/link C with mpicc. 
******************************************************************************* File "/home/SEJONG/petsc-3.18.1/config/configure.py", line 461, in petsc_configure framework.configure(out = sys.stdout) File "/home/SEJONG/petsc-3.18.1/config/BuildSystem/config/framework.py", line 1412, in configure self.processChildren() File "/home/SEJONG/petsc-3.18.1/config/BuildSystem/config/framework.py", line 1400, in processChildren self.serialEvaluation(self.childGraph) File "/home/SEJONG/petsc-3.18.1/config/BuildSystem/config/framework.py", line 1375, in serialEvaluation child.configure() File "/home/SEJONG/petsc-3.18.1/config/BuildSystem/config/setCompilers.py", line 2712, in configure self.executeTest(self.checkCCompiler) File "/home/SEJONG/petsc-3.18.1/config/BuildSystem/config/base.py", line 138, in executeTest ret = test(*args,**kargs) File "/home/SEJONG/petsc-3.18.1/config/BuildSystem/config/setCompilers.py", line 1346, in checkCCompiler for compiler in self.generateCCompilerGuesses(): File "/home/SEJONG/petsc-3.18.1/config/BuildSystem/config/setCompilers.py", line 1274, in generateCCompilerGuesses raise RuntimeError('C compiler you provided with -with-cc='+self.argDB['with-cc']+' cannot be found or does not work.'+'\n'+self.mesg) ================================================================================ Finishing configure run at Tue, 01 Nov 2022 13:06:09 +0900 -----Original Message----- From: Satish Balay > Sent: Tuesday, November 1, 2022 11:36 AM To: Mohammad Ali Yaqteen > Cc: petsc-users > Subject: RE: [petsc-users] PETSc Windows Installation you'll have to send configure.log for this failure Satish On Tue, 1 Nov 2022, Mohammad Ali Yaqteen wrote: > I have checked the required Cygwin openmpi libraries and they are all installed. When I run ./configure --with-cc=mpicc --with-cxx=mpicxx --with-fc=mpif90, it returns: > > $ ./configure --with-cc=mpicc --with-cxx=mpicxx --with-fc=mpif90 > ============================================================================================= > Configuring PETSc to compile on your system > ====================================================================== > ======================= > TESTING: checkCCompiler from config.setCompilers(config/BuildSystem/config/setCompilers.py:1341)******************************************************************************* > UNABLE to CONFIGURE with GIVEN OPTIONS (see configure.log for details): > ---------------------------------------------------------------------- > --------- C compiler you provided with -with-cc=mpicc cannot be found > or does not work. > Cannot compile/link C with mpicc. > > As for the case of WSL2, I will try to install that on my PC. > Meanwhile, could you please look into this issue > > Thank you > > Ali > > -----Original Message----- > From: Satish Balay > > Sent: Monday, October 31, 2022 10:56 PM > To: Satish Balay via petsc-users > > Cc: Matthew Knepley >; Mohammad Ali Yaqteen > > > Subject: Re: [petsc-users] PETSc Windows Installation > > BTW: If you have WSL2 on windows - it might be easier to build/use PETSc. 
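Coming back to the configure.log above: the link failure reported for mpicc (ld cannot find -lhwloc, -levent_core, -levent_pthreads, -lz) means the Cygwin OpenMPI wrapper references libraries whose Cygwin packages are not installed on this machine. A minimal check, assuming the package names libhwloc-devel, libevent-devel, and zlib-devel (these names are a guess and may differ in a given Cygwin release), could look like:

$ mpicc -show                                    # print the underlying gcc link line the wrapper uses
$ cygcheck -cd | grep -E 'hwloc|libevent|zlib'   # list installed Cygwin packages matching those libraries

If the packages are missing, installing them through the Cygwin setup program, or pointing configure at a different MPI with --with-mpi-dir=<path-to-mpi>, would be the natural next step.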
> > Satish
> >
> > On Mon, 31 Oct 2022, Satish Balay via petsc-users wrote:
> >
> > > Make sure you have cygwin openmpi installed [and cygwin blas/lapack]
> > >
> > > $ cygcheck -cd |grep openmpi
> > > libopenmpi-devel 4.1.2-1
> > > libopenmpi40 4.1.2-1
> > > libopenmpifh40 4.1.2-1
> > > libopenmpiusef08_40 4.1.2-1
> > > libopenmpiusetkr40 4.1.2-1
> > > openmpi 4.1.2-1
> > > $ cygcheck -cd |grep lapack
> > > liblapack-devel 3.10.1-1
> > > liblapack0 3.10.1-1
> > >
> > > > ./configure --with-cc=gcc --with-cxx=0 --with-fc=0 --download-f2cblaslapack
> > >
> > > Should be:
> > >
> > > $ ./configure --download-scalapack --download-mumps
> > >
> > > i.e. [default] --with-cc=mpicc --with-cxx=mpicxx --with-fc=mpif90 [and default cygwin blas/lapack]
> > >
> > > Satish
> > >
> > > On Mon, 31 Oct 2022, Matthew Knepley wrote:
> > >
> > > > On Mon, Oct 31, 2022 at 1:56 AM Mohammad Ali Yaqteen wrote:
> > > >
> > > > > Dear Satish
> > > > >
> > > > > When I configure PETSc with (./configure --with-cc=gcc --with-cxx=0 --with-fc=0 --download-f2cblaslapack) it runs as I shared initially, which you said is not an issue anymore. But when I add (--download-scalapack --download-mumps) or configure with these later, it gives the following error:
> > > > >
> > > > > $ ./configure --download-scalapack --download-mumps
> > > > > =============================================================================================
> > > > >              Configuring PETSc to compile on your system
> > > > > =============================================================================================
> > > > > TESTING: FortranMPICheck from config.packages.MPI(config/BuildSystem/config/packages/MPI.py:614)
> > > > > *******************************************************************************
> > > > >          UNABLE to CONFIGURE with GIVEN OPTIONS (see configure.log for details):
> > > > > -------------------------------------------------------------------------------
> > > > > Fortran error! mpi_init() could not be located!
> > > > > *******************************************************************************
> > > > >
> > > > > What could be the problem here?
> > > >
> > > > Without configure.log we cannot tell what went wrong. However, from the error message, I would guess that your MPI was not built with Fortran bindings. You need these for those packages.
> > > >
> > > > Thanks,
> > > >
> > > > Matt
> > > >
> > > > > Your help is highly appreciated.
> > > > >
> > > > > Thank you
> > > > > Ali
> > > > >
> > > > > -----Original Message-----
> > > > > From: Satish Balay >
> > > > > Sent: Saturday, October 29, 2022 2:11 PM
> > > > > To: Mohammad Ali Yaqteen >
> > > > > Cc: Matthew Knepley >; petsc-users at mcs.anl.gov
> > > > > Subject: Re: [petsc-users] PETSc Windows Installation
> > > > >
> > > > > On Sat, 29 Oct 2022, Mohammad Ali Yaqteen wrote:
> > > > >
> > > > > > I haven't accessed PETSc or given any command of my own. I was just installing by following the instructions. I don't know why it is attaching the debugger. Although it says "Possible error running C/C++ src/snes/tutorials/ex19 with 1 MPI process", which I think indicates that MPI is missing!
> > > > >
> > > > > The diff is not smart enough to detect the extra message from cygwin/OpenMPI - hence it assumes there is a potential problem - and prints the above message.
> > > > >
> > > > > But you can assume it's installed properly - and use it.
> > > > >
> > > > > Satish
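On the "Fortran error! mpi_init() could not be located!" failure quoted above, a quick way to see whether the installed OpenMPI actually provides Fortran support, independent of PETSc, is to compile a trivial Fortran MPI program by hand. This is only a sketch; the file name testmpi.f90 is illustrative, and it assumes the Cygwin gfortran and OpenMPI Fortran packages are what provide mpif90:

$ cat > testmpi.f90 <<'EOF'
program testmpi
  use mpi                      ! Fortran MPI module shipped with OpenMPI
  implicit none
  integer :: ierr
  call MPI_Init(ierr)
  call MPI_Finalize(ierr)
end program testmpi
EOF
$ mpif90 testmpi.f90 -o testmpi   # the link step fails here if the Fortran MPI libraries are absent
$ cygcheck -cd | grep openmpi     # the libopenmpifh40 / libopenmpiusef08_40 packages listed earlier are the Fortran side

If that small program does not compile and link, configure's FortranMPICheck will fail the same way, and the fix is at the Cygwin/OpenMPI level rather than in PETSc.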
> > > > > > From: Matthew Knepley >
> > > > > > Sent: Friday, October 28, 2022 10:31 PM
> > > > > > To: Mohammad Ali Yaqteen >
> > > > > > Cc: petsc-users at mcs.anl.gov
> > > > > > Subject: Re: [petsc-users] PETSc Windows Installation
> > > > > >
> > > > > > On Fri, Oct 28, 2022 at 9:11 AM Mohammad Ali Yaqteen <mhyaqteen at sju.ac.kr>> wrote:
> > > > > >
> > > > > > > Dear Sir,
> > > > > > >
> > > > > > > During the installation of PETSc in Windows, I installed Cygwin and the required libraries as mentioned on your website:
> > > > > > > [cid:image001.png at 01D8EB93.7C17E410]
> > > > > > > However, when I install PETSc using the configure commands present on the petsc website:
> > > > > > >
> > > > > > > ./configure --with-cc=gcc --with-cxx=0 --with-fc=0 --download-f2cblaslapack --download-mpich
> > > > > > >
> > > > > > > it gives me the following error:
> > > > > > >
> > > > > > > [cid:image002.png at 01D8EB93.7C17E410]
> > > > > > >
> > > > > > > I already installed OpenMPI using the Cygwin installer but it still asks me to. When I configure without --download-mpich and run the "make check" command, it gives me the following errors:
> > > > > > >
> > > > > > > [cid:image003.png at 01D8EB93.7C17E410]
> > > > > > >
> > > > > > > Could you kindly look into this and help me with this? Your prompt response will be highly appreciated.
> > > > > >
> > > > > > The runs look fine.
> > > > > >
> > > > > > The test should not try to attach the debugger. Do you have that in the PETSC_OPTIONS env variable?
> > > > > >
> > > > > > Thanks,
> > > > > >
> > > > > > Matt
> > > > > >
> > > > > > > Thank you!
> > > > > > > Mohammad Ali
> > > > > > > Researcher, Sejong University
> > > > > >
> > > > > > --
> > > > > > What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
> > > > > > -- Norbert Wiener
> > > > > >
> > > > > > https://www.cse.buffalo.edu/~knepley/<http://www.cse.buffalo.edu/~knepley/>

--
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/
-------------- next part --------------
An HTML attachment was scrubbed...
URL:
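Regarding Matt's question above about the debugger during make check: if PETSC_OPTIONS is set in the shell environment (for instance to -start_in_debugger from an earlier session), every PETSc example picks it up. A minimal sketch of checking and clearing it before re-running the checks:

$ echo $PETSC_OPTIONS     # show any options inherited from the environment
$ unset PETSC_OPTIONS     # clear them for the current shell
$ make check              # re-run the PETSc checks without inherited options

This is only relevant if the variable is actually set; an empty echo means nothing is being inherited.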