From knepley at gmail.com Tue Nov 1 00:07:35 2022
From: knepley at gmail.com (Matthew Knepley)
Date: Tue, 1 Nov 2022 01:07:35 -0400
Subject: [petsc-users] PETSc Windows Installation
In-Reply-To: 
References: <2db12320-25ab-7911-4bb6-ff0195f5ffdc@mcs.anl.gov>
 <461d2b54-173d-95fa-6ad5-9ce81849871e@mcs.anl.gov>
Message-ID: 

On Tue, Nov 1, 2022 at 12:36 AM Mohammad Ali Yaqteen wrote:

> I installed the libraries, i.e. the Cygwin openmpi package, in its default
> folder. I didn't change anything. Now there is a folder
> C:\cygwin64\lib\openmpi\ which includes a file named
> *"cygompi_dbg_msgq.dll"*.

Can you compile any executable with mpicc?

  Thanks,

     Matt

> Thanks
> Ali
>
> *From:* Matthew Knepley
> *Sent:* Tuesday, November 1, 2022 1:26 PM
> *To:* Mohammad Ali Yaqteen
> *Cc:* petsc-users
> *Subject:* Re: [petsc-users] PETSc Windows Installation
>
> On Tue, Nov 1, 2022 at 12:16 AM Mohammad Ali Yaqteen wrote:
>
> I am unable to attach the configure.log file. Hence, I have copied the
> following text after executing the command (less configure.log) in
> Cygwin64.
>
> You can see at the end of the file that your "mpicc" does not work. The
> link is broken, possibly because you moved directories after you installed
> it.
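[Editor's note: the failure Matt describes (an mpicc wrapper whose link target no longer exists after a directory move) can be checked directly. The sketch below is illustrative only and not from the thread; the function name `diagnose` and the use of Python are the editor's, and the path `/usr/bin/mpicc` in the usage line is the typical Cygwin location, not one confirmed by the log.]

```python
import os

def diagnose(path):
    """Classify a compiler wrapper installed at `path`.

    Returns one of:
      "missing"     - nothing at that path
      "broken link" - a symlink whose target no longer exists
                      (the failure mode configure reports here)
      "present"     - a regular file, or a link that resolves
    """
    if not os.path.lexists(path):   # lexists is True even for dangling links
        return "missing"
    if not os.path.exists(path):    # exists follows links; False means dangling
        return "broken link"
    return "present"

# On the reporter's machine one would call, e.g.:
#   diagnose("/usr/bin/mpicc")
```

As a quicker manual check, Open MPI's wrappers (which the Cygwin openmpi package provides) accept `mpicc --showme`, which prints the underlying compiler command line without compiling anything; if even that fails, the wrapper itself is broken, matching what configure.log reports.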
> > > > Thanks, > > > > Matt > > > > Executing: uname -s > stdout: CYGWIN_NT-10.0-19044 > > ============================================================================================= > Configuring PETSc to compile on your system > > ============================================================================================= > > > ================================================================================ > > ================================================================================ > Starting configure run at Tue, 01 Nov 2022 13:06:06 +0900 > Configure Options: --configModules=PETSc.Configure > --optionsModule=config.compilerOptions --with-cc=mpicc --with-cxx=mpicxx > --with-fc=mpif90 > Working directory: /home/SEJONG/petsc-3.18.1 > Machine platform: > uname_result(system='CYGWIN_NT-10.0-19044', node='DESKTOP-R1C768B', > release='3.3.6-341.x86_64', version='2022-09-05 11:15 UTC', > machine='x86_64') > Python version: > 3.9.10 (main, Jan 20 2022, 21:37:52) > [GCC 11.2.0] > > ================================================================================ > Environmental variables > USERDOMAIN=DESKTOP-R1C768B > OS=Windows_NT > COMMONPROGRAMFILES=C:\Program Files\Common Files > PROCESSOR_LEVEL=6 > PSModulePath=C:\Users\SEJONG\Documents\WindowsPowerShell\Modules;C:\Program > Files\WindowsPowerShell\Modules;C:\Windows\system32\WindowsPowerShell\v1.0\Modules > CommonProgramW6432=C:\Program Files\Common Files > CommonProgramFiles(x86)=C:\Program Files (x86)\Common Files > LANG=en_US.UTF-8 > TZ=Asia/Seoul > HOSTNAME=DESKTOP-R1C768B > PUBLIC=C:\Users\Public > OLDPWD=/home/SEJONG > USERNAME=SEJONG > LOGONSERVER=\\DESKTOP-R1C768B > PROCESSOR_ARCHITECTURE=AMD64 > LOCALAPPDATA=C:\Users\SEJONG\AppData\Local > COMPUTERNAME=DESKTOP-R1C768B > USER=SEJONG > !::=::\ > SYSTEMDRIVE=C: > USERPROFILE=C:\Users\SEJONG > PATHEXT=.COM;.EXE;.BAT;.CMD;.VBS;.VBE;.JS;.JSE;.WSF;.WSH;.MSC;.CPL > SYSTEMROOT=C:\Windows > USERDOMAIN_ROAMINGPROFILE=DESKTOP-R1C768B > 
OneDriveCommercial=C:\Users\SEJONG\OneDrive - Sejong University > PROCESSOR_IDENTIFIER=Intel64 Family 6 Model 165 Stepping 5, GenuineIntel > GNUPLOT_LIB=C:\Program Files\gnuplot\demo;C:\Program > Files\gnuplot\demo\games;C:\Program Files\gnuplot\share > PWD=/home/SEJONG/petsc-3.18.1 > MSMPI_BIN=C:\Program Files\Microsoft MPI\Bin\ > HOME=/home/SEJONG > TMP=/tmp > OneDrive=C:\Users\SEJONG\OneDrive - Sejong University > ZES_ENABLE_SYSMAN=1 > !C:=C:\cygwin64\bin > PROCESSOR_REVISION=a505 > PROFILEREAD=true > PROMPT=$P$G > NUMBER_OF_PROCESSORS=16 > ProgramW6432=C:\Program Files > COMSPEC=C:\Windows\system32\cmd.exe > APPDATA=C:\Users\SEJONG\AppData\Roaming > SHELL=/bin/bash > TERM=xterm-256color > WINDIR=C:\Windows > ProgramData=C:\ProgramData > SHLVL=1 > PRINTER=\\210.107.220.119\HP Color LaserJet Pro MFP M377 PCL 6 > PROGRAMFILES=C:\Program Files > ALLUSERSPROFILE=C:\ProgramData > TEMP=/tmp > DriverData=C:\Windows\System32\Drivers\DriverData > SESSIONNAME=Console > ProgramFiles(x86)=C:\Program Files (x86) > PATH=/usr/local/bin:/usr/bin:/cygdrive/c/SIMULIA/Commands:/cygdrive/c/Program > Files/Microsoft > MPI/Bin:/cygdrive/c/Windows/system32:/cygdrive/c/Windows:/cygdrive/c/Windows/System32/Wbem:/cygdrive/c/Windows/System32/WindowsPowerShell/v1.0:/cygdrive/c/Windows/System32/OpenSSH:/cygdrive/c/Program > Files/MATLAB/R2020b/bin:/cygdrive/c/Program Files/Microsoft SQL > Server/130/Tools/Binn:/cygdrive/c/Program Files/Microsoft SQL Server/Client > SDK/ODBC/170/Tools/Binn:/cygdrive/c/Program > Files/Git/cmd:/cygdrive/c/msys64/mingw64/bin:/cygdrive/c/msys64/usr/bin:/cygdrive/c/Program > Files (x86)/Microsoft Visual > Studio/2019/Community/VC/Tools/MSVC/14.29.30133/bin/Hostx64/x64:/cygdrive/c/Program > Files/dotnet:/:/cygdrive/c/Users/SEJONG/AppData/Local/Microsoft/WindowsApps:/cygdrive/c/Users/SEJONG/AppData/Local/Programs/Microsoft > VS Code/bin:/cygdrive/c/Program Files (x86)/Microsoft Visual > 
Studio/2019/Community/VC/Tools/MSVC/14.29.30133/bin/Hostx64/x64:/cygdrive/c/Users/SEJONG/.dotnet/tools:/usr/lib/lapack > PS1=\[\e]0;\w\a\]\n\[\e[32m\]\u@\h \[\e[33m\]\w\[\e[0m\]\n\$ > HOMEDRIVE=C: > INFOPATH=/usr/local/info:/usr/share/info:/usr/info > HOMEPATH=\Users\SEJONG > ORIGINAL_PATH=/cygdrive/c/SIMULIA/Commands:/cygdrive/c/Program > Files/Microsoft > MPI/Bin:/cygdrive/c/Windows/system32:/cygdrive/c/Windows:/cygdrive/c/Windows/System32/Wbem:/cygdrive/c/Windows/System32/WindowsPowerShell/v1.0:/cygdrive/c/Windows/System32/OpenSSH:/cygdrive/c/Program > Files/MATLAB/R2020b/bin:/cygdrive/c/Program Files/Microsoft SQL > Server/130/Tools/Binn:/cygdrive/c/Program Files/Microsoft SQL Server/Client > SDK/ODBC/170/Tools/Binn:/cygdrive/c/Program > Files/Git/cmd:/cygdrive/c/msys64/mingw64/bin:/cygdrive/c/msys64/usr/bin:/cygdrive/c/Program > Files (x86)/Microsoft Visual > Studio/2019/Community/VC/Tools/MSVC/14.29.30133/bin/Hostx64/x64:/cygdrive/c/Program > Files/dotnet:/:/cygdrive/c/Users/SEJONG/AppData/Local/Microsoft/WindowsApps:/cygdrive/c/Users/SEJONG/AppData/Local/Programs/Microsoft > VS Code/bin:/cygdrive/c/Program Files (x86)/Microsoft Visual > Studio/2019/Community/VC/Tools/MSVC/14.29.30133/bin/Hostx64/x64:/cygdrive/c/Users/SEJONG/.dotnet/tools > EXECIGNORE=*.dll > _=./configure > Files in path provided by default path > /usr/local/bin: > /usr/bin: addftinfo.exe addr2line.exe apropos ar.exe arch.exe as.exe > ash.exe awk b2sum.exe base32.exe base64.exe basename.exe basenc.exe > bash.exe bashbug bomtool.exe bunzip2.exe bzcat.exe bzcmp bzdiff bzegrep > bzfgrep bzgrep bzip2.exe bzip2recover.exe bzless bzmore c++.exe c++filt.exe > c89 c99 ca-legacy cal.exe captoinfo cat.exe catman.exe cc ccmake.exe > chattr.exe chcon.exe chgrp.exe chmod.exe chown.exe chroot.exe chrt.exe > cksum.exe clear.exe cmake.exe cmp.exe col.exe colcrt.exe colrm.exe > column.exe comm.exe cp.exe cpack.exe cpp.exe csplit.exe ctest.exe cut.exe > cygarchive-13.dll cygargp-0.dll cygatomic-1.dll 
cygattr-1.dll > cygblkid-1.dll cygbrotlicommon-1.dll cygbrotlidec-1.dll cygbz2-1.dll > cygcheck.exe cygcom_err-2.dll cygcrypt-2.dll cygcrypto-1.1.dll > cygcurl-4.dll cygdb-5.3.dll cygdb_cxx-5.3.dll cygdb_sql-5.3.dll > cygedit-0.dll cygevent-2-1-7.dll cygevent_core-2-1-7.dll > cygevent_extra-2-1-7.dll cygevent_openssl-2-1-7.dll > cygevent_pthreads-2-1-7.dll cygexpat-1.dll cygfdisk-1.dll cygffi-6.dll > cygfido2-1.dll cygformw-10.dll cyggc-1.dll cyggcc_s-seh-1.dll cyggdbm-6.dll > cyggdbm_compat-4.dll cyggfortran-5.dll cyggmp-10.dll cyggomp-1.dll > cyggsasl-7.dll cyggssapi_krb5-2.dll cygguile-2.2-1.dll cyghistory7.dll > cyghwloc-15.dll cygiconv-2.dll cygidn-12.dll cygidn2-0.dll cygintl-8.dll > cygisl-23.dll cygjsoncpp-25.dll cygk5crypto-3.dll cygkrb5-3.dll > cygkrb5support-0.dll cyglber-2-4-2.dll cyglber-2.dll cygldap-2-4-2.dll > cygldap-2.dll cygldap_r-2-4-2.dll cygltdl-7.dll cyglz4-1.dll cyglzma-5.dll > cyglzo2-2.dll cygmagic-1.dll cygman-2-11-0.dll cygmandb-2-11-0.dll > cygmenuw-10.dll cygmpc-3.dll cygmpfr-6.dll cygmpi-40.dll > cygmpi_mpifh-40.dll cygmpi_usempif08-40.dll cygmpi_usempi_ignore_tkr-40.dll > cygncursesw-10.dll cygnghttp2-14.dll cygntlm-0.dll cygopen-pal-40.dll > cygopen-rte-40.dll cygp11-kit-0.dll cygpanelw-10.dll cygpath.exe > cygpcre2-8-0.dll cygperl5_32.dll cygpipeline-1.dll cygpkgconf-4.dll > cygpopt-0.dll cygpsl-5.dll cygquadmath-0.dll cygreadline7.dll > cygrhash-0.dll cygrunsrv.exe cygsasl2-3.dll cygserver-config > cygsigsegv-2.dll cygsmartcols-1.dll cygsqlite3-0.dll cygssh2-1.dll > cygssl-1.1.dll cygstart.exe cygstdc++-6.dll cygtasn1-6.dll cygticw-10.dll > cygunistring-2.dll cyguuid-1.dll cyguv-1.dll cygwin-console-helper.exe > cygwin1.dll cygxml2-2.dll cygxxhash-0.dll cygz.dll cygzstd-1.dll dash.exe > date.exe dd.exe df.exe diff.exe diff3.exe dir.exe dircolors.exe dirname.exe > dlltool.exe dllwrap.exe dnsdomainname domainname du.exe dumper.exe echo.exe > editrights.exe egrep elfedit.exe env.exe eqn.exe eqn2graph ex expand.exe > expr.exe f95 
factor.exe false.exe fgrep fido2-assert.exe fido2-cred.exe > fido2-token.exe file.exe find.exe flock.exe fmt.exe fold.exe g++.exe > gawk-5.1.1.exe gawk.exe gcc-ar.exe gcc-nm.exe gcc-ranlib.exe gcc.exe > gcov-dump.exe gcov-tool.exe gcov.exe gdiffmk gencat.exe getconf.exe > getent.exe getfacl.exe getopt.exe gfortran.exe git-receive-pack.exe > git-shell.exe git-upload-archive.exe git-upload-pack.exe git.exe gkill.exe > gmondump.exe gprof.exe grap2graph grep.exe grn.exe grodvi.exe groff.exe > grolbp.exe grolj4.exe grops.exe grotty.exe groups.exe gunzip gzexe gzip.exe > head.exe hexdump.exe hostid.exe hostname.exe hpftodit.exe > i686-w64-mingw32-pkg-config id.exe indxbib.exe info.exe infocmp.exe > infotocap install-info.exe install.exe ipcmk.exe ipcrm.exe ipcs.exe > isosize.exe join.exe kill.exe lastlog.exe ld.bfd.exe ld.exe ldd.exe ldh.exe > less.exe lessecho.exe lesskey.exe lexgrog.exe libpython3.9.dll > link-cygin.exe lkbib.exe ln.exe locale.exe locate.exe logger.exe login.exe > logname.exe look.exe lookbib.exe ls.exe lsattr.exe lto-dump.exe lzcat lzcmp > lzdiff lzegrep lzfgrep lzgrep lzless lzma lzmadec.exe lzmainfo.exe lzmore > make-dummy-cert make.exe man-recode.exe man.exe mandb.exe manpath.exe > mcookie.exe md5sum.exe minidumper.exe mintheme mintty.exe mkdir.exe > mkfifo.exe mkgroup.exe mknod.exe mkpasswd.exe mkshortcut.exe mktemp.exe > more.exe mount.exe mpic++ mpicc mpicxx mpiexec mpif77 mpif90 mpifort mpirun > mv.exe namei.exe neqn nice.exe nl.exe nm.exe nohup.exe nproc.exe nroff > numfmt.exe objcopy.exe objdump.exe od.exe ompi-clean ompi-server > ompi_info.exe opal_wrapper.exe openssl.exe orte-clean.exe orte-info.exe > orte-server.exe ortecc orted.exe orterun.exe p11-kit.exe passwd.exe > paste.exe pathchk.exe pdfroff peflags.exe peflagsall perl.exe > perl5.32.1.exe pfbtops.exe pg.exe pic.exe pic2graph pinky.exe pip3 pip3.9 > pkg-config pkgconf.exe pldd.exe post-grohtml.exe pr.exe pre-grohtml.exe > preconv.exe printenv.exe printf.exe profiler.exe ps.exe 
ptx.exe pwd.exe > pydoc3 pydoc3.9 python python3 python3.9.exe pzstd.exe ranlib.exe > readelf.exe readlink.exe readshortcut.exe realpath.exe rebase-trigger > rebase.exe rebaseall rebaselst refer.exe regtool.exe rename.exe > renew-dummy-cert renice.exe reset rev.exe rm.exe rmdir.exe rsync-ssl > rsync.exe run.exe runcon.exe rvi rview scalar.exe scp.exe script.exe > scriptreplay.exe sdiff.exe sed.exe seq.exe setfacl.exe setmetamode.exe > setsid.exe sftp.exe sh.exe sha1sum.exe sha224sum.exe sha256sum.exe > sha384sum.exe sha512sum.exe shred.exe shuf.exe size.exe sleep.exe slogin > soelim.exe sort.exe split.exe ssh-add.exe ssh-agent.exe ssh-copy-id > ssh-host-config ssh-keygen.exe ssh-keyscan.exe ssh-user-config ssh.exe > ssp.exe stat.exe stdbuf.exe strace.exe strings.exe strip.exe stty.exe > sum.exe sync.exe tabs.exe tac.exe tail.exe tar.exe taskset.exe tbl.exe > tee.exe test.exe tfmtodit.exe tic.exe timeout.exe toe.exe touch.exe > tput.exe tr.exe troff.exe true.exe truncate.exe trust.exe tset.exe > tsort.exe tty.exe tzselect tzset.exe ul.exe umount.exe uname.exe > unexpand.exe uniq.exe unlink.exe unlzma unxz unzstd update-ca-trust > update-crypto-policies updatedb users.exe uuidgen.exe uuidparse.exe > vdir.exe vi.exe view wc.exe whatis.exe whereis.exe which.exe who.exe > whoami.exe windmc.exe windres.exe x86_64-pc-cygwin-c++.exe > x86_64-pc-cygwin-g++.exe x86_64-pc-cygwin-gcc-11.exe > x86_64-pc-cygwin-gcc-ar.exe x86_64-pc-cygwin-gcc-nm.exe > x86_64-pc-cygwin-gcc-ranlib.exe x86_64-pc-cygwin-gcc.exe > x86_64-pc-cygwin-gfortran.exe x86_64-pc-cygwin-pkg-config > x86_64-w64-mingw32-pkg-config xargs.exe xmlcatalog.exe xmllint.exe xz.exe > xzcat xzcmp xzdec.exe xzdiff xzegrep xzfgrep xzgrep xzless xzmore yes.exe > zcat zcmp zdiff zdump.exe zegrep zfgrep zforce zgrep zless zmore znew > zstd.exe zstdcat zstdgrep zstdless zstdmt [.exe > /cygdrive/c/SIMULIA/Commands: abaqus.bat abq2018.bat > abq_cae_open.bat abq_odb_open.bat > /cygdrive/c/Program Files/Microsoft MPI/Bin: 
mpiexec.exe > mpitrace.man smpd.exe > provthrd.dll provtool.exe ProximityCommon.dll ProximityCommonPal.dll > ProximityRtapiPal.dll ProximityService.dll ProximityServicePal.dll > ProximityToast ProximityUxHost.exe prproc.exe prvdmofcomp.dll psapi.dll > pscript.sep PSHED.DLL psisdecd.dll psisrndr.ax PSModuleDis > coveryProvider.dll psmodulediscoveryprovider.mof PsmServiceExtHost.dll > psmsrv.dll psr.exe pstask.dll pstorec.dll pt-BR pt-PT ptpprov.dll > puiapi.dll puiobj.dll PushToInstall.dll pwlauncher.dll pwlauncher.exe > pwrshplugin.dll pwsso.dll qappsrv.exe qasf.dll qcap.dll qdv. > dll qdvd.dll qedit.dll qedwipes.dll qmgr.dll qprocess.exe > QualityUpdateAssistant.dll quartz.dll Query.dll query.exe > QuickActionsDataModel.dll quickassist.exe QuietHours.dll quser.exe > qwave.dll qwinsta.exe RacEngn.dll racpldlg.dll radardt.dll radarrs.dll > RADCUI.dll ra > s rasadhlp.dll rasapi32.dll rasauto.dll rasautou.exe raschap.dll > raschapext.dll rasctrnm.h rasctrs.dll rascustom.dll rasdiag.dll rasdial.exe > rasdlg.dll raserver.exe rasgcw.dll rasman.dll rasmans.dll rasmbmgr.dll > RasMediaManager.dll RASMM.dll rasmontr.dll rasphone.exe > rasplap.dll rasppp.dll rastapi.dll rastls.dll rastlsext.dll RasToast > rdbui.dll rdpbase.dll rdpcfgex.dll rdpclip.exe rdpcore.dll rdpcorets.dll > rdpcredentialprovider.dll rdpencom.dll rdpendp.dll rdpinit.exe rdpinput.exe > rdpnano.dll RdpRelayTransport.dll RdpSa.exe RdpS > aProxy.exe RdpSaPs.dll RdpSaUacHelper.exe rdpserverbase.dll > rdpsharercom.dll rdpshell.exe rdpsign.exe rdpudd.dll rdpviewerax.dll > rdrleakdiag.exe RDSAppXHelper.dll rdsdwmdr.dll rdsxvmaudio.dll > rdvvmtransport.dll RDXService.dll RDXTaskFactory.dll ReAgent.dll ReAgentc.e > xe ReAgentTask.dll recdisc.exe recover.exe Recovery recovery.dll > RecoveryDrive.exe refsutil.exe reg.exe regapi.dll RegCtrl.dll regedt32.exe > regidle.dll regini.exe Register-CimProvider.exe regsvc.dll regsvr32.exe > reguwpapi.dll ReInfo.dll rekeywiz.exe relog.exe RelPost > .exe 
RemoteAppLifetimeManager.exe RemoteAppLifetimeManagerProxyStub.dll > remoteaudioendpoint.dll remotepg.dll RemotePosWorker.exe remotesp.tsp > RemoteSystemToastIcon.contrast-white.png RemoteSystemToastIcon.png > RemoteWipeCSP.dll RemovableMediaProvisioningPlugin.dll Rem > oveDeviceContextHandler.dll RemoveDeviceElevated.dll rendezvousSession.tlb > repair-bde.exe replace.exe ReportingCSP.dll RESAMPLEDMO.DLL ResBParser.dll > reset.exe reseteng.dll ResetEngine.dll ResetEngine.exe ResetEngOnline.dll > resmon.exe ResourceMapper.dll ResourcePolic > yClient.dll ResourcePolicyServer.dll ResPriHMImageList > ResPriHMImageListLowCost ResPriImageList ResPriImageListLowCost > RestartManager.mof RestartManagerUninstall.mof > RestartNowPower_80.contrast-black.png RestartNowPower_80.contrast-white.png > RestartNowPower_80.png Re > startTonight_80.png RestartTonight_80_contrast-black.png > RestartTonight_80_contrast-white.png restore resutils.dll rgb9rast.dll > Ribbons.scr riched20.dll riched32.dll rilproxy.dll RjvMDMConfig.dll > RMActivate.exe RMActivate_isv.exe RMActivate_ssp.exe RMActivate_ssp_isv > .exe RMapi.dll rmclient.dll RmClient.exe RMSRoamingSecurity.dll > rmttpmvscmgrsvr.exe rnr20.dll ro-RO RoamingSecurity.dll Robocopy.exe > rometadata.dll RotMgr.dll ROUTE.EXE RpcEpMap.dll rpchttp.dll RpcNs4.dll > rpcnsh.dll RpcPing.exe rpcrt4.dll RpcRtRemote.dll rpcss.dll rr > installer.exe rsaenh.dll rshx32.dll rsop.msc RstMwEventLogMsg.dll > RstrtMgr.dll rstrui.exe RtCOM64.dll RtDataProc64.dll rtffilt.dll > RtkApi64U.dll RtkAudUService64.exe RtkCfg64.dll rtm.dll rtmcodecs.dll > RTMediaFrame.dll rtmmvrortc.dll rtmpal.dll rtmpltfm.dll rtutils.dl > l RTWorkQ.dll ru-RU RuleBasedDS.dll runas.exe rundll32.exe > runexehelper.exe RunLegacyCPLElevated.exe runonce.exe RuntimeBroker.exe > rwinsta.exe samcli.dll samlib.dll samsrv.dll Samsung sas.dll sbe.dll > sbeio.dll sberes.dll sbservicetrigger.dll sc.exe ScanPlugin.dll sca > nsetting.dll SCardBi.dll SCardDlg.dll SCardSvr.dll ScavengeSpace.xml > 
scavengeui.dll ScDeviceEnum.dll scecli.dll scesrv.dll schannel.dll > schedcli.dll schedsvc.dll ScheduleTime_80.contrast-black.png > ScheduleTime_80.contrast-white.png ScheduleTime_80.png schtasks.exe sc > ksp.dll scripto.dll ScriptRunner.exe scrnsave.scr scrobj.dll scrptadm.dll > scrrun.dll sdbinst.exe sdchange.exe sdclt.exe sdcpl.dll SDDS.dll > sdengin2.dll SDFHost.dll sdhcinst.dll sdiageng.dll sdiagnhost.exe > sdiagprv.dll sdiagschd.dll sdohlp.dll sdrsvc.dll sdshext.dll S > earch.ProtocolHandler.MAPI2.dll SearchFilterHost.exe SearchFolder.dll > SearchIndexer.exe SearchProtocolHost.exe SebBackgroundManagerPolicy.dll > SecConfig.efi SecEdit.exe sechost.dll secinit.exe seclogon.dll secpol.msc > secproc.dll secproc_isv.dll secproc_ssp.dll secproc > _ssp_isv.dll secur32.dll SecureAssessmentHandlers.dll SecureBootUpdates > securekernel.exe SecureTimeAggregator.dll security.dll > SecurityAndMaintenance.png SecurityAndMaintenance_Alert.png > SecurityAndMaintenance_Error.png SecurityCenterBroker.dll > SecurityCenterBrokerPS > .dll SecurityHealthAgent.dll SecurityHealthHost.exe > SecurityHealthProxyStub.dll SecurityHealthService.exe SecurityHealthSSO.dll > SecurityHealthSystray.exe sedplugins.dll SEMgrPS.dll SEMgrSvc.dll > sendmail.dll Sens.dll SensApi.dll SensorDataService.exe SensorPerformance > Events.dll SensorsApi.dll SensorsClassExtension.dll SensorsCpl.dll > SensorService.dll SensorsNativeApi.dll SensorsNativeApi.V2.dll > SensorsUtilsV2.dll sensrsvc.dll serialui.dll services.exe services.msc > ServicingUAPI.dll serwvdrv.dll SessEnv.dll sessionmsg.exe setbcdlo > cale.dll sethc.exe SetNetworkLocation.dll SetNetworkLocationFlyout.dll > SetProxyCredential.dll setspn.exe SettingMonitor.dll settings.dat > SettingsEnvironment.Desktop.dll SettingsExtensibilityHandlers.dll > SettingsHandlers_Accessibility.dll SettingsHandlers_AnalogShell. 
> dll SettingsHandlers_AppControl.dll SettingsHandlers_AppExecutionAlias.dll > SettingsHandlers_AssignedAccess.dll SettingsHandlers_Authentication.dll > SettingsHandlers_BackgroundApps.dll SettingsHandlers_BatteryUsage.dll > SettingsHandlers_BrowserDeclutter.dll SettingsHand > lers_CapabilityAccess.dll SettingsHandlers_Clipboard.dll > SettingsHandlers_ClosedCaptioning.dll > SettingsHandlers_ContentDeliveryManager.dll SettingsHandlers_Cortana.dll > SettingsHandlers_Devices.dll SettingsHandlers_Display.dll > SettingsHandlers_Flights.dll SettingsHand > lers_Fonts.dll SettingsHandlers_ForceSync.dll SettingsHandlers_Gaming.dll > SettingsHandlers_Geolocation.dll SettingsHandlers_Gpu.dll > SettingsHandlers_HoloLens_Environment.dll SettingsHandlers_IME.dll > SettingsHandlers_InkingTypingPrivacy.dll SettingsHandlers_InputPerso > nalization.dll SettingsHandlers_Language.dll > SettingsHandlers_ManagePhone.dll SettingsHandlers_Maps.dll > SettingsHandlers_Mouse.dll SettingsHandlers_Notifications.dll > SettingsHandlers_nt.dll SettingsHandlers_OneCore_BatterySaver.dll > SettingsHandlers_OneCore_PowerAndSl > eep.dll SettingsHandlers_OneDriveBackup.dll > SettingsHandlers_OptionalFeatures.dll SettingsHandlers_PCDisplay.dll > SettingsHandlers_Pen.dll SettingsHandlers_QuickActions.dll > SettingsHandlers_Region.dll SettingsHandlers_SharedExperiences_Rome.dll > SettingsHandlers_SIUF.d > ll SettingsHandlers_SpeechPrivacy.dll SettingsHandlers_Startup.dll > SettingsHandlers_StorageSense.dll SettingsHandlers_Troubleshoot.dll > SettingsHandlers_User.dll SettingsHandlers_UserAccount.dll > SettingsHandlers_UserExperience.dll SettingsHandlers_WorkAccess.dll Setti > ngSync.dll SettingSyncCore.dll SettingSyncDownloadHelper.dll > SettingSyncHost.exe setup setupapi.dll setupcl.dll setupcl.exe setupcln.dll > setupetw.dll setupugc.exe setx.exe sfc.dll sfc.exe sfc_os.dll Sgrm > SgrmBroker.exe SgrmEnclave.dll SgrmEnclave_secure.dll SgrmLpac. 
> exe shacct.dll shacctprofile.dll SharedPCCSP.dll SharedRealitySvc.dll > ShareHost.dll sharemediacpl.dll SHCore.dll shdocvw.dll shell32.dll > ShellAppRuntime.exe ShellCommonCommonProxyStub.dll ShellExperiences > shellstyle.dll shfolder.dll shgina.dll ShiftJIS.uce shimeng.dl > l shimgvw.dll shlwapi.dll shpafact.dll shrpubw.exe shsetup.dll shsvcs.dll > shunimpl.dll shutdown.exe shutdownext.dll shutdownux.dll shwebsvc.dll si-lk > signdrv.dll sigverif.exe SIHClient.exe sihost.exe SimAuth.dll SimCfg.dll > simpdata.tlb sk-SK skci.dll sl-SI slc.dll sl > cext.dll SleepStudy SlideToShutDown.exe slmgr slmgr.vbs slui.exe slwga.dll > SmallRoom.bin SmartCardBackgroundPolicy.dll SmartcardCredentialProvider.dll > SmartCardSimulator.dll smartscreen.exe smartscreenps.dll SMBHelperClass.dll > smbwmiv2.dll SMI SmiEngine.dll smphost.d > ll SmsRouterSvc.dll smss.exe SndVol.exe SndVolSSO.dll SnippingTool.exe > snmpapi.dll snmptrap.exe Snooze_80.contrast-black.png > Snooze_80.contrast-white.png Snooze_80.png socialapis.dll softkbd.dll > softpub.dll sort.exe SortServer2003Compat.dll SortWindows61.dll SortWind > ows62.dll SortWindows64.dll SortWindows6Compat.dll SpaceAgent.exe > spacebridge.dll SpaceControl.dll spaceman.exe SpatialAudioLicenseSrv.exe > SpatializerApo.dll SpatialStore.dll spbcd.dll > SpeakersSystemToastIcon.contrast-white.png SpeakersSystemToastIcon.png > Spectrum.ex > e SpectrumSyncClient.dll Speech SpeechPal.dll Speech_OneCore spfileq.dll > spinf.dll spmpm.dll spnet.dll spool spoolss.dll spoolsv.exe spopk.dll spp > spp.dll sppc.dll sppcext.dll sppcomapi.dll sppcommdlg.dll SppExtComObj.Exe > sppinst.dll sppnp.dll sppobjs.dll sppsvc.exe > sppui sppwinob.dll sppwmi.dll spwinsat.dll spwizeng.dll spwizimg.dll > spwizres.dll spwmp.dll SqlServerSpatial130.dll SqlServerSpatial150.dll > sqlsrv32.dll sqlsrv32.rll sqmapi.dll sr-Latn-RS srchadmin.dll srclient.dll > srcore.dll srdelayed.exe SrEvents.dll SRH.dll srhelp > er.dll srm.dll srmclient.dll srmlib.dll srms-apr-v.dat 
srms-apr.dat > srms.dat srmscan.dll srmshell.dll srmstormod.dll srmtrace.dll srm_ps.dll > srpapi.dll SrpUxNativeSnapIn.dll srrstr.dll SrTasks.exe sru srumapi.dll > srumsvc.dll srvcli.dll srvsvc.dll srwmi.dll sscore.dll > sscoreext.dll ssdm.dll ssdpapi.dll ssdpsrv.dll sspicli.dll sspisrv.dll > SSShim.dll ssText3d.scr sstpsvc.dll StartTileData.dll Startupscan.dll > StateRepository.Core.dll stclient.dll stdole2.tlb stdole32.tlb sti.dll > sti_ci.dll stobject.dll StorageContextHandler.dll Stor > ageUsage.dll storagewmi.dll storagewmi_passthru.dll stordiag.exe > storewuauth.dll Storprop.dll StorSvc.dll streamci.dll > StringFeedbackEngine.dll StructuredQuery.dll SubRange.uce subst.exe sud.dll > sv-SE SvBannerBackground.png svchost.exe svf.dll svsvc.dll SwitcherDataM > odel.dll swprv.dll sxproxy.dll sxs.dll sxshared.dll sxssrv.dll > sxsstore.dll sxstrace.exe SyncAppvPublishingServer.exe > SyncAppvPublishingServer.vbs SyncCenter.dll SyncController.dll SyncHost.exe > SyncHostps.dll SyncInfrastructure.dll SyncInfrastructureps.dll SyncProxy. > dll Syncreg.dll SyncRes.dll SyncSettings.dll syncutil.dll sysclass.dll > sysdm.cpl SysFxUI.dll sysmain.dll sysmon.ocx sysntfy.dll Sysprep > sysprint.sep sysprtj.sep SysResetErr.exe syssetup.dll systemcpl.dll > SystemEventsBrokerClient.dll SystemEventsBrokerServer.dll syste > minfo.exe SystemPropertiesAdvanced.exe SystemPropertiesComputerName.exe > SystemPropertiesDataExecutionPrevention.exe SystemPropertiesHardware.exe > SystemPropertiesPerformance.exe SystemPropertiesProtection.exe > SystemPropertiesRemote.exe systemreset.exe SystemResetPlatf > orm SystemSettings.DataModel.dll > SystemSettings.DeviceEncryptionHandlers.dll SystemSettings.Handlers.dll > SystemSettings.SettingsExtensibility.dll > SystemSettings.UserAccountsHandlers.dll SystemSettingsAdminFlows.exe > SystemSettingsBroker.exe SystemSettingsRemoveDevice. 
> exe SystemSettingsThresholdAdminFlowUI.dll SystemSupportInfo.dll > SystemUWPLauncher.exe systray.exe t2embed.dll ta-in ta-lk Tabbtn.dll > TabbtnEx.dll tabcal.exe TabletPC.cpl TabSvc.dll takeown.exe tapi3.dll > tapi32.dll tapilua.dll TapiMigPlugin.dll tapiperf.dll tapisrv.d > ll TapiSysprep.dll tapiui.dll TapiUnattend.exe tar.exe TaskApis.dll > taskbarcpl.dll taskcomp.dll TaskFlowDataEngine.dll taskhostw.exe > taskkill.exe tasklist.exe Taskmgr.exe Tasks taskschd.dll taskschd.msc > TaskSchdPS.dll tbauth.dll tbs.dll tcblaunch.exe tcbloader.dll tc > msetup.exe tcpbidi.xml tcpipcfg.dll tcpmib.dll tcpmon.dll tcpmon.ini > tcpmonui.dll TCPSVCS.EXE tdc.ocx tdh.dll TDLMigration.dll > TEEManagement64.dll telephon.cpl TelephonyInteractiveUser.dll > TelephonyInteractiveUserRes.dll tellib.dll > TempSignedLicenseExchangeTask.dll T > enantRestrictionsPlugin.dll termmgr.dll termsrv.dll tetheringclient.dll > tetheringconfigsp.dll TetheringIeProvider.dll TetheringMgr.dll > tetheringservice.dll TetheringStation.dll TextInputFramework.dll > TextInputMethodFormatter.dll TextShaping.dll th-TH themecpl.dll The > mes.SsfDownload.ScheduledTask.dll themeservice.dll themeui.dll > ThirdPartyNoticesBySHS.txt threadpoolwinrt.dll thumbcache.dll > ThumbnailExtractionHost.exe ti-et tier2punctuations.dll > TieringEngineProxy.dll TieringEngineService.exe TileDataRepository.dll > TimeBrokerClien > t.dll TimeBrokerServer.dll timedate.cpl TimeDateMUICallback.dll > timeout.exe timesync.dll TimeSyncTask.dll TKCtrl2k64.sys TKFsAv64.sys > TKFsFt64.sys TKFWFV.inf TKFWFV64.cat TKFWFV64.sys tkfwvt64.sys > TKIdsVt64.sys TKPcFtCb64.sys TKPcFtCb64.sys_ TKPcFtHk64.sys TKRgAc2k64 > .sys TKRgFtXp64.sys TKTool2k.sys TKTool2k64.sys tlscsp.dll > tokenbinding.dll TokenBroker.dll TokenBrokerCookies.exe TokenBrokerUI.dll > tpm.msc TpmCertResources.dll tpmcompc.dll TpmCoreProvisioning.dll > TpmInit.exe TpmTasks.dll TpmTool.exe tpmvsc.dll tpmvscmgr.exe tpmvsc > mgrsvr.exe tquery.dll tr-TR tracerpt.exe TRACERT.EXE 
traffic.dll > TransformPPSToWlan.xslt TransformPPSToWlanCredentials.xslt > TransliterationRanker.dll TransportDSA.dll tree.com trie.dll trkwks.dll > TrustedSignalCredProv.dll tsbyuv.dll tscfgwmi.dll tscon.exe tsdiscon.ex > e TSErrRedir.dll tsf3gip.dll tsgqec.dll tskill.exe tsmf.dll TSpkg.dll > tspubwmi.dll TSSessionUX.dll tssrvlic.dll TSTheme.exe > TsUsbGDCoInstaller.dll TsUsbRedirectionGroupPolicyExtension.dll > TSWbPrxy.exe TSWorkspace.dll TsWpfWrp.exe ttdinject.exe ttdloader.dll > ttdplm.dl > l ttdrecord.dll ttdrecordcpu.dll TtlsAuth.dll TtlsCfg.dll TtlsExt.dll > tttracer.exe tvratings.dll twext.dll twinapi.appcore.dll twinapi.dll > twinui.appcore.dll twinui.dll twinui.pcshell.dll txflog.dll txfw32.dll > typeperf.exe tzautoupdate.dll tzres.dll tzsync.exe tzsync > res.dll tzutil.exe ubpm.dll ucmhc.dll ucrtbase.dll ucrtbased.dll > ucrtbase_clr0400.dll ucrtbase_enclave.dll ucsvc.exe udhisapi.dll uDWM.dll > UefiCsp.dll UevAgentPolicyGenerator.exe UevAppMonitor.exe > UevAppMonitor.exe.config UevCustomActionTypes.tlb UevTemplateBaselineG > enerator.exe UevTemplateConfigItemGenerator.exe uexfat.dll ufat.dll > UiaManager.dll UIAnimation.dll UIAutomationCore.dll uicom.dll > UIManagerBrokerps.dll UIMgrBroker.exe uireng.dll UIRibbon.dll > UIRibbonRes.dll uk-UA ulib.dll umb.dll umdmxfrm.dll umpdc.dll umpnpmgr.dll > umpo-overrides.dll umpo.dll umpoext.dll umpowmi.dll umrdp.dll unattend.dll > unenrollhook.dll unimdm.tsp unimdmat.dll uniplat.dll Unistore.dll > unlodctr.exe UNP unregmp2.exe untfs.dll UpdateAgent.dll updatecsp.dll > UpdateDeploymentProvider.dll UpdateHeartbeat.dll updatep > olicy.dll upfc.exe UpgradeResultsUI.exe upnp.dll upnpcont.exe upnphost.dll > UPPrinterInstaller.exe UPPrinterInstallsCSP.dll upshared.dll uReFS.dll > uReFSv1.dll ureg.dll url.dll urlmon.dll UsbCApi.dll usbceip.dll usbmon.dll > usbperf.dll UsbPmApi.dll UsbSettingsHandlers.d > ll UsbTask.dll usbui.dll user32.dll UserAccountBroker.exe > UserAccountControlSettings.dll 
UserAccountControlSettings.exe > useractivitybroker.dll usercpl.dll UserDataAccessRes.dll > UserDataAccountApis.dll UserDataLanguageUtil.dll > UserDataPlatformHelperUtil.dll UserDataSe > rvice.dll UserDataTimeUtil.dll UserDataTypeHelperUtil.dll > UserDeviceRegistration.dll UserDeviceRegistration.Ngc.dll userenv.dll > userinit.exe userinitext.dll UserLanguageProfileCallback.dll usermgr.dll > usermgrcli.dll UserMgrProxy.dll usk.rs usoapi.dll UsoClient.exe us > ocoreps.dll usocoreworker.exe usosvc.dll usp10.dll ustprov.dll > UtcDecoderHost.exe UtcManaged.dll utcutil.dll utildll.dll Utilman.exe > uudf.dll UvcModel.dll uwfcfgmgmt.dll uwfcsp.dll uwfservicingapi.dll > UXInit.dll uxlib.dll uxlibres.dll uxtheme.dll vac.dll VAN.dll Vaul > t.dll VaultCDS.dll vaultcli.dll VaultCmd.exe VaultRoaming.dll vaultsvc.dll > VBICodec.ax vbisurf.ax vbsapi.dll vbscript.dll vbssysprep.dll > vcamp120.dll vcamp140.dll vcamp140d.dll VCardParser.dll vccorlib110.dll > vccorlib120.dll vccorlib140.dll vccorlib140d.dll vcomp100. 
> dll vcomp110.dll vcomp120.dll vcomp140.dll vcomp140d.dll vcruntime140.dll > vcruntime140d.dll vcruntime140_1.dll vcruntime140_1d.dll > vcruntime140_clr0400.dll vds.exe vdsbas.dll vdsdyn.dll vdsldr.exe > vdsutil.dll vdsvd.dll vds_ps.dll verclsid.exe verifier.dll verifier.ex > e verifiergui.exe version.dll vertdll.dll vfbasics.dll vfcompat.dll > vfcuzz.dll vfluapriv.dll vfnet.dll vfntlmless.dll vfnws.dll vfprint.dll > vfprintpthelper.dll vfrdvcompat.dll vfuprov.dll vfwwdm32.dll VhfUm.dll > vid.dll vidcap.ax VideoHandlers.dll VIDRESZR.DLL virtdis > k.dll VirtualMonitorManager.dll VmApplicationHealthMonitorProxy.dll > vmbuspipe.dll vmdevicehost.dll vmictimeprovider.dll vmrdvcore.dll > VocabRoamingHandler.dll VoiceActivationManager.dll VoipRT.dll vpnike.dll > vpnikeapi.dll VpnSohDesktop.dll VPNv2CSP.dll vrfcore.dll Vsc > MgrPS.dll vscover160.dll VSD3DWARPDebug.dll VsGraphicsCapture.dll > VsGraphicsDesktopEngine.exe VsGraphicsExperiment.dll VsGraphicsHelper.dll > VsGraphicsProxyStub.dll VsGraphicsRemoteEngine.exe vsjitdebugger.exe > VSPerf160.dll vssadmin.exe vssapi.dll vsstrace.dll VSSVC.e > xe vss_ps.dll vulkan-1-999-0-0-0.dll vulkan-1.dll > vulkaninfo-1-999-0-0-0.exe vulkaninfo.exe w32time.dll w32tm.exe w32topl.dll > WaaSAssessment.dll WaaSMedicAgent.exe WaaSMedicCapsule.dll WaaSMedicPS.dll > WaaSMedicSvc.dll WABSyncProvider.dll waitfor.exe WalletBackgroundS > erviceProxy.dll WalletProxy.dll WalletService.dll WallpaperHost.exe > wavemsp.dll wbadmin.exe wbem wbemcomn.dll wbengine.exe wbiosrvc.dll wci.dll > wcimage.dll wcmapi.dll wcmcsp.dll wcmsvc.dll WCN WcnApi.dll wcncsvc.dll > WcnEapAuthProxy.dll WcnEapPeerProxy.dll WcnNetsh.dl > l wcnwiz.dll wc_storage.dll wdc.dll WDI wdi.dll wdigest.dll wdmaud.drv > wdscore.dll WdsUnattendTemplate.xml WEB.rs webauthn.dll WebcamUi.dll > webcheck.dll WebClnt.dll webio.dll webplatstorageserver.dll > WebRuntimeManager.dll webservices.dll Websocket.dll wecapi.dll wecs > vc.dll wecutil.exe wephostsvc.dll wer.dll werconcpl.dll 
> [... remainder of the /cygdrive/c/Windows/System32 listing (several
> thousand system DLLs and executables) elided ...]
> /cygdrive/c/Windows: [standard Windows installation contents elided]
> /cygdrive/c/Windows/System32/Wbem: [WMI provider .dll and .mof files
> elided]
> /cygdrive/c/Windows/System32/WindowsPowerShell/v1.0: [PowerShell
> binaries, modules, and *.format.ps1xml files elided]
> /cygdrive/c/Windows/System32/OpenSSH: scp.exe sftp.exe ssh-add.exe
> ssh-agent.exe ssh-keygen.exe ssh-keyscan.exe ssh.exe
> /cygdrive/c/Program Files/MATLAB/R2020b/bin: crash_analyzer.cfg
> icutzdata lcdata.xml lcdata.xsd lcdata_utf8.xml m3iregistry matlab.exe
> mex.bat mexext.bat util win32 win64
> /cygdrive/c/Program Files/Microsoft SQL Server/130/Tools/Binn:
> Resources SqlLocalDB.exe
> /cygdrive/c/Program Files/Microsoft SQL Server/Client
> SDK/ODBC/170/Tools/Binn: batchparser.dll bcp.exe Resources SQLCMD.EXE
> xmlrw.dll
> /cygdrive/c/Program Files/Git/cmd: git-gui.exe git-lfs.exe git.exe
> gitk.exe start-ssh-agent.cmd start-ssh-pageant.cmd
> Warning accessing /cygdrive/c/msys64/mingw64/bin gives errors:
> [Errno 2] No such file or directory: '/cygdrive/c/msys64/mingw64/bin'
> Warning accessing /cygdrive/c/msys64/usr/bin gives errors: [Errno 2]
> No such file or directory: '/cygdrive/c/msys64/usr/bin'
> /cygdrive/c/Program Files (x86)/Microsoft Visual
> Studio/2019/Community/VC/Tools/MSVC/14.29.30133/bin/Hostx64/x64: [MSVC
> toolchain binaries (cl.exe, link.exe, lib.exe, nmake.exe, ml64.exe,
> dumpbin.exe, ...) elided]
> /cygdrive/c/Program Files/dotnet: dotnet.exe host LICENSE.txt packs
> sdk shared templates ThirdPartyNotices.txt
> /: bin Cygwin-Terminal.ico Cygwin.bat Cygwin.ico dev etc home lib
> mpich-4.0.2 mpich-4.0.2.tar.gz sbin tmp usr var proc cygdrive
> /cygdrive/c/Users/SEJONG/AppData/Local/Microsoft/WindowsApps: Backup
> GameBarElevatedFT_Alias.exe Microsoft.DesktopAppInstaller_8wekyb3d8bbwe
> Microsoft.MicrosoftEdge_8wekyb3d8bbwe Microsoft.SkypeApp_kzf8qxf38zg5c
> Microsoft.XboxGamingOverlay_8wekyb3d8bbwe MicrosoftEdge.exe python.exe
> python3.exe Skype.exe winget.exe
> /cygdrive/c/Users/SEJONG/AppData/Local/Programs/Microsoft VS
> Code/bin: code code.cmd
> /cygdrive/c/Program Files (x86)/Microsoft Visual
> Studio/2019/Community/VC/Tools/MSVC/14.29.30133/bin/Hostx64/x64: [same
> MSVC directory appears a second time on PATH; listing elided]
> Warning accessing /cygdrive/c/Users/SEJONG/.dotnet/tools gives
> errors: [Errno 2] No such file or directory:
> '/cygdrive/c/Users/SEJONG/.dotnet/tools'
> /usr/lib/lapack: cygblas-0.dll cyglapack-0.dll
>
> =============================================================================================
> TESTING: configureExternalPackagesDir from
> config.framework(/home/SEJONG/petsc-3.18.1/config/BuildSystem/config/framework.py:1045)
> Set alternative directory external packages are built in
> serialEvaluation: initial cxxDialectRanges ('c++11', 'c++17')
> serialEvaluation: new cxxDialectRanges ('c++11', 'c++17')
> child config.utilities.macosFirewall took 0.000005 seconds
>
> =============================================================================================
> TESTING: configureDebuggers from
> config.utilities.debuggers(/home/SEJONG/petsc-3.18.1/config/BuildSystem/config/utilities/debuggers.py:20)
> Find a default debugger and determine its arguments
> Checking for program /usr/local/bin/gdb...not found
> Checking for program /usr/bin/gdb...not found
> Checking for program /cygdrive/c/SIMULIA/Commands/gdb...not found
> Checking for program /cygdrive/c/Program Files/Microsoft
> MPI/Bin/gdb...not found
> Checking for program /cygdrive/c/Windows/system32/gdb...not found
> Checking for program /cygdrive/c/Windows/gdb...not found
> Checking for program /cygdrive/c/Windows/System32/Wbem/gdb...not found
> Checking for program
> /cygdrive/c/Windows/System32/WindowsPowerShell/v1.0/gdb...not found
> Checking for program /cygdrive/c/Windows/System32/OpenSSH/gdb...not
> found
> Checking for program /cygdrive/c/Program
> Files/MATLAB/R2020b/bin/gdb...not found
> Checking for program /cygdrive/c/Program Files/Microsoft SQL
> Server/130/Tools/Binn/gdb...not found
> Checking for program /cygdrive/c/Program Files/Microsoft SQL
> Server/Client SDK/ODBC/170/Tools/Binn/gdb...not found 
> Checking for program /cygdrive/c/Program Files/Git/cmd/gdb...not found > Checking for program /cygdrive/c/msys64/mingw64/bin/gdb...not found > Checking for program /cygdrive/c/msys64/usr/bin/gdb...not found > Checking for program /cygdrive/c/Program Files (x86)/Microsoft Visual > Studio/2019/Community/VC/Tools/MSVC/14.29.30133/bin/Hostx64/x64/gdb...not > found > Checking for program /cygdrive/c/Program Files/dotnet/gdb...not found > Checking for program /gdb...not found > Checking for program > /cygdrive/c/Users/SEJONG/AppData/Local/Microsoft/WindowsApps/gdb...not found > Checking for program > /cygdrive/c/Users/SEJONG/AppData/Local/Programs/Microsoft VS > Code/bin/gdb...not found > Checking for program /cygdrive/c/Program Files (x86)/Microsoft Visual > Studio/2019/Community/VC/Tools/MSVC/14.29.30133/bin/Hostx64/x64/gdb...not > found > Checking for program /cygdrive/c/Users/SEJONG/.dotnet/tools/gdb...not > found > Checking for program /usr/lib/lapack/gdb...not found > Checking for program /usr/local/bin/dbx...not found > Checking for program /usr/bin/dbx...not found > Checking for program /cygdrive/c/SIMULIA/Commands/dbx...not found > Checking for program /cygdrive/c/Program Files/Microsoft > MPI/Bin/dbx...not found > Checking for program /cygdrive/c/Windows/system32/dbx...not found > Checking for program /cygdrive/c/Windows/dbx...not found > Checking for program /cygdrive/c/Windows/System32/Wbem/dbx...not found > Checking for program > /cygdrive/c/Windows/System32/WindowsPowerShell/v1.0/dbx...not found > Checking for program /cygdrive/c/Windows/System32/OpenSSH/dbx...not > found > Checking for program /cygdrive/c/Program > Files/MATLAB/R2020b/bin/dbx...not found > Checking for program /cygdrive/c/Program Files/Microsoft SQL > Server/130/Tools/Binn/dbx...not found > Checking for program /cygdrive/c/Program Files/Microsoft SQL > Server/Client SDK/ODBC/170/Tools/Binn/dbx...not found > Checking for program /cygdrive/c/Program Files/Git/cmd/dbx...not found > Checking 
for program /cygdrive/c/msys64/mingw64/bin/dbx...not found > Checking for program /cygdrive/c/msys64/usr/bin/dbx...not found > Checking for program /cygdrive/c/Program Files (x86)/Microsoft Visual > Studio/2019/Community/VC/Tools/MSVC/14.29.30133/bin/Hostx64/x64/dbx...not > found > Checking for program /cygdrive/c/Program Files/dotnet/dbx...not found > Checking for program /dbx...not found > Checking for program > /cygdrive/c/Users/SEJONG/AppData/Local/Microsoft/WindowsApps/dbx...not found > Checking for program > /cygdrive/c/Users/SEJONG/AppData/Local/Programs/Microsoft VS > Code/bin/dbx...not found > Checking for program /cygdrive/c/Program Files (x86)/Microsoft Visual > Studio/2019/Community/VC/Tools/MSVC/14.29.30133/bin/Hostx64/x64/dbx...not > found > Checking for program /cygdrive/c/Users/SEJONG/.dotnet/tools/dbx...not > found > Checking for program /usr/lib/lapack/dbx...not found > Defined make macro "DSYMUTIL" to "true" > child config.utilities.debuggers took 0.014310 seconds > > ============================================================================================= > TESTING: configureDirectories from > PETSc.options.petscdir(/home/SEJONG/petsc-3.18.1/config/PETSc/options/petscdir.py:22) > Checks PETSC_DIR and sets if not set > PETSC_VERSION_RELEASE of 1 indicates the code is from a release > branch or a branch created from a release branch. 
> Version Information: > #define PETSC_VERSION_RELEASE 1 > #define PETSC_VERSION_MAJOR 3 > #define PETSC_VERSION_MINOR 18 > #define PETSC_VERSION_SUBMINOR 1 > #define PETSC_VERSION_DATE "Oct 26, 2022" > #define PETSC_VERSION_GIT "v3.18.1" > #define PETSC_VERSION_DATE_GIT "2022-10-26 07:57:29 -0500" > #define PETSC_VERSION_EQ(MAJOR,MINOR,SUBMINOR) \ > #define PETSC_VERSION_ PETSC_VERSION_EQ > #define PETSC_VERSION_LT(MAJOR,MINOR,SUBMINOR) \ > #define PETSC_VERSION_LE(MAJOR,MINOR,SUBMINOR) \ > #define PETSC_VERSION_GT(MAJOR,MINOR,SUBMINOR) \ > #define PETSC_VERSION_GE(MAJOR,MINOR,SUBMINOR) \ > child PETSc.options.petscdir took 0.015510 seconds > > ============================================================================================= > TESTING: getDatafilespath from > PETSc.options.dataFilesPath(/home/SEJONG/petsc-3.18.1/config/PETSc/options/dataFilesPath.py:29) > Checks what DATAFILESPATH should be > child PETSc.options.dataFilesPath took 0.002462 seconds > > ============================================================================================= > TESTING: configureGit from > config.sourceControl(/home/SEJONG/petsc-3.18.1/config/BuildSystem/config/sourceControl.py:24) > Find the Git executable > Checking for program /usr/local/bin/git...not found > Checking for program /usr/bin/git...found > Defined make macro "GIT" to "git" > Executing: git --version > stdout: git version 2.38.1 > > ============================================================================================= > TESTING: configureMercurial from > config.sourceControl(/home/SEJONG/petsc-3.18.1/config/BuildSystem/config/sourceControl.py:35) > Find the Mercurial executable > Checking for program /usr/local/bin/hg...not found > Checking for program /usr/bin/hg...not found > Checking for program /cygdrive/c/SIMULIA/Commands/hg...not found > Checking for program /cygdrive/c/Program Files/Microsoft > MPI/Bin/hg...not found > Checking for program /cygdrive/c/Windows/system32/hg...not found > 
Checking for program /cygdrive/c/Windows/hg...not found > Checking for program /cygdrive/c/Windows/System32/Wbem/hg...not found > Checking for program > /cygdrive/c/Windows/System32/WindowsPowerShell/v1.0/hg...not found > Checking for program /cygdrive/c/Windows/System32/OpenSSH/hg...not > found > Checking for program /cygdrive/c/Program > Files/MATLAB/R2020b/bin/hg...not found > Checking for program /cygdrive/c/Program Files/Microsoft SQL > Server/130/Tools/Binn/hg...not found > Checking for program /cygdrive/c/Program Files/Microsoft SQL > Server/Client SDK/ODBC/170/Tools/Binn/hg...not found > Checking for program /cygdrive/c/Program Files/Git/cmd/hg...not found > Checking for program /cygdrive/c/msys64/mingw64/bin/hg...not found > Checking for program /cygdrive/c/msys64/usr/bin/hg...not found > Checking for program /cygdrive/c/Program Files (x86)/Microsoft Visual > Studio/2019/Community/VC/Tools/MSVC/14.29.30133/bin/Hostx64/x64/hg...not > found > Checking for program /cygdrive/c/Program Files/dotnet/hg...not found > Checking for program /hg...not found > Checking for program > /cygdrive/c/Users/SEJONG/AppData/Local/Microsoft/WindowsApps/hg...not found > Checking for program > /cygdrive/c/Users/SEJONG/AppData/Local/Programs/Microsoft VS > Code/bin/hg...not found > Checking for program /cygdrive/c/Program Files (x86)/Microsoft Visual > Studio/2019/Community/VC/Tools/MSVC/14.29.30133/bin/Hostx64/x64/hg...not > found > Checking for program /cygdrive/c/Users/SEJONG/.dotnet/tools/hg...not > found > Checking for program /usr/lib/lapack/hg...not found > Checking for program > /home/SEJONG/petsc-3.18.1/lib/petsc/bin/win32fe/hg...not found > child config.sourceControl took 0.121914 seconds > > ============================================================================================= > TESTING: configureInstallationMethod from > PETSc.options.petscclone(/home/SEJONG/petsc-3.18.1/config/PETSc/options/petscclone.py:20) > Determine if PETSc was obtained via git or a 
tarball > This is a tarball installation > child PETSc.options.petscclone took 0.003125 seconds > > ============================================================================================= > TESTING: setNativeArchitecture from > PETSc.options.arch(/home/SEJONG/petsc-3.18.1/config/PETSc/options/arch.py:29) > Forms the arch as GNU's configure would form it > > ============================================================================================= > TESTING: configureArchitecture from > PETSc.options.arch(/home/SEJONG/petsc-3.18.1/config/PETSc/options/arch.py:42) > Checks if PETSC_ARCH is set and sets it if not set > No previous hashfile found > Setting hashfile: > arch-mswin-c-debug/lib/petsc/conf/configure-hash > Deleting configure hash file: > arch-mswin-c-debug/lib/petsc/conf/configure-hash > Unable to delete configure hash file: > arch-mswin-c-debug/lib/petsc/conf/configure-hash > child PETSc.options.arch took 0.149094 seconds > > ============================================================================================= > TESTING: setInstallDir from > PETSc.options.installDir(/home/SEJONG/petsc-3.18.1/config/PETSc/options/installDir.py:31) > Set installDir to either prefix or if that is not set to > PETSC_DIR/PETSC_ARCH > Defined make macro "PREFIXDIR" to > "/home/SEJONG/petsc-3.18.1/arch-mswin-c-debug" > > ============================================================================================= > TESTING: saveReconfigure from > PETSc.options.installDir(/home/SEJONG/petsc-3.18.1/config/PETSc/options/installDir.py:76) > Save the configure options in a script in PETSC_ARCH/lib/petsc/conf so > the same configure may be easily re-run > > ============================================================================================= > TESTING: cleanConfDir from > PETSc.options.installDir(/home/SEJONG/petsc-3.18.1/config/PETSc/options/installDir.py:68) > Remove all the files from configuration directory for this PETSC_ARCH, > from --with-clean option > 
> ============================================================================================= > TESTING: configureInstallDir from > PETSc.options.installDir(/home/SEJONG/petsc-3.18.1/config/PETSc/options/installDir.py:52) > Makes installDir subdirectories if it does not exist for both prefix > install location and PETSc work install location > Changed persistence directory to > /home/SEJONG/petsc-3.18.1/arch-mswin-c-debug/lib/petsc/conf > > TESTING: restoreReconfigure from > PETSc.options.installDir(/home/SEJONG/petsc-3.18.1/config/PETSc/options/installDir.py:90) > If --with-clean was requested but restoring the reconfigure file was > requested then restore it > child PETSc.options.installDir took 0.006476 seconds > > ============================================================================================= > TESTING: setExternalPackagesDir from > PETSc.options.externalpackagesdir(/home/SEJONG/petsc-3.18.1/config/PETSc/options/externalpackagesdir.py:15) > Set location where external packages will be downloaded to > > ============================================================================================= > TESTING: cleanExternalpackagesDir from > PETSc.options.externalpackagesdir(/home/SEJONG/petsc-3.18.1/config/PETSc/options/externalpackagesdir.py:23) > Remove all downloaded external packages, from --with-clean > child PETSc.options.externalpackagesdir took 0.000990 seconds > > ============================================================================================= > TESTING: configureCLanguage from > PETSc.options.languages(/home/SEJONG/petsc-3.18.1/config/PETSc/options/languages.py:28) > Choose whether to compile the PETSc library using a C or C++ compiler > C language is C > Defined "CLANGUAGE_C" to "1" > Defined make macro "CLANGUAGE" to "C" > child PETSc.options.languages took 0.003172 seconds > > ============================================================================================= > TESTING: resetEnvCompilers from > 
config.setCompilers(/home/SEJONG/petsc-3.18.1/config/BuildSystem/config/setCompilers.py:2652) > Remove compilers from the shell environment so they do not interfer with > testing > > ============================================================================================= > TESTING: checkEnvCompilers from > config.setCompilers(/home/SEJONG/petsc-3.18.1/config/BuildSystem/config/setCompilers.py:2669) > Set configure compilers from the environment, from > -with-environment-variables > > ============================================================================================= > TESTING: checkMPICompilerOverride from > config.setCompilers(/home/SEJONG/petsc-3.18.1/config/BuildSystem/config/setCompilers.py:2622) > Check if --with-mpi-dir is used along with CC CXX or FC compiler options. > This usually prevents mpi compilers from being used - so issue a > warning > > ============================================================================================= > TESTING: requireMpiLdPath from > config.setCompilers(/home/SEJONG/petsc-3.18.1/config/BuildSystem/config/setCompilers.py:2643) > OpenMPI wrappers require LD_LIBRARY_PATH set > > ============================================================================================= > TESTING: checkInitialFlags from > config.setCompilers(/home/SEJONG/petsc-3.18.1/config/BuildSystem/config/setCompilers.py:723) > Initialize the compiler and linker flags > Initialized CFLAGS to > Initialized CFLAGS to > Initialized LDFLAGS to > Initialized CUDAFLAGS to > Initialized CUDAFLAGS to > Initialized LDFLAGS to > Initialized HIPFLAGS to > Initialized HIPFLAGS to > Initialized LDFLAGS to > Initialized SYCLFLAGS to > Initialized SYCLFLAGS to > Initialized LDFLAGS to > Initialized CXXFLAGS to > Initialized CXX_CXXFLAGS to > Initialized LDFLAGS to > Initialized FFLAGS to > Initialized FFLAGS to > Initialized LDFLAGS to > Initialized CPPFLAGS to > Initialized FPPFLAGS to > Initialized CUDAPPFLAGS to -Wno-deprecated-gpu-targets > 
Initialized CXXPPFLAGS to > Initialized HIPPPFLAGS to > Initialized SYCLPPFLAGS to > Initialized CC_LINKER_FLAGS to [] > Initialized CXX_LINKER_FLAGS to [] > Initialized FC_LINKER_FLAGS to [] > Initialized CUDAC_LINKER_FLAGS to [] > Initialized HIPC_LINKER_FLAGS to [] > Initialized SYCLC_LINKER_FLAGS to [] > > TESTING: checkCCompiler from > config.setCompilers(/home/SEJONG/petsc-3.18.1/config/BuildSystem/config/setCompilers.py:1341) > Locate a functional C compiler > Checking for program /usr/local/bin/mpicc...not found > Checking for program /usr/bin/mpicc...found > Defined make macro "CC" to "mpicc" > Executing: mpicc -c -o /tmp/petsc-uqt11yqc/config.setCompilers/conftest.o > -I/tmp/petsc-uqt11yqc/config.setCompilers > /tmp/petsc-uqt11yqc/config.setCompilers/conftest.c > Successful compile: > Source: > #include "confdefs.h" > #include "conffix.h" > > int main() { > ; > return 0; > } > > Executing: mpicc -c -o /tmp/petsc-uqt11yqc/config.setCompilers/conftest.o > -I/tmp/petsc-uqt11yqc/config.setCompilers > /tmp/petsc-uqt11yqc/config.setCompilers/conftest.c > Successful compile: > Source: > #include "confdefs.h" > #include "conffix.h" > > int main() { > ; > return 0; > } > > Executing: mpicc -o /tmp/petsc-uqt11yqc/config.setCompilers/conftest.exe > /tmp/petsc-uqt11yqc/config.setCompilers/conftest.o > Possible ERROR while running linker: exit code 1 > stderr: > /usr/lib/gcc/x86_64-pc-cygwin/11/../../../../x86_64-pc-cygwin/bin/ld: > cannot find -lhwloc: No such file or directory > /usr/lib/gcc/x86_64-pc-cygwin/11/../../../../x86_64-pc-cygwin/bin/ld: > cannot find -levent_core: No such file or directory > /usr/lib/gcc/x86_64-pc-cygwin/11/../../../../x86_64-pc-cygwin/bin/ld: > cannot find -levent_pthreads: No such file or directory > /usr/lib/gcc/x86_64-pc-cygwin/11/../../../../x86_64-pc-cygwin/bin/ld: > cannot find -lz: No such file or directory > collect2: error: ld returned 1 exit status > Linker output before filtering: > > 
/usr/lib/gcc/x86_64-pc-cygwin/11/../../../../x86_64-pc-cygwin/bin/ld: > cannot find -lhwloc: No such file or directory > /usr/lib/gcc/x86_64-pc-cygwin/11/../../../../x86_64-pc-cygwin/bin/ld: > cannot find -levent_core: No such file or directory > /usr/lib/gcc/x86_64-pc-cygwin/11/../../../../x86_64-pc-cygwin/bin/ld: > cannot find -levent_pthreads: No such file or directory > /usr/lib/gcc/x86_64-pc-cygwin/11/../../../../x86_64-pc-cygwin/bin/ld: > cannot find -lz: No such file or directory > collect2: error: ld returned 1 exit status > : > Linker output after filtering: > /usr/lib/gcc/x86_64-pc-cygwin/11/../../../../x86_64-pc-cygwin/bin/ld: > cannot find -lhwloc: No such file or directory > /usr/lib/gcc/x86_64-pc-cygwin/11/../../../../x86_64-pc-cygwin/bin/ld: > cannot find -levent_core: No such file or directory > /usr/lib/gcc/x86_64-pc-cygwin/11/../../../../x86_64-pc-cygwin/bin/ld: > cannot find -levent_pthreads: No such file or directory > /usr/lib/gcc/x86_64-pc-cygwin/11/../../../../x86_64-pc-cygwin/bin/ld: > cannot find -lz: No such file or directory > collect2: error: ld returned 1 exit status: > Error testing C compiler: Cannot compile/link C with mpicc. > MPI compiler wrapper mpicc failed to compile > Executing: mpicc -show > stdout: gcc -L/usr/lib -lmpi -lopen-rte -lopen-pal -lhwloc -levent_core > -levent_pthreads -lz > MPI compiler wrapper mpicc is likely incorrect. > Use --with-mpi-dir to indicate an alternate MPI. > Deleting "CC" > > ******************************************************************************* > UNABLE to CONFIGURE with GIVEN OPTIONS (see configure.log for > details): > > ------------------------------------------------------------------------------- > C compiler you provided with -with-cc=mpicc cannot be found or does not > work. > Cannot compile/link C with mpicc. 
> > ******************************************************************************* > File "/home/SEJONG/petsc-3.18.1/config/configure.py", line 461, in > petsc_configure > framework.configure(out = sys.stdout) > File "/home/SEJONG/petsc-3.18.1/config/BuildSystem/config/framework.py", > line 1412, in configure > self.processChildren() > File "/home/SEJONG/petsc-3.18.1/config/BuildSystem/config/framework.py", > line 1400, in processChildren > self.serialEvaluation(self.childGraph) > File "/home/SEJONG/petsc-3.18.1/config/BuildSystem/config/framework.py", > line 1375, in serialEvaluation > child.configure() > File > "/home/SEJONG/petsc-3.18.1/config/BuildSystem/config/setCompilers.py", line > 2712, in configure > self.executeTest(self.checkCCompiler) > File "/home/SEJONG/petsc-3.18.1/config/BuildSystem/config/base.py", line > 138, in executeTest > ret = test(*args,**kargs) > File > "/home/SEJONG/petsc-3.18.1/config/BuildSystem/config/setCompilers.py", line > 1346, in checkCCompiler > for compiler in self.generateCCompilerGuesses(): > File > "/home/SEJONG/petsc-3.18.1/config/BuildSystem/config/setCompilers.py", line > 1274, in generateCCompilerGuesses > raise RuntimeError('C compiler you provided with > -with-cc='+self.argDB['with-cc']+' cannot be found or does not > work.'+'\n'+self.mesg) > > ================================================================================ > Finishing configure run at Tue, 01 Nov 2022 13:06:09 +0900 > > -----Original Message----- > From: Satish Balay > Sent: Tuesday, November 1, 2022 11:36 AM > To: Mohammad Ali Yaqteen > Cc: petsc-users > Subject: RE: [petsc-users] PETSc Windows Installation > > you'll have to send configure.log for this failure > > Satish > > > On Tue, 1 Nov 2022, Mohammad Ali Yaqteen wrote: > > > I have checked the required Cygwin openmpi libraries and they are all > installed. 
When I run ./configure --with-cc=mpicc --with-cxx=mpicxx > --with-fc=mpif90, it returns: > > > > $ ./configure --with-cc=mpicc --with-cxx=mpicxx --with-fc=mpif90 > > > ============================================================================================= > > Configuring PETSc to compile on your system > > ====================================================================== > > ======================= > > TESTING: checkCCompiler from > config.setCompilers(config/BuildSystem/config/setCompilers.py:1341)******************************************************************************* > > UNABLE to CONFIGURE with GIVEN OPTIONS (see configure.log > for details): > > ---------------------------------------------------------------------- > > --------- C compiler you provided with -with-cc=mpicc cannot be found > > or does not work. > > Cannot compile/link C with mpicc. > > > > As for the case of WSL2, I will try to install that on my PC. > > Meanwhile, could you please look into this issue > > > > Thank you > > > > Ali > > > > -----Original Message----- > > From: Satish Balay > > Sent: Monday, October 31, 2022 10:56 PM > > To: Satish Balay via petsc-users > > Cc: Matthew Knepley ; Mohammad Ali Yaqteen > > > > Subject: Re: [petsc-users] PETSc Windows Installation > > > > BTW: If you have WSL2 on windows - it might be easier to build/use PETSc. 
> > > > Satish > > > > On Mon, 31 Oct 2022, Satish Balay via petsc-users wrote: > > > > > Make sure you have cygwin openmpi installed [and cygwin blas/lapack] > > > > > > $ cygcheck -cd |grep openmpi > > > libopenmpi-devel 4.1.2-1 > > > libopenmpi40 4.1.2-1 > > > libopenmpifh40 4.1.2-1 > > > libopenmpiusef08_40 4.1.2-1 > > > libopenmpiusetkr40 4.1.2-1 > > > openmpi 4.1.2-1 > > > $ cygcheck -cd |grep lapack > > > liblapack-devel 3.10.1-1 > > > liblapack0 3.10.1-1 > > > > > > > > > > ./configure --with-cc=gcc --with-cxx=0 --with-fc=0 > > > > --download-f2cblaslapack > > > > > > Should be: > > > > > > > > $ ./configure --download-scalapack --download-mumps > > > > > > i.e. [default] --with-cc=mpicc --with-cxx=mpicxx --with-fc=mpif90 [and > > > default cygwin blas/lapack] > > > > > > Satish > > > > > > > > > On Mon, 31 Oct 2022, Matthew Knepley wrote: > > > > > > > On Mon, Oct 31, 2022 at 1:56 AM Mohammad Ali Yaqteen > > > > > > > > wrote: > > > > > > > > > Dear Satish > > > > > > > > > > When I configure PETSc with (./configure --with-cc=gcc > > > > > --with-cxx=0 > > > > > --with-fc=0 --download-f2cblaslapack) it runs as I shared > > > > > initially which you said is not an issue anymore. 
But when I add > > > > > (--download-scalapack > > > > > --download-mumps) or configure with these later, it gives the > > > > > following > > > > > error: > > > > > > > > > > $ ./configure --download-scalapack --download-mumps > > > > > > > > > > > ============================================================================================= > > > > > Configuring PETSc to compile on your > > > > > system > > > > > > > > > > ================================================================ > > > > > == > > > > > =========================== > > > > > TESTING: FortranMPICheck from > > > > > > config.packages.MPI(config/BuildSystem/config/packages/MPI.py:614)******************************************************************************* > > > > > UNABLE to CONFIGURE with GIVEN OPTIONS (see > configure.log for > > > > > details): > > > > > > > > > > ---------------------------------------------------------------- > > > > > -- > > > > > ------------- Fortran error! mpi_init() could not be located! > > > > > > > > > > **************************************************************** > > > > > ** > > > > > ************* > > > > > > > > > > What could be the problem here? > > > > > > > > > > > > > Without configure.log we cannot tell what went wrong. However, > > > > from the error message, I would guess that your MPI was not built > > > > with Fortran bindings. You need these for those packages. > > > > > > > > Thanks, > > > > > > > > Matt > > > > > > > > > > > > > Your help is highly appreciated. > > > > > > > > > > Thank you > > > > > Ali > > > > > > > > > > -----Original Message----- > > > > > From: Satish Balay > > > > > Sent: Saturday, October 29, 2022 2:11 PM > > > > > To: Mohammad Ali Yaqteen > > > > > Cc: Matthew Knepley ; petsc-users at mcs.anl.gov > > > > > Subject: Re: [petsc-users] PETSc Windows Installation > > > > > > > > > > On Sat, 29 Oct 2022, Mohammad Ali Yaqteen wrote: > > > > > > > > > > > I haven't accessed PETSC or given any command of my own. 
I was > > > > > > just > > > > > installing by following the instructions. I don't know why it is > > > > > attaching the debugger. Although it says "Possible error running > > > > > C/C++ > > > > > src/snes/tutorials/ex19 with 1 MPI process" which I think > > > > > indicates that MPI is missing! > > > > > > > > > > The diff is not smart enough to detect the extra message from > > > > > cygwin/OpenMPI - hence it assumes there is a potential problem - > > > > > and prints the above message. > > > > > > > > > > But you can assume it's installed properly - and use it. > > > > > > > > > > Satish > > > > > > > > > > > > From: Matthew Knepley > > > > > > Sent: Friday, October 28, 2022 10:31 PM > > > > > > To: Mohammad Ali Yaqteen > > > > > > Cc: petsc-users at mcs.anl.gov > > > > > > Subject: Re: [petsc-users] PETSc Windows Installation > > > > > > > > > > > > On Fri, Oct 28, 2022 at 9:11 AM Mohammad Ali Yaqteen < > > > > > mhyaqteen at sju.ac.kr> wrote: > > > > > > Dear Sir, > > > > > > > > > > > > During the Installation of PETSc in windows, I installed > > > > > > Cygwin and the > > > > > required libraries as mentioned on your website: > > > > > > [cid:image001.png at 01D8EB93.7C17E410] > > > > > > However, when I install PETSc using the configure commands > > > > > > present on > > > > > the petsc website: > > > > > > > > > > > > ./configure --with-cc=gcc --with-cxx=0 --with-fc=0 > > > > > --download-f2cblaslapack --download-mpich > > > > > > > > > > > > it gives me the following error: > > > > > > > > > > > > [cid:image002.png at 01D8EB93.7C17E410] > > > > > > > > > > > > I already installed OpenMPI using Cygwin installer but it > > > > > > still asks me > > > > > to. When I configure without "--download-mpich" and run "make check" > > > > > command, it gives me the following errors: > > > > > > > > > > > > [cid:image003.png at 01D8EB93.7C17E410] > > > > > > > > > > > > Could you kindly look into this and help me with this? 
Your > > > > > > prompt > > > > > response will highly be appreciated. > > > > > > > > > > > > The runs look fine. > > > > > > > > > > > > The test should not try to attach the debugger. Do you have > > > > > > that in the > > > > > PETSC_OPTIONS env variable? > > > > > > > > > > > > Thanks, > > > > > > > > > > > > Matt > > > > > > > > > > > > Thank you! > > > > > > Mohammad Ali > > > > > > Researcher, Sejong University > > > > > > > > > > > > > > > > > > -- > > > > > > What most experimenters take for granted before they begin > > > > > > their > > > > > experiments is infinitely more interesting than any results to > > > > > which their experiments lead. > > > > > > -- Norbert Wiener > > > > > > > > > > > > https://www.cse.buffalo.edu/~knepley/< > > > > > http://www.cse.buffalo.edu/~knepley/> > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > -- > > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > > > > https://www.cse.buffalo.edu/~knepley/ > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From balay at mcs.anl.gov Tue Nov 1 00:40:27 2022 From: balay at mcs.anl.gov (Satish Balay) Date: Tue, 1 Nov 2022 00:40:27 -0500 (CDT) Subject: [petsc-users] PETSc Windows Installation In-Reply-To: References: <2db12320-25ab-7911-4bb6-ff0195f5ffdc@mcs.anl.gov> <461d2b54-173d-95fa-6ad5-9ce81849871e@mcs.anl.gov> Message-ID: <8c7b16a0-f933-92fe-f54a-337bcd88455a@mcs.anl.gov> > /usr/lib/gcc/x86_64-pc-cygwin/11/../../../../x86_64-pc-cygwin/bin/ld: cannot find -lhwloc: No such file or directory > /usr/lib/gcc/x86_64-pc-cygwin/11/../../../../x86_64-pc-cygwin/bin/ld: cannot find -levent_core: No such file or directory > /usr/lib/gcc/x86_64-pc-cygwin/11/../../../../x86_64-pc-cygwin/bin/ld: cannot find -levent_pthreads: No such file or directory > /usr/lib/gcc/x86_64-pc-cygwin/11/../../../../x86_64-pc-cygwin/bin/ld: cannot find -lz: No such file or directory For some reason cygwin has broken dependencies here. Run cygwin setup and install the following pkgs. $ cygcheck.exe -f /usr/lib/libhwloc.dll.a /usr/lib/libevent_core.dll.a /usr/lib/libevent_pthreads.dll.a /usr/lib/libz.dll.a libevent-devel-2.1.12-1 libevent-devel-2.1.12-1 libhwloc-devel-2.6.0-2 zlib-devel-1.2.12-1 BTW: you can attach the file from PETSC_DIR/PETSC_ARCH/lib/petsc/conf/configure.log Satish On Tue, 1 Nov 2022, Mohammad Ali Yaqteen wrote: > I am unable to attach the configure.log file. Hence. 
I have copied the following text after executing the command (less configure.log) in the cygwin64 > > Executing: uname -s > stdout: CYGWIN_NT-10.0-19044 > ============================================================================================= > Configuring PETSc to compile on your system > ============================================================================================= > > ================================================================================ > ================================================================================ > Starting configure run at Tue, 01 Nov 2022 13:06:06 +0900 > Configure Options: --configModules=PETSc.Configure --optionsModule=config.compilerOptions --with-cc=mpicc --with-cxx=mpicxx --with-fc=mpif90 > Working directory: /home/SEJONG/petsc-3.18.1 > Machine platform: > uname_result(system='CYGWIN_NT-10.0-19044', node='DESKTOP-R1C768B', release='3.3.6-341.x86_64', version='2022-09-05 11:15 UTC', machine='x86_64') > Python version: > 3.9.10 (main, Jan 20 2022, 21:37:52) > [GCC 11.2.0] > ================================================================================ > Environmental variables > USERDOMAIN=DESKTOP-R1C768B > OS=Windows_NT > COMMONPROGRAMFILES=C:\Program Files\Common Files > PROCESSOR_LEVEL=6 > PSModulePath=C:\Users\SEJONG\Documents\WindowsPowerShell\Modules;C:\Program Files\WindowsPowerShell\Modules;C:\Windows\system32\WindowsPowerShell\v1.0\Modules > CommonProgramW6432=C:\Program Files\Common Files > CommonProgramFiles(x86)=C:\Program Files (x86)\Common Files > LANG=en_US.UTF-8 > TZ=Asia/Seoul > HOSTNAME=DESKTOP-R1C768B > PUBLIC=C:\Users\Public > OLDPWD=/home/SEJONG > USERNAME=SEJONG > LOGONSERVER=\\DESKTOP-R1C768B > PROCESSOR_ARCHITECTURE=AMD64 > LOCALAPPDATA=C:\Users\SEJONG\AppData\Local > COMPUTERNAME=DESKTOP-R1C768B > USER=SEJONG > !::=::\ > SYSTEMDRIVE=C: > USERPROFILE=C:\Users\SEJONG > PATHEXT=.COM;.EXE;.BAT;.CMD;.VBS;.VBE;.JS;.JSE;.WSF;.WSH;.MSC;.CPL > SYSTEMROOT=C:\Windows > 
USERDOMAIN_ROAMINGPROFILE=DESKTOP-R1C768B > OneDriveCommercial=C:\Users\SEJONG\OneDrive - Sejong University > PROCESSOR_IDENTIFIER=Intel64 Family 6 Model 165 Stepping 5, GenuineIntel > GNUPLOT_LIB=C:\Program Files\gnuplot\demo;C:\Program Files\gnuplot\demo\games;C:\Program Files\gnuplot\share > PWD=/home/SEJONG/petsc-3.18.1 > MSMPI_BIN=C:\Program Files\Microsoft MPI\Bin\ > HOME=/home/SEJONG > TMP=/tmp > OneDrive=C:\Users\SEJONG\OneDrive - Sejong University > ZES_ENABLE_SYSMAN=1 > !C:=C:\cygwin64\bin > PROCESSOR_REVISION=a505 > PROFILEREAD=true > PROMPT=$P$G > NUMBER_OF_PROCESSORS=16 > ProgramW6432=C:\Program Files > COMSPEC=C:\Windows\system32\cmd.exe > APPDATA=C:\Users\SEJONG\AppData\Roaming > SHELL=/bin/bash > TERM=xterm-256color > WINDIR=C:\Windows > ProgramData=C:\ProgramData > SHLVL=1 > PRINTER=\\210.107.220.119\HP Color LaserJet Pro MFP M377 PCL 6 > PROGRAMFILES=C:\Program Files > ALLUSERSPROFILE=C:\ProgramData > TEMP=/tmp > DriverData=C:\Windows\System32\Drivers\DriverData > SESSIONNAME=Console > ProgramFiles(x86)=C:\Program Files (x86) > PATH=/usr/local/bin:/usr/bin:/cygdrive/c/SIMULIA/Commands:/cygdrive/c/Program Files/Microsoft MPI/Bin:/cygdrive/c/Windows/system32:/cygdrive/c/Windows:/cygdrive/c/Windows/System32/Wbem:/cygdrive/c/Windows/System32/WindowsPowerShell/v1.0:/cygdrive/c/Windows/System32/OpenSSH:/cygdrive/c/Program Files/MATLAB/R2020b/bin:/cygdrive/c/Program Files/Microsoft SQL Server/130/Tools/Binn:/cygdrive/c/Program Files/Microsoft SQL Server/Client SDK/ODBC/170/Tools/Binn:/cygdrive/c/Program Files/Git/cmd:/cygdrive/c/msys64/mingw64/bin:/cygdrive/c/msys64/usr/bin:/cygdrive/c/Program Files (x86)/Microsoft Visual Studio/2019/Community/VC/Tools/MSVC/14.29.30133/bin/Hostx64/x64:/cygdrive/c/Program Files/dotnet:/:/cygdrive/c/Users/SEJONG/AppData/Local/Microsoft/WindowsApps:/cygdrive/c/Users/SEJONG/AppData/Local/Programs/Microsoft VS Code/bin:/cygdrive/c/Program Files (x86)/Microsoft Visual 
Studio/2019/Community/VC/Tools/MSVC/14.29.30133/bin/Hostx64/x64:/cygdrive/c/Users/SEJONG/.dotnet/tools:/usr/li b/lapack > PS1=\[\e]0;\w\a\]\n\[\e[32m\]\u@\h \[\e[33m\]\w\[\e[0m\]\n\$ > HOMEDRIVE=C: > INFOPATH=/usr/local/info:/usr/share/info:/usr/info > HOMEPATH=\Users\SEJONG > ORIGINAL_PATH=/cygdrive/c/SIMULIA/Commands:/cygdrive/c/Program Files/Microsoft MPI/Bin:/cygdrive/c/Windows/system32:/cygdrive/c/Windows:/cygdrive/c/Windows/System32/Wbem:/cygdrive/c/Windows/System32/WindowsPowerShell/v1.0:/cygdrive/c/Windows/System32/OpenSSH:/cygdrive/c/Program Files/MATLAB/R2020b/bin:/cygdrive/c/Program Files/Microsoft SQL Server/130/Tools/Binn:/cygdrive/c/Program Files/Microsoft SQL Server/Client SDK/ODBC/170/Tools/Binn:/cygdrive/c/Program Files/Git/cmd:/cygdrive/c/msys64/mingw64/bin:/cygdrive/c/msys64/usr/bin:/cygdrive/c/Program Files (x86)/Microsoft Visual Studio/2019/Community/VC/Tools/MSVC/14.29.30133/bin/Hostx64/x64:/cygdrive/c/Program Files/dotnet:/:/cygdrive/c/Users/SEJONG/AppData/Local/Microsoft/WindowsApps:/cygdrive/c/Users/SEJONG/AppData/Local/Programs/Microsoft VS Code/bin:/cygdrive/c/Program Files (x86)/Microsoft Visual Studio/2019/Community/VC/Tools/MSVC/14.29.30133/bin/Hostx64/x64:/cygdrive/c/Users/SEJONG/.dotnet/tools > EXECIGNORE=*.dll > _=./configure > Files in path provided by default path > /usr/local/bin: > /usr/bin: addftinfo.exe addr2line.exe apropos ar.exe arch.exe as.exe ash.exe awk b2sum.exe base32.exe base64.exe basename.exe basenc.exe bash.exe bashbug bomtool.exe bunzip2.exe bzcat.exe bzcmp bzdiff bzegrep bzfgrep bzgrep bzip2.exe bzip2recover.exe bzless bzmore c++.exe c++filt.exe c89 c99 ca-legacy cal.exe captoinfo cat.exe catman.exe cc ccmake.exe chattr.exe chcon.exe chgrp.exe chmod.exe chown.exe chroot.exe chrt.exe cksum.exe clear.exe cmake.exe cmp.exe col.exe colcrt.exe colrm.exe column.exe comm.exe cp.exe cpack.exe cpp.exe csplit.exe ctest.exe cut.exe cygarchive-13.dll cygargp-0.dll cygatomic-1.dll cygattr-1.dll cygblkid-1.dll 
cygbrotlicommon-1.dll cygbrotlidec-1.dll cygbz2-1.dll cygcheck.exe cygcom_err-2.dll cygcrypt-2.dll cygcrypto-1.1.dll cygcurl-4.dll cygdb-5.3.dll cygdb_cxx-5.3.dll cygdb_sql-5.3.dll cygedit-0.dll cygevent-2-1-7.dll cygevent_core-2-1-7.dll cygevent_extra-2-1-7.dll cygevent_openssl-2-1-7.dll cygevent_pthreads-2-1-7.dll cygexpat-1.dll cygfdi sk-1.dll cygffi-6.dll cygfido2-1.dll cygformw-10.dll cyggc-1.dll cyggcc_s-seh-1.dll cyggdbm-6.dll cyggdbm_compat-4.dll cyggfortran-5.dll cyggmp-10.dll cyggomp-1.dll cyggsasl-7.dll cyggssapi_krb5-2.dll cygguile-2.2-1.dll cyghistory7.dll cyghwloc-15.dll cygiconv-2.dll cygidn-12.dll cygidn2-0.dll cygintl-8.dll cygisl-23.dll cygjsoncpp-25.dll cygk5crypto-3.dll cygkrb5-3.dll cygkrb5support-0.dll cyglber-2-4-2.dll cyglber-2.dll cygldap-2-4-2.dll cygldap-2.dll cygldap_r-2-4-2.dll cygltdl-7.dll cyglz4-1.dll cyglzma-5.dll cyglzo2-2.dll cygmagic-1.dll cygman-2-11-0.dll cygmandb-2-11-0.dll cygmenuw-10.dll cygmpc-3.dll cygmpfr-6.dll cygmpi-40.dll cygmpi_mpifh-40.dll cygmpi_usempif08-40.dll cygmpi_usempi_ignore_tkr-40.dll cygncursesw-10.dll cygnghttp2-14.dll cygntlm-0.dll cygopen-pal-40.dll cygopen-rte-40.dll cygp11-kit-0.dll cygpanelw-10.dll cygpath.exe cygpcre2-8-0.dll cygperl5_32.dll cygpipeline-1.dll cygpkgconf-4.dll cygpopt-0.dll cygpsl-5.dll cygquadmath-0.dll cygreadline7.dll cygrhash-0.dl l cygrun srv.exe cygsasl2-3.dll cygserver-config cygsigsegv-2.dll cygsmartcols-1.dll cygsqlite3-0.dll cygssh2-1.dll cygssl-1.1.dll cygstart.exe cygstdc++-6.dll cygtasn1-6.dll cygticw-10.dll cygunistring-2.dll cyguuid-1.dll cyguv-1.dll cygwin-console-helper.exe cygwin1.dll cygxml2-2.dll cygxxhash-0.dll cygz.dll cygzstd-1.dll dash.exe date.exe dd.exe df.exe diff.exe diff3.exe dir.exe dircolors.exe dirname.exe dlltool.exe dllwrap.exe dnsdomainname domainname du.exe dumper.exe echo.exe editrights.exe egrep elfedit.exe env.exe eqn.exe eqn2graph ex expand.exe expr.exe f95 factor.exe false.exe fgrep fido2-assert.exe fido2-cred.exe fido2-token.exe file.exe 
find.exe flock.exe fmt.exe fold.exe g++.exe gawk-5.1.1.exe gawk.exe gcc-ar.exe gcc-nm.exe gcc-ranlib.exe gcc.exe gcov-dump.exe gcov-tool.exe gcov.exe gdiffmk gencat.exe getconf.exe getent.exe getfacl.exe getopt.exe gfortran.exe git-receive-pack.exe git-shell.exe git-upload-archive.exe git-upload-pack.exe git.exe gkill.exe gmondump.exe gpro f.exe gr ap2graph grep.exe grn.exe grodvi.exe groff.exe grolbp.exe grolj4.exe grops.exe grotty.exe groups.exe gunzip gzexe gzip.exe head.exe hexdump.exe hostid.exe hostname.exe hpftodit.exe i686-w64-mingw32-pkg-config id.exe indxbib.exe info.exe infocmp.exe infotocap install-info.exe install.exe ipcmk.exe ipcrm.exe ipcs.exe isosize.exe join.exe kill.exe lastlog.exe ld.bfd.exe ld.exe ldd.exe ldh.exe less.exe lessecho.exe lesskey.exe lexgrog.exe libpython3.9.dll link-cygin.exe lkbib.exe ln.exe locale.exe locate.exe logger.exe login.exe logname.exe look.exe lookbib.exe ls.exe lsattr.exe lto-dump.exe lzcat lzcmp lzdiff lzegrep lzfgrep lzgrep lzless lzma lzmadec.exe lzmainfo.exe lzmore make-dummy-cert make.exe man-recode.exe man.exe mandb.exe manpath.exe mcookie.exe md5sum.exe minidumper.exe mintheme mintty.exe mkdir.exe mkfifo.exe mkgroup.exe mknod.exe mkpasswd.exe mkshortcut.exe mktemp.exe more.exe mount.exe mpic++ mpicc mpicxx mpiexec mpif77 mpif90 mpifort mpirun mv.exe namei.exe neqn nice.exe nl.exe nm.exe nohup.exe nproc.exe nroff numfmt.exe objcopy.exe objdump.exe od.exe ompi-clean ompi-server ompi_info.exe opal_wrapper.exe openssl.exe orte-clean.exe orte-info.exe orte-server.exe ortecc orted.exe orterun.exe p11-kit.exe passwd.exe paste.exe pathchk.exe pdfroff peflags.exe peflagsall perl.exe perl5.32.1.exe pfbtops.exe pg.exe pic.exe pic2graph pinky.exe pip3 pip3.9 pkg-config pkgconf.exe pldd.exe post-grohtml.exe pr.exe pre-grohtml.exe preconv.exe printenv.exe printf.exe profiler.exe ps.exe ptx.exe pwd.exe pydoc3 pydoc3.9 python python3 python3.9.exe pzstd.exe ranlib.exe readelf.exe readlink.exe readshortcut.exe realpath.exe 
rebase-trigger rebase.exe rebaseall rebaselst refer.exe regtool.exe rename.exe renew-dummy-cert renice.exe reset rev.exe rm.exe rmdir.exe rsync-ssl rsync.exe run.exe runcon.exe rvi rview scalar.exe scp.exe script.exe scriptreplay.exe sdiff.exe sed.exe seq.exe setfacl.exe setmetamode.exe setsid.exe sftp.exe sh.exe sha1sum.exe sha224sum.exe sha256sum.exe sha384sum.exe sha512sum.exe shred.exe shuf.exe size.exe sleep.exe slogin soelim.exe sort.exe split.exe ssh-add.exe ssh-agent.exe ssh-copy-id ssh-host-config ssh-keygen.exe ssh-keyscan.exe ssh-user-config ssh.exe ssp.exe stat.exe stdbuf.exe strace.exe strings.exe strip.exe stty.exe sum.exe sync.exe tabs.exe tac.exe tail.exe tar.exe taskset.exe tbl.exe tee.exe test.exe tfmtodit.exe tic.exe timeout.exe toe.exe touch.exe tput.exe tr.exe troff.exe true.exe truncate.exe trust.exe tset.exe tsort.exe tty.exe tzselect tzset.exe ul.exe umount.exe uname.exe unexpand.exe uniq.exe unlink.exe unlzma unxz unzstd update-ca-trust update-crypto-policies updatedb users.exe uuidgen.exe uuidparse.exe vdir.exe vi.exe view wc.exe whatis.exe whereis.exe which.exe who.exe whoami.exe windmc.exe windres.exe x86_64-pc-cygwin-c++.exe x86_64-pc-cygwin-g++.exe x86_64-pc-cygwin-gcc-11.exe x86_64-pc-cygwin-gcc-ar.exe x86_64-pc-cygwin-gcc-nm.exe x86_64-pc-cygwin-gcc-ranlib.exe x86_64-pc-cygwin-gcc.exe x86_64-pc-cygwin-gfortran.exe x86_64-pc-cygwin-pkg-config x86_64-w64-mingw32-pkg-config xargs.exe xmlcatalog.exe xmllint.exe xz.exe xzcat xzcmp xzdec.exe xzdiff xzegrep xzfgrep xzgrep xzless xzmore yes.exe zcat zcmp zdiff zdump.exe zegrep zfgrep zforce zgrep zless zmore znew zstd.exe zstdcat zstdgrep zstdless zstdmt [.exe
> /cygdrive/c/SIMULIA/Commands: abaqus.bat abq2018.bat abq_cae_open.bat abq_odb_open.bat
> /cygdrive/c/Program Files/Microsoft MPI/Bin: mpiexec.exe mpitrace.man smpd.exe
> [listing of the Windows system directory elided: several thousand OS DLLs, executables, and resource files with no bearing on the configure failure]
> /cygdrive/c/Windows: addins AhnInst.log appcompat Application Data apppatch AppReadiness assembly bcastdvr bfsvc.exe BitLockerDiscoveryVolumeContents Boot bootstat.dat Branding CbsTemp Containers CSC Cursors debug diagnostics DiagTrack DigitalLocker Downloaded Program Files DtcInstall.log ELAMBKUP en-US explorer.exe Fonts GameBarPresenceWriter gethelp_audiotroubleshooter_latestpackage.zip Globalization Help HelpPane.exe hh.exe hipiw.dll IdentityCRL ImageSAFERSvc.exe IME IMGSF50Svc.exe ImmersiveControlPanel INF InputMethod Installer ko-KR L2Schemas LanguageOverlayCache LiveKernelReports Logs lsasetup.log Media mib.bin Microsoft.NET Migration ModemLogs notepad.exe OCR Offline Web Pages Panther Performance PFRO.log PLA PolicyDefinitions Prefetch PrintDialog Professional.xml Provisioning regedit.exe Registration RemotePackages rescache Resources RtlExUpd.dll SchCache schemas security ServiceProfiles ServiceState servicing Setup setupact.log setuperr.log ShellComponents ShellExperiences SHELLNEW SKB SoftwareDistribution Speech Speech_OneCore splwow64.
> exe System system.ini System32 SystemApps SystemResources SystemTemp SysWOW64 TAPI Tasks Temp TempInst tracing twain_32 twain_32.dll Vss WaaS Web win.ini WindowsShell.Manifest WindowsUpdate.log winhlp32.exe WinSxS WMSysPr9.prx write.exe > /cygdrive/c/Windows/System32/Wbem: aeinv.mof AgentWmi.mof AgentWmiUninstall.mof appbackgroundtask.dll appbackgroundtask.mof appbackgroundtask_uninstall.mof AuditRsop.mof authfwcfg.mof AutoRecover bcd.mof BthMtpEnum.mof cimdmtf.mof cimwin32.dll cimwin32.mof CIWm > i.mof classlog.mof cli.mof cliegaliases.mof ddp.mof dimsjob.mof dimsroam.mof DMWmiBridgeProv.dll DMWmiBridgeProv.mof DMWmiBridgeProv1.dll DMWmiBridgeProv1.mof DMWmiBridgeProv1_Uninstall.mof DMWmiBridgeProv_Uninstall.mof dnsclientcim.dll dnsclientcim.mof dnsclientpspr > ovider.dll dnsclientpsprovider.mof dnsclientpsprovider_Uninstall.mof drvinst.mof DscCore.mof DscCoreConfProv.mof dscproxy.mof Dscpspluginwkr.dll DscTimer.mof dsprov.dll dsprov.mof eaimeapi.mof EmbeddedLockdownWmi.dll embeddedlockdownwmi.mof embeddedlockdownwmi_Uninst > all.mof en en-US esscli.dll EventTracingManagement.dll EventTracingManagement.mof fastprox.dll fdPHost.mof fdrespub.mof fdSSDP.mof fdWNet.mof fdWSD.mof filetrace.mof firewallapi.mof FolderRedirectionWMIProvider.mof FunDisc.mof fwcfg.mof hbaapi.mof hnetcfg.mof IMAPIv2 > -Base.mof IMAPIv2-FileSystemSupport.mof IMAPIv2-LegacyShim.mof interop.mof IpmiDTrc.mof ipmiprr.dll ipmiprv.dll ipmiprv.mof IpmiPTrc.mof ipsecsvc.mof iscsidsc.mof iscsihba.mof iscsiprf.mof iscsirem.mof iscsiwmiv2.mof iscsiwmiv2_uninstall.mof kerberos.mof ko ko-KR Krn > lProv.dll krnlprov.mof L2SecHC.mof lltdio.mof lltdsvc.mof Logs lsasrv.mof mblctr.mof MDMAppProv.dll MDMAppProv.mof MDMAppProv_Uninstall.mof MDMSettingsProv.dll MDMSettingsProv.mof MDMSettingsProv_Uninstall.mof Microsoft-Windows-OfflineFiles.mof Microsoft-Windows-Remo > te-FileSystem.mof Microsoft.AppV.AppVClientWmi.dll Microsoft.AppV.AppVClientWmi.mof Microsoft.Uev.AgentWmi.dll Microsoft.Uev.ManagedAgentWmi.mof 
Microsoft.Uev.ManagedAgentWmiUninstall.mof mispace.mof mispace_uninstall.mof mmc.mof MMFUtil.dll MOF mofcomp.exe mofd.dll > mofinstall.dll mountmgr.mof mpeval.mof mpsdrv.mof mpssvc.mof msdtcwmi.dll MsDtcWmi.mof msfeeds.mof msfeedsbs.mof msi.mof msiprov.dll msiscsi.mof MsNetImPlatform.mof mstsc.mof mstscax.mof msv1_0.mof mswmdm.mof NCProv.dll ncprov.mof ncsi.mof ndisimplatcim.dll ndistrace > .mof NetAdapterCim.dll NetAdapterCim.mof NetAdapterCimTrace.mof NetAdapterCimTraceUninstall.mof NetAdapterCim_uninstall.mof netdacim.dll netdacim.mof netdacim_uninstall.mof NetEventPacketCapture.dll NetEventPacketCapture.mof NetEventPacketCapture_uninstall.mof netncc > im.dll netnccim.mof netnccim_uninstall.mof NetPeerDistCim.dll NetPeerDistCim.mof NetPeerDistCim_uninstall.mof netprofm.mof NetSwitchTeam.mof netswitchteamcim.dll NetTCPIP.dll NetTCPIP.mof NetTCPIP_Uninstall.mof netttcim.dll netttcim.mof netttcim_uninstall.mof network > itemfactory.mof newdev.mof nlasvc.mof nlmcim.dll nlmcim.mof nlmcim_uninstall.mof nlsvc.mof npivwmi.mof nshipsec.mof ntevt.dll ntevt.mof ntfs.mof OfflineFilesConfigurationWmiProvider.mof OfflineFilesConfigurationWmiProvider_Uninstall.mof OfflineFilesWmiProvider.mof Of > flineFilesWmiProvider_Uninstall.mof p2p-mesh.mof p2p-pnrp.mof pcsvDevice.mof pcsvDevice_Uninstall.mof Performance PNPXAssoc.mof PolicMan.dll PolicMan.mof polproc.mof polprocl.mof polprou.mof polstore.mof portabledeviceapi.mof portabledeviceclassextension.mof portable > deviceconnectapi.mof portabledevicetypes.mof portabledevicewiacompat.mof powermeterprovider.mof PowerPolicyProvider.mof ppcRsopCompSchema.mof ppcRsopUserSchema.mof PrintFilterPipelineSvc.mof PrintManagementProvider.dll PrintManagementProvider.mof PrintManagementProvider_Uninstall.mof profileassociationprovider.mof PS_MMAgent.mof qmgr.mof qoswmi.dll qoswmi.mof qoswmitrc.mof qoswmitrc_uninstall.mof qoswmi_uninstall.mof RacWmiProv.dll RacWmiProv.mof rawxml.xsl rdpendp.mof rdpinit.mof rdpshell.mof refs.mof refsv1.mof 
regevent.mof Remove.Microsoft.AppV.AppvClientWmi.mof repdrvfs.dll Repository rsop.mof rspndr.mof samsrv.mof scersop.mof schannel.mof schedprov.dll SchedProv.mof scm.mof scrcons.exe scrcons.mof sdbus.mof secrcw32.mof SensorsClassExtension.mof ServDeps.dll ServiceModel.mof ServiceModel.mof.uninstall ServiceModel35.mof ServiceModel35.mof.uninstall services.mof setupapi.mof SmbWitnessWmiv2Provider.mof smbwmiv2.mof SMTPCons.dll smtpcons.mof sppwmi.mof sr.mof sstpsvc.mof stdprov .dll storagewmi.mof storagewmi_passthru.mof storagewmi_passthru_uninstall.mof storagewmi_uninstall.mof stortrace.mof subscrpt.mof system.mof tcpip.mof texttable.xsl textvaluelist.xsl tmf tsallow.mof tscfgwmi.mof tsmf.mof tspkg.mof umb.mof umbus.mof umpass.mof umpnpmgr.mof unsecapp.exe UserProfileConfigurationWmiProvider.mof UserProfileWmiProvider.mof UserStateWMIProvider.mof vds.mof vdswmi.dll viewprov.dll vpnclientpsprovider.dll vpnclientpsprovider.mof vpnclientpsprovider_Uninstall.mof vss.mof vsswmi.dll wbemcntl.dll wbemcons.dll WBEMCons.mof wbemcore.dll wbemdisp.dll wbemdisp.tlb wbemess.dll wbemprox.dll wbemsvc.dll wbemtest.exe wcncsvc.mof WdacEtwProv.mof WdacWmiProv.dll WdacWmiProv.mof WdacWmiProv_Uninstall.mof Wdf01000.mof Wdf01000Uninstall.mof wdigest.mof WFAPIGP.mof wfascim.dll wfascim.mof wfascim_uninstall.mof WFP.MOF wfs.mof whqlprov.mof Win32_DeviceGuard.mof Win32_EncryptableVolume.dll win32_encryptablevolume.mof Win32_EncryptableVolumeUninstall.mof win32_printer.m of Win32 _Tpm.dll Win32_Tpm.mof wininit.mof winipsec.mof winlogon.mof WinMgmt.exe WinMgmtR.dll Winsat.mof WinsatUninstall.mof wlan.mof WLanHC.mof wmi.mof WMIADAP.exe WmiApRes.dll WmiApRpl.dll WmiApSrv.exe WMIC.exe WMICOOKR.dll WmiDcPrv.dll wmipcima.dll wmipcima.mof wmipdfs.dll wmipdfs.mof wmipdskq.dll wmipdskq.mof WmiPerfClass.dll WmiPerfClass.mof WmiPerfInst.dll WmiPerfInst.mof WMIPICMP.dll wmipicmp.mof WMIPIPRT.dll wmipiprt.mof WMIPJOBJ.dll wmipjobj.mof wmiprov.dll WmiPrvSD.dll WmiPrvSE.exe WMIPSESS.dll wmipsess.mof 
WMIsvc.dll wmitimep.dll wmitimep.mof wmiutils.dll WMI_Tracing.mof wmp.mof wmpnetwk.mof wpdbusenum.mof wpdcomp.mof wpdfs.mof wpdmtp.mof wpdshext.mof WPDShServiceObj.mof wpdsp.mof wpd_ci.mof wscenter.mof WsmAgent.mof WsmAgentUninstall.mof WsmAuto.mof wsp_fs.mof wsp_fs_uninstall.mof wsp_health.mof wsp_health_uninstall.mof wsp_sr.mof wsp_sr_uninstall.mof WUDFx.mof Wudfx02000.mof Wudfx02000Uninstall.mof WUDFxUninstall.mof xml xsl-mappings.xml xwizards.mof > /cygdrive/c/Windows/System32/WindowsPowerShell/v1.0: Certificate.format.ps1xml Diagnostics.Format.ps1xml DotNetTypes.format.ps1xml en en-US Event.Format.ps1xml Examples FileSystem.format.ps1xml getevent.types.ps1xml Help.format.ps1xml HelpV3.format.ps1xml ko ko-KR Modules powershell.exe powershell.exe.config PowerShellCore.format.ps1xml PowerShellTrace.format.ps1xml powershell_ise.exe powershell_ise.exe.config PSEvents.dll pspluginwkr.dll pwrshmsg.dll pwrshsip.dll Registry.format.ps1xml Schemas SessionConfig types.ps1xml typesv3.ps1xml WSMan.Format.ps1xml > /cygdrive/c/Windows/System32/OpenSSH: scp.exe sftp.exe ssh-add.exe ssh-agent.exe ssh-keygen.exe ssh-keyscan.exe ssh.exe > /cygdrive/c/Program Files/MATLAB/R2020b/bin: crash_analyzer.cfg icutzdata lcdata.xml lcdata.xsd lcdata_utf8.xml m3iregistry matlab.exe mex.bat mexext.bat util win32 win64 > /cygdrive/c/Program Files/Microsoft SQL Server/130/Tools/Binn: Resources SqlLocalDB.exe > /cygdrive/c/Program Files/Microsoft SQL Server/Client SDK/ODBC/170/Tools/Binn: batchparser.dll bcp.exe Resources SQLCMD.EXE xmlrw.dll > /cygdrive/c/Program Files/Git/cmd: git-gui.exe git-lfs.exe git.exe gitk.exe start-ssh-agent.cmd start-ssh-pageant.cmd > Warning accessing /cygdrive/c/msys64/mingw64/bin gives errors: [Errno 2] No such file or directory: '/cygdrive/c/msys64/mingw64/bin' > Warning accessing /cygdrive/c/msys64/usr/bin gives errors: [Errno 2] No such file or directory: '/cygdrive/c/msys64/usr/bin' > /cygdrive/c/Program Files (x86)/Microsoft Visual 
Studio/2019/Community/VC/Tools/MSVC/14.29.30133/bin/Hostx64/x64: 1033 asan_blacklist.txt atlprov.dll bscmake.exe c1.dll c1xx.dll c2.dll cfgpersist.dll cl.exe cl.exe.config clang_rt.asan_dbg_dynamic-x86_64.dll clang_rt.asan_dynamic-x86_64.dll ConcurrencyCheck.dll CppBuildInsights.dll CppBuildInsightsEtw.xml CppCoreCheck.dll cvtres.exe d3dcompiler_47.dll dpcmi.dll dumpbin.exe editbin.exe EnumIndex.dll EspXEngine.dll HResultCheck.dll KernelTraceControl.dll lib.exe link.exe link.exe.config llvm-symbolizer.exe LocalESPC.dll Microsoft.Diagnostics.Tracing.EventSource.dll Microsoft.VisualStudio.RemoteControl.dll Microsoft.VisualStudio.Telemetry.dll Microsoft.VisualStudio.Utilities.Internal.dll ml64.exe msobj140.dll mspdb140.dll mspdbcmf.exe mspdbcore.dll mspdbsrv.exe mspdbst.dll mspft140.dll msvcdis140.dll msvcp140.dll msvcp140_1.dll msvcp140_2.dll msvcp140_atomic_wait.dll msvcp140_codecvt_ids.dll Newtonsoft.Json.dll nmake.exe onecore perf_msvcbuildinsights.dll pgocvt.exe pgodb140.dll pgodriver.sys pgomgr.exe pgort140.dll pgosweep.exe System.Runtime.CompilerServices.Unsafe.dll tbbmalloc.dll undname.exe VariantClear.dll vcmeta.dll vcperf.exe vcruntime140.dll vcruntime140_1.dll vctip.exe xdcmake.exe xdcmake.exe.config > /cygdrive/c/Program Files/dotnet: dotnet.exe host LICENSE.txt packs sdk shared templates ThirdPartyNotices.txt > /: bin Cygwin-Terminal.ico Cygwin.bat Cygwin.ico dev etc home lib mpich-4.0.2 mpich-4.0.2.tar.gz sbin tmp usr var proc cygdrive > /cygdrive/c/Users/SEJONG/AppData/Local/Microsoft/WindowsApps: Backup GameBarElevatedFT_Alias.exe Microsoft.DesktopAppInstaller_8wekyb3d8bbwe Microsoft.MicrosoftEdge_8wekyb3d8bbwe Microsoft.SkypeApp_kzf8qxf38zg5c Microsoft.XboxGamingOverlay_8wekyb3d8bbwe MicrosoftEdge.exe python.exe python3.exe Skype.exe winget.exe > /cygdrive/c/Users/SEJONG/AppData/Local/Programs/Microsoft VS Code/bin: code code.cmd > /cygdrive/c/Program Files (x86)/Microsoft Visual Studio/2019/Community/VC/Tools/MSVC/14.29.30133/bin/Hostx64/x64: 1033 
asan_blacklist.txt atlprov.dll bscmake.exe c1.dll c1xx.dll c2.dll cfgpersist.dll cl.exe cl.exe.config clang_rt.asan_dbg_dynamic-x86_64.dll clang_rt.asan_dynamic-x86_64.dll ConcurrencyCheck.dll CppBuildInsights.dll CppBuildInsightsEtw.xml CppCoreCheck.dll cvtres.exe d3dcompiler_47.dll dpcmi.dll dumpbin.exe editbin.exe EnumIndex.dll EspXEngine.dll HResultCheck.dll KernelTraceControl.dll lib.exe link.exe link.exe.config llvm-symbolizer.exe LocalESPC.dll Microsoft.Diagnostics.Tracing.EventSource.dll Microsoft.VisualStudio.RemoteControl.dll Microsoft.VisualStudio.Telemetry.dll Microsoft.VisualStudio.Utilities.Internal.dll ml64.exe msobj140.dll mspdb140.dll mspdbcmf.exe mspdbcore.dll mspdbsrv.exe mspdbst.dll mspft140.dll msvcdis140.dll msvcp140.dll msvcp140_1.dll msvcp140_2.dll msvcp140_atomic_wait.dll msvcp140_codecvt_ids.dll Newtonsoft.Json.dll nmake.exe onecore perf_msvcbuildinsights.dll pgocvt.exe pgodb140.dll pgodriver.sys pgomgr.exe pgort140.dll pgosweep.exe System.Runtime.CompilerServices.Unsafe.dll tbbmalloc.dll undname.exe VariantClear.dll vcmeta.dll vcperf.exe vcruntime140.dll vcruntime140_1.dll vctip.exe xdcmake.exe xdcmake.exe.config > Warning accessing /cygdrive/c/Users/SEJONG/.dotnet/tools gives errors: [Errno 2] No such file or directory: '/cygdrive/c/Users/SEJONG/.dotnet/tools' > /usr/lib/lapack: cygblas-0.dll cyglapack-0.dll > ============================================================================================= > TESTING: configureExternalPackagesDir from config.framework(/home/SEJONG/petsc-3.18.1/config/BuildSystem/config/framework.py:1045) > Set alternative directory external packages are built in > serialEvaluation: initial cxxDialectRanges ('c++11', 'c++17') > serialEvaluation: new cxxDialectRanges ('c++11', 'c++17') > child config.utilities.macosFirewall took 0.000005 seconds > ============================================================================================= > TESTING: configureDebuggers from 
config.utilities.debuggers(/home/SEJONG/petsc-3.18.1/config/BuildSystem/config/utilities/debuggers.py:20) > Find a default debugger and determine its arguments > Checking for program /usr/local/bin/gdb...not found > Checking for program /usr/bin/gdb...not found > Checking for program /cygdrive/c/SIMULIA/Commands/gdb...not found > Checking for program /cygdrive/c/Program Files/Microsoft MPI/Bin/gdb...not found > Checking for program /cygdrive/c/Windows/system32/gdb...not found > Checking for program /cygdrive/c/Windows/gdb...not found > Checking for program /cygdrive/c/Windows/System32/Wbem/gdb...not found > Checking for program /cygdrive/c/Windows/System32/WindowsPowerShell/v1.0/gdb...not found > Checking for program /cygdrive/c/Windows/System32/OpenSSH/gdb...not found > Checking for program /cygdrive/c/Program Files/MATLAB/R2020b/bin/gdb...not found > Checking for program /cygdrive/c/Program Files/Microsoft SQL Server/130/Tools/Binn/gdb...not found > Checking for program /cygdrive/c/Program Files/Microsoft SQL Server/Client SDK/ODBC/170/Tools/Binn/gdb...not found > Checking for program /cygdrive/c/Program Files/Git/cmd/gdb...not found > Checking for program /cygdrive/c/msys64/mingw64/bin/gdb...not found > Checking for program /cygdrive/c/msys64/usr/bin/gdb...not found > Checking for program /cygdrive/c/Program Files (x86)/Microsoft Visual Studio/2019/Community/VC/Tools/MSVC/14.29.30133/bin/Hostx64/x64/gdb...not found > Checking for program /cygdrive/c/Program Files/dotnet/gdb...not found > Checking for program /gdb...not found > Checking for program /cygdrive/c/Users/SEJONG/AppData/Local/Microsoft/WindowsApps/gdb...not found > Checking for program /cygdrive/c/Users/SEJONG/AppData/Local/Programs/Microsoft VS Code/bin/gdb...not found > Checking for program /cygdrive/c/Program Files (x86)/Microsoft Visual Studio/2019/Community/VC/Tools/MSVC/14.29.30133/bin/Hostx64/x64/gdb...not found > Checking for program /cygdrive/c/Users/SEJONG/.dotnet/tools/gdb...not found > 
Checking for program /usr/lib/lapack/gdb...not found > Checking for program /usr/local/bin/dbx...not found > Checking for program /usr/bin/dbx...not found > Checking for program /cygdrive/c/SIMULIA/Commands/dbx...not found > Checking for program /cygdrive/c/Program Files/Microsoft MPI/Bin/dbx...not found > Checking for program /cygdrive/c/Windows/system32/dbx...not found > Checking for program /cygdrive/c/Windows/dbx...not found > Checking for program /cygdrive/c/Windows/System32/Wbem/dbx...not found > Checking for program /cygdrive/c/Windows/System32/WindowsPowerShell/v1.0/dbx...not found > Checking for program /cygdrive/c/Windows/System32/OpenSSH/dbx...not found > Checking for program /cygdrive/c/Program Files/MATLAB/R2020b/bin/dbx...not found > Checking for program /cygdrive/c/Program Files/Microsoft SQL Server/130/Tools/Binn/dbx...not found > Checking for program /cygdrive/c/Program Files/Microsoft SQL Server/Client SDK/ODBC/170/Tools/Binn/dbx...not found > Checking for program /cygdrive/c/Program Files/Git/cmd/dbx...not found > Checking for program /cygdrive/c/msys64/mingw64/bin/dbx...not found > Checking for program /cygdrive/c/msys64/usr/bin/dbx...not found > Checking for program /cygdrive/c/Program Files (x86)/Microsoft Visual Studio/2019/Community/VC/Tools/MSVC/14.29.30133/bin/Hostx64/x64/dbx...not found > Checking for program /cygdrive/c/Program Files/dotnet/dbx...not found > Checking for program /dbx...not found > Checking for program /cygdrive/c/Users/SEJONG/AppData/Local/Microsoft/WindowsApps/dbx...not found > Checking for program /cygdrive/c/Users/SEJONG/AppData/Local/Programs/Microsoft VS Code/bin/dbx...not found > Checking for program /cygdrive/c/Program Files (x86)/Microsoft Visual Studio/2019/Community/VC/Tools/MSVC/14.29.30133/bin/Hostx64/x64/dbx...not found > Checking for program /cygdrive/c/Users/SEJONG/.dotnet/tools/dbx...not found > Checking for program /usr/lib/lapack/dbx...not found > Defined make macro "DSYMUTIL" to "true" > child 
config.utilities.debuggers took 0.014310 seconds > ============================================================================================= > TESTING: configureDirectories from PETSc.options.petscdir(/home/SEJONG/petsc-3.18.1/config/PETSc/options/petscdir.py:22) > Checks PETSC_DIR and sets if not set > PETSC_VERSION_RELEASE of 1 indicates the code is from a release branch or a branch created from a release branch. > Version Information: > #define PETSC_VERSION_RELEASE 1 > #define PETSC_VERSION_MAJOR 3 > #define PETSC_VERSION_MINOR 18 > #define PETSC_VERSION_SUBMINOR 1 > #define PETSC_VERSION_DATE "Oct 26, 2022" > #define PETSC_VERSION_GIT "v3.18.1" > #define PETSC_VERSION_DATE_GIT "2022-10-26 07:57:29 -0500" > #define PETSC_VERSION_EQ(MAJOR,MINOR,SUBMINOR) \ > #define PETSC_VERSION_ PETSC_VERSION_EQ > #define PETSC_VERSION_LT(MAJOR,MINOR,SUBMINOR) \ > #define PETSC_VERSION_LE(MAJOR,MINOR,SUBMINOR) \ > #define PETSC_VERSION_GT(MAJOR,MINOR,SUBMINOR) \ > #define PETSC_VERSION_GE(MAJOR,MINOR,SUBMINOR) \ > child PETSc.options.petscdir took 0.015510 seconds > ============================================================================================= > TESTING: getDatafilespath from PETSc.options.dataFilesPath(/home/SEJONG/petsc-3.18.1/config/PETSc/options/dataFilesPath.py:29) > Checks what DATAFILESPATH should be > child PETSc.options.dataFilesPath took 0.002462 seconds > ============================================================================================= > TESTING: configureGit from config.sourceControl(/home/SEJONG/petsc-3.18.1/config/BuildSystem/config/sourceControl.py:24) > Find the Git executable > Checking for program /usr/local/bin/git...not found > Checking for program /usr/bin/git...found > Defined make macro "GIT" to "git" > Executing: git --version > stdout: git version 2.38.1 > ============================================================================================= > TESTING: configureMercurial from 
config.sourceControl(/home/SEJONG/petsc-3.18.1/config/BuildSystem/config/sourceControl.py:35) > Find the Mercurial executable > Checking for program /usr/local/bin/hg...not found > Checking for program /usr/bin/hg...not found > Checking for program /cygdrive/c/SIMULIA/Commands/hg...not found > Checking for program /cygdrive/c/Program Files/Microsoft MPI/Bin/hg...not found > Checking for program /cygdrive/c/Windows/system32/hg...not found > Checking for program /cygdrive/c/Windows/hg...not found > Checking for program /cygdrive/c/Windows/System32/Wbem/hg...not found > Checking for program /cygdrive/c/Windows/System32/WindowsPowerShell/v1.0/hg...not found > Checking for program /cygdrive/c/Windows/System32/OpenSSH/hg...not found > Checking for program /cygdrive/c/Program Files/MATLAB/R2020b/bin/hg...not found > Checking for program /cygdrive/c/Program Files/Microsoft SQL Server/130/Tools/Binn/hg...not found > Checking for program /cygdrive/c/Program Files/Microsoft SQL Server/Client SDK/ODBC/170/Tools/Binn/hg...not found > Checking for program /cygdrive/c/Program Files/Git/cmd/hg...not found > Checking for program /cygdrive/c/msys64/mingw64/bin/hg...not found > Checking for program /cygdrive/c/msys64/usr/bin/hg...not found > Checking for program /cygdrive/c/Program Files (x86)/Microsoft Visual Studio/2019/Community/VC/Tools/MSVC/14.29.30133/bin/Hostx64/x64/hg...not found > Checking for program /cygdrive/c/Program Files/dotnet/hg...not found > Checking for program /hg...not found > Checking for program /cygdrive/c/Users/SEJONG/AppData/Local/Microsoft/WindowsApps/hg...not found > Checking for program /cygdrive/c/Users/SEJONG/AppData/Local/Programs/Microsoft VS Code/bin/hg...not found > Checking for program /cygdrive/c/Program Files (x86)/Microsoft Visual Studio/2019/Community/VC/Tools/MSVC/14.29.30133/bin/Hostx64/x64/hg...not found > Checking for program /cygdrive/c/Users/SEJONG/.dotnet/tools/hg...not found > Checking for program /usr/lib/lapack/hg...not found > 
Checking for program /home/SEJONG/petsc-3.18.1/lib/petsc/bin/win32fe/hg...not found > child config.sourceControl took 0.121914 seconds > ============================================================================================= > TESTING: configureInstallationMethod from PETSc.options.petscclone(/home/SEJONG/petsc-3.18.1/config/PETSc/options/petscclone.py:20) > Determine if PETSc was obtained via git or a tarball > This is a tarball installation > child PETSc.options.petscclone took 0.003125 seconds > ============================================================================================= > TESTING: setNativeArchitecture from PETSc.options.arch(/home/SEJONG/petsc-3.18.1/config/PETSc/options/arch.py:29) > Forms the arch as GNU's configure would form it > ============================================================================================= > TESTING: configureArchitecture from PETSc.options.arch(/home/SEJONG/petsc-3.18.1/config/PETSc/options/arch.py:42) > Checks if PETSC_ARCH is set and sets it if not set > No previous hashfile found > Setting hashfile: arch-mswin-c-debug/lib/petsc/conf/configure-hash > Deleting configure hash file: arch-mswin-c-debug/lib/petsc/conf/configure-hash > Unable to delete configure hash file: arch-mswin-c-debug/lib/petsc/conf/configure-hash > child PETSc.options.arch took 0.149094 seconds > ============================================================================================= > TESTING: setInstallDir from PETSc.options.installDir(/home/SEJONG/petsc-3.18.1/config/PETSc/options/installDir.py:31) > Set installDir to either prefix or if that is not set to PETSC_DIR/PETSC_ARCH > Defined make macro "PREFIXDIR" to "/home/SEJONG/petsc-3.18.1/arch-mswin-c-debug" > ============================================================================================= > TESTING: saveReconfigure from PETSc.options.installDir(/home/SEJONG/petsc-3.18.1/config/PETSc/options/installDir.py:76) > Save the configure options in a script in 
PETSC_ARCH/lib/petsc/conf so the same configure may be easily re-run > ============================================================================================= > TESTING: cleanConfDir from PETSc.options.installDir(/home/SEJONG/petsc-3.18.1/config/PETSc/options/installDir.py:68) > Remove all the files from configuration directory for this PETSC_ARCH, from --with-clean option > ============================================================================================= > TESTING: configureInstallDir from PETSc.options.installDir(/home/SEJONG/petsc-3.18.1/config/PETSc/options/installDir.py:52) > Makes installDir subdirectories if it does not exist for both prefix install location and PETSc work install location > Changed persistence directory to /home/SEJONG/petsc-3.18.1/arch-mswin-c-debug/lib/petsc/conf > > TESTING: restoreReconfigure from PETSc.options.installDir(/home/SEJONG/petsc-3.18.1/config/PETSc/options/installDir.py:90) > If --with-clean was requested but restoring the reconfigure file was requested then restore it > child PETSc.options.installDir took 0.006476 seconds > ============================================================================================= > TESTING: setExternalPackagesDir from PETSc.options.externalpackagesdir(/home/SEJONG/petsc-3.18.1/config/PETSc/options/externalpackagesdir.py:15) > Set location where external packages will be downloaded to > ============================================================================================= > TESTING: cleanExternalpackagesDir from PETSc.options.externalpackagesdir(/home/SEJONG/petsc-3.18.1/config/PETSc/options/externalpackagesdir.py:23) > Remove all downloaded external packages, from --with-clean > child PETSc.options.externalpackagesdir took 0.000990 seconds > ============================================================================================= > TESTING: configureCLanguage from PETSc.options.languages(/home/SEJONG/petsc-3.18.1/config/PETSc/options/languages.py:28) > Choose 
whether to compile the PETSc library using a C or C++ compiler > C language is C > Defined "CLANGUAGE_C" to "1" > Defined make macro "CLANGUAGE" to "C" > child PETSc.options.languages took 0.003172 seconds > ============================================================================================= > TESTING: resetEnvCompilers from config.setCompilers(/home/SEJONG/petsc-3.18.1/config/BuildSystem/config/setCompilers.py:2652) > Remove compilers from the shell environment so they do not interfer with testing > ============================================================================================= > TESTING: checkEnvCompilers from config.setCompilers(/home/SEJONG/petsc-3.18.1/config/BuildSystem/config/setCompilers.py:2669) > Set configure compilers from the environment, from -with-environment-variables > ============================================================================================= > TESTING: checkMPICompilerOverride from config.setCompilers(/home/SEJONG/petsc-3.18.1/config/BuildSystem/config/setCompilers.py:2622) > Check if --with-mpi-dir is used along with CC CXX or FC compiler options. 
> This usually prevents mpi compilers from being used - so issue a warning > ============================================================================================= > TESTING: requireMpiLdPath from config.setCompilers(/home/SEJONG/petsc-3.18.1/config/BuildSystem/config/setCompilers.py:2643) > OpenMPI wrappers require LD_LIBRARY_PATH set > ============================================================================================= > TESTING: checkInitialFlags from config.setCompilers(/home/SEJONG/petsc-3.18.1/config/BuildSystem/config/setCompilers.py:723) > Initialize the compiler and linker flags > Initialized CFLAGS to > Initialized CFLAGS to > Initialized LDFLAGS to > Initialized CUDAFLAGS to > Initialized CUDAFLAGS to > Initialized LDFLAGS to > Initialized HIPFLAGS to > Initialized HIPFLAGS to > Initialized LDFLAGS to > Initialized SYCLFLAGS to > Initialized SYCLFLAGS to > Initialized LDFLAGS to > Initialized CXXFLAGS to > Initialized CXX_CXXFLAGS to > Initialized LDFLAGS to > Initialized FFLAGS to > Initialized FFLAGS to > Initialized LDFLAGS to > Initialized CPPFLAGS to > Initialized FPPFLAGS to > Initialized CUDAPPFLAGS to -Wno-deprecated-gpu-targets > Initialized CXXPPFLAGS to > Initialized HIPPPFLAGS to > Initialized SYCLPPFLAGS to > Initialized CC_LINKER_FLAGS to [] > Initialized CXX_LINKER_FLAGS to [] > Initialized FC_LINKER_FLAGS to [] > Initialized CUDAC_LINKER_FLAGS to [] > Initialized HIPC_LINKER_FLAGS to [] > Initialized SYCLC_LINKER_FLAGS to [] > > TESTING: checkCCompiler from config.setCompilers(/home/SEJONG/petsc-3.18.1/config/BuildSystem/config/setCompilers.py:1341) > Locate a functional C compiler > Checking for program /usr/local/bin/mpicc...not found > Checking for program /usr/bin/mpicc...found > Defined make macro "CC" to "mpicc" > Executing: mpicc -c -o /tmp/petsc-uqt11yqc/config.setCompilers/conftest.o -I/tmp/petsc-uqt11yqc/config.setCompilers /tmp/petsc-uqt11yqc/config.setCompilers/conftest.c > Successful compile: > Source: > 
#include "confdefs.h" > #include "conffix.h" > > int main() { > ; > return 0; > } > > Executing: mpicc -c -o /tmp/petsc-uqt11yqc/config.setCompilers/conftest.o -I/tmp/petsc-uqt11yqc/config.setCompilers /tmp/petsc-uqt11yqc/config.setCompilers/conftest.c > Successful compile: > Source: > #include "confdefs.h" > #include "conffix.h" > > int main() { > ; > return 0; > } > > Executing: mpicc -o /tmp/petsc-uqt11yqc/config.setCompilers/conftest.exe /tmp/petsc-uqt11yqc/config.setCompilers/conftest.o > Possible ERROR while running linker: exit code 1 > stderr: > /usr/lib/gcc/x86_64-pc-cygwin/11/../../../../x86_64-pc-cygwin/bin/ld: cannot find -lhwloc: No such file or directory > /usr/lib/gcc/x86_64-pc-cygwin/11/../../../../x86_64-pc-cygwin/bin/ld: cannot find -levent_core: No such file or directory > /usr/lib/gcc/x86_64-pc-cygwin/11/../../../../x86_64-pc-cygwin/bin/ld: cannot find -levent_pthreads: No such file or directory > /usr/lib/gcc/x86_64-pc-cygwin/11/../../../../x86_64-pc-cygwin/bin/ld: cannot find -lz: No such file or directory > collect2: error: ld returned 1 exit status > Linker output before filtering: > > /usr/lib/gcc/x86_64-pc-cygwin/11/../../../../x86_64-pc-cygwin/bin/ld: cannot find -lhwloc: No such file or directory > /usr/lib/gcc/x86_64-pc-cygwin/11/../../../../x86_64-pc-cygwin/bin/ld: cannot find -levent_core: No such file or directory > /usr/lib/gcc/x86_64-pc-cygwin/11/../../../../x86_64-pc-cygwin/bin/ld: cannot find -levent_pthreads: No such file or directory > /usr/lib/gcc/x86_64-pc-cygwin/11/../../../../x86_64-pc-cygwin/bin/ld: cannot find -lz: No such file or directory > collect2: error: ld returned 1 exit status > : > Linker output after filtering: > /usr/lib/gcc/x86_64-pc-cygwin/11/../../../../x86_64-pc-cygwin/bin/ld: cannot find -lhwloc: No such file or directory > /usr/lib/gcc/x86_64-pc-cygwin/11/../../../../x86_64-pc-cygwin/bin/ld: cannot find -levent_core: No such file or directory > 
/usr/lib/gcc/x86_64-pc-cygwin/11/../../../../x86_64-pc-cygwin/bin/ld: cannot find -levent_pthreads: No such file or directory > /usr/lib/gcc/x86_64-pc-cygwin/11/../../../../x86_64-pc-cygwin/bin/ld: cannot find -lz: No such file or directory > collect2: error: ld returned 1 exit status: > Error testing C compiler: Cannot compile/link C with mpicc. > MPI compiler wrapper mpicc failed to compile > Executing: mpicc -show > stdout: gcc -L/usr/lib -lmpi -lopen-rte -lopen-pal -lhwloc -levent_core -levent_pthreads -lz > MPI compiler wrapper mpicc is likely incorrect. > Use --with-mpi-dir to indicate an alternate MPI. > Deleting "CC" > ******************************************************************************* > UNABLE to CONFIGURE with GIVEN OPTIONS (see configure.log for details): > ------------------------------------------------------------------------------- > C compiler you provided with -with-cc=mpicc cannot be found or does not work. > Cannot compile/link C with mpicc. > ******************************************************************************* > File "/home/SEJONG/petsc-3.18.1/config/configure.py", line 461, in petsc_configure > framework.configure(out = sys.stdout) > File "/home/SEJONG/petsc-3.18.1/config/BuildSystem/config/framework.py", line 1412, in configure > self.processChildren() > File "/home/SEJONG/petsc-3.18.1/config/BuildSystem/config/framework.py", line 1400, in processChildren > self.serialEvaluation(self.childGraph) > File "/home/SEJONG/petsc-3.18.1/config/BuildSystem/config/framework.py", line 1375, in serialEvaluation > child.configure() > File "/home/SEJONG/petsc-3.18.1/config/BuildSystem/config/setCompilers.py", line 2712, in configure > self.executeTest(self.checkCCompiler) > File "/home/SEJONG/petsc-3.18.1/config/BuildSystem/config/base.py", line 138, in executeTest > ret = test(*args,**kargs) > File "/home/SEJONG/petsc-3.18.1/config/BuildSystem/config/setCompilers.py", line 1346, in checkCCompiler > for compiler in 
self.generateCCompilerGuesses(): > File "/home/SEJONG/petsc-3.18.1/config/BuildSystem/config/setCompilers.py", line 1274, in generateCCompilerGuesses > raise RuntimeError('C compiler you provided with -with-cc='+self.argDB['with-cc']+' cannot be found or does not work.'+'\n'+self.mesg) > ================================================================================ > Finishing configure run at Tue, 01 Nov 2022 13:06:09 +0900 > > -----Original Message----- > From: Satish Balay > Sent: Tuesday, November 1, 2022 11:36 AM > To: Mohammad Ali Yaqteen > Cc: petsc-users > Subject: RE: [petsc-users] PETSc Windows Installation > > you'll have to send configure.log for this failure > > Satish > > > On Tue, 1 Nov 2022, Mohammad Ali Yaqteen wrote: > > > I have checked the required Cygwin openmpi libraries and they are all installed. When I run ./configure --with-cc=mpicc --with-cxx=mpicxx --with-fc=mpif90, it returns: > > > > $ ./configure --with-cc=mpicc --with-cxx=mpicxx --with-fc=mpif90 > > ============================================================================================= > > Configuring PETSc to compile on your system > > ====================================================================== > > ======================= > > TESTING: checkCCompiler from config.setCompilers(config/BuildSystem/config/setCompilers.py:1341)******************************************************************************* > > UNABLE to CONFIGURE with GIVEN OPTIONS (see configure.log for details): > > ---------------------------------------------------------------------- > > --------- C compiler you provided with -with-cc=mpicc cannot be found > > or does not work. > > Cannot compile/link C with mpicc. > > > > As for the case of WSL2, I will try to install that on my PC. 
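[Editorial note: reading the log above, `mpicc -show` reveals the wrapper's underlying link line (`gcc -L/usr/lib -lmpi -lopen-rte -lopen-pal -lhwloc -levent_core -levent_pthreads -lz`), and the four `ld: cannot find -l...` errors name exactly the import libraries that are absent from the Cygwin installation. As an illustrative sketch (not part of PETSc's configure; it only operates on the error text quoted above), the missing library names can be pulled out mechanically:

```python
import re

# ld error lines exactly as quoted in the configure.log above
ld_stderr = """\
/usr/lib/gcc/x86_64-pc-cygwin/11/../../../../x86_64-pc-cygwin/bin/ld: cannot find -lhwloc: No such file or directory
/usr/lib/gcc/x86_64-pc-cygwin/11/../../../../x86_64-pc-cygwin/bin/ld: cannot find -levent_core: No such file or directory
/usr/lib/gcc/x86_64-pc-cygwin/11/../../../../x86_64-pc-cygwin/bin/ld: cannot find -levent_pthreads: No such file or directory
/usr/lib/gcc/x86_64-pc-cygwin/11/../../../../x86_64-pc-cygwin/bin/ld: cannot find -lz: No such file or directory
"""

def missing_libs(stderr: str) -> list[str]:
    """Extract the library names behind 'cannot find -l<name>' messages."""
    return re.findall(r"cannot find -l(\S+?):", stderr)

print(missing_libs(ld_stderr))  # ['hwloc', 'event_core', 'event_pthreads', 'z']
```

Each extracted name maps to a Cygwin `-devel` package (`libhwloc-devel`, `libevent-devel`, `zlib-devel`, per the `cygcheck -f` output quoted later in this thread); installing those via the Cygwin setup program resolves the link failure.]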
> > Meanwhile, could you please look into this issue? > > > > Thank you > > > > Ali > > > > -----Original Message----- > > From: Satish Balay > > Sent: Monday, October 31, 2022 10:56 PM > > To: Satish Balay via petsc-users > > Cc: Matthew Knepley ; Mohammad Ali Yaqteen > > > > Subject: Re: [petsc-users] PETSc Windows Installation > > > > BTW: If you have WSL2 on Windows - it might be easier to build/use PETSc. > > > > Satish > > > > On Mon, 31 Oct 2022, Satish Balay via petsc-users wrote: > > > > > Make sure you have cygwin openmpi installed [and cygwin blas/lapack] > > > > > > $ cygcheck -cd |grep openmpi > > > libopenmpi-devel 4.1.2-1 > > > libopenmpi40 4.1.2-1 > > > libopenmpifh40 4.1.2-1 > > > libopenmpiusef08_40 4.1.2-1 > > > libopenmpiusetkr40 4.1.2-1 > > > openmpi 4.1.2-1 > > > $ cygcheck -cd |grep lapack > > > liblapack-devel 3.10.1-1 > > > liblapack0 3.10.1-1 > > > > > > > > > > ./configure --with-cc=gcc --with-cxx=0 --with-fc=0 > > > > --download-f2cblaslapack > > > > > > Should be: > > > > > > > > $ ./configure --download-scalapack --download-mumps > > > > > > i.e. [default] --with-cc=mpicc --with-cxx=mpicxx --with-fc=mpif90 [and > > > default cygwin blas/lapack] > > > > > > Satish > > > > > > > > > On Mon, 31 Oct 2022, Matthew Knepley wrote: > > > > > > > On Mon, Oct 31, 2022 at 1:56 AM Mohammad Ali Yaqteen > > > > > > > > wrote: > > > > > > > > > Dear Satish > > > > > > > > > > When I configure PETSc with (./configure --with-cc=gcc > > > > > --with-cxx=0 > > > > > --with-fc=0 --download-f2cblaslapack) it runs as I shared > > > > > initially which you said is not an issue anymore.
But when I add > > > > > (--download-scalapack > > > > > --download-mumps) or configure with these later, it gives the > > > > > following > > > > > error: > > > > > > > > > > $ ./configure --download-scalapack --download-mumps > > > > > > > > > > ============================================================================================= > > > > > Configuring PETSc to compile on your > > > > > system > > > > > > > > > > ================================================================ > > > > > == > > > > > =========================== > > > > > TESTING: FortranMPICheck from > > > > > config.packages.MPI(config/BuildSystem/config/packages/MPI.py:614)******************************************************************************* > > > > > UNABLE to CONFIGURE with GIVEN OPTIONS (see configure.log for > > > > > details): > > > > > > > > > > ---------------------------------------------------------------- > > > > > -- > > > > > ------------- Fortran error! mpi_init() could not be located! > > > > > > > > > > **************************************************************** > > > > > ** > > > > > ************* > > > > > > > > > > What could be the problem here? > > > > > > > > > > > > > Without configure.log we cannot tell what went wrong. However, > > > > from the error message, I would guess that your MPI was not built > > > > with Fortran bindings. You need these for those packages. > > > > > > > > Thanks, > > > > > > > > Matt > > > > > > > > > > > > > Your help is highly appreciated. > > > > > > > > > > Thank you > > > > > Ali > > > > > > > > > > -----Original Message----- > > > > > From: Satish Balay > > > > > Sent: Saturday, October 29, 2022 2:11 PM > > > > > To: Mohammad Ali Yaqteen > > > > > Cc: Matthew Knepley ; petsc-users at mcs.anl.gov > > > > > Subject: Re: [petsc-users] PETSc Windows Installation > > > > > > > > > > On Sat, 29 Oct 2022, Mohammad Ali Yaqteen wrote: > > > > > > > > > > > I haven't accessed PETSC or given any command of my own.
I was > > > > > > just > > > > > installing by following the instructions. I don't know why it is > > > > > attaching the debugger. Although it says "Possible error running > > > > > C/C++ > > > > > src/snes/tutorials/ex19 with 1 MPI process" which I think > > > > > indicates that MPI is missing! > > > > > > > > > > The diff is not smart enough to detect the extra message from > > > > > cygwin/OpenMPI - hence it assumes there is a potential problem - > > > > > and prints the above message. > > > > > > > > > > But you can assume it's installed properly - and use it. > > > > > > > > > > Satish > > > > > > > > > > > > From: Matthew Knepley > > > > > > Sent: Friday, October 28, 2022 10:31 PM > > > > > > To: Mohammad Ali Yaqteen > > > > > > Cc: petsc-users at mcs.anl.gov > > > > > > Subject: Re: [petsc-users] PETSc Windows Installation > > > > > > > > > > > > On Fri, Oct 28, 2022 at 9:11 AM Mohammad Ali Yaqteen < > > > > > mhyaqteen at sju.ac.kr> wrote: > > > > > > Dear Sir, > > > > > > > > > > > > During the installation of PETSc in Windows, I installed > > > > > > Cygwin and the > > > > > required libraries as mentioned on your website: > > > > > > [cid:image001.png at 01D8EB93.7C17E410] > > > > > > However, when I install PETSc using the configure commands > > > > > > present on > > > > > the petsc website: > > > > > > > > > > > > ./configure --with-cc=gcc --with-cxx=0 --with-fc=0 > > > > > --download-f2cblaslapack --download-mpich > > > > > > > > > > > > it gives me the following error: > > > > > > > > > > > > [cid:image002.png at 01D8EB93.7C17E410] > > > > > > > > > > > > I already installed OpenMPI using the Cygwin installer but it > > > > > > still asks me > > > > > to. When I configure without "--download-mpich" and run the "make check" > > > > > command, it gives me the following errors: > > > > > > > > > > > > [cid:image003.png at 01D8EB93.7C17E410] > > > > > > > > > > > > Could you kindly look into this and help me with this?
Your > > > > > > prompt > > > > > response will be highly appreciated. > > > > > > > > > > > > The runs look fine. > > > > > > > > > > > > The test should not try to attach the debugger. Do you have > > > > > > that in the > > > > > PETSC_OPTIONS env variable? > > > > > > > > > > > > Thanks, > > > > > > > > > > > > Matt > > > > > > > > > > > > Thank you! > > > > > > Mohammad Ali > > > > > > Researcher, Sejong University > > > > > > > > > > > > -- > > > > > > What most experimenters take for granted before they begin > > > > > > their > > > > > experiments is infinitely more interesting than any results to > > > > > which their experiments lead. > > > > > > -- Norbert Wiener > > > > > > > > > > > > https://www.cse.buffalo.edu/~knepley/< > > > > > http://www.cse.buffalo.edu/~knepley/> > > > > > > > > > > > > > > > > > > > > > > > > > > > > > From mhyaqteen at sju.ac.kr Tue Nov 1 09:37:14 2022 From: mhyaqteen at sju.ac.kr (Mohammad Ali Yaqteen) Date: Tue, 1 Nov 2022 14:37:14 +0000 Subject: [petsc-users] PETSc Windows Installation In-Reply-To: <8c7b16a0-f933-92fe-f54a-337bcd88455a@mcs.anl.gov> References: <2db12320-25ab-7911-4bb6-ff0195f5ffdc@mcs.anl.gov> <461d2b54-173d-95fa-6ad5-9ce81849871e@mcs.anl.gov> <8c7b16a0-f933-92fe-f54a-337bcd88455a@mcs.anl.gov> Message-ID: The above commands worked but I get an error message when I include petsc.h in Visual Studio.
The error message is "Cannot open include file: 'petscconf.h': No such file or directory" Thanks, Ali -----Original Message----- From: Satish Balay Sent: Tuesday, November 1, 2022 2:40 PM To: Mohammad Ali Yaqteen Cc: petsc-users Subject: Re: [petsc-users] PETSc Windows Installation > /usr/lib/gcc/x86_64-pc-cygwin/11/../../../../x86_64-pc-cygwin/bin/ld: cannot find -lhwloc: No such file or directory > /usr/lib/gcc/x86_64-pc-cygwin/11/../../../../x86_64-pc-cygwin/bin/ld: cannot find -levent_core: No such file or directory > /usr/lib/gcc/x86_64-pc-cygwin/11/../../../../x86_64-pc-cygwin/bin/ld: cannot find -levent_pthreads: No such file or directory > /usr/lib/gcc/x86_64-pc-cygwin/11/../../../../x86_64-pc-cygwin/bin/ld: cannot find -lz: No such file or directory For some reason cygwin has broken dependencies here. Run cygwin setup and install the following pkgs. $ cygcheck.exe -f /usr/lib/libhwloc.dll.a /usr/lib/libevent_core.dll.a /usr/lib/libevent_pthreads.dll.a /usr/lib/libz.dll.a libevent-devel-2.1.12-1 libevent-devel-2.1.12-1 libhwloc-devel-2.6.0-2 zlib-devel-1.2.12-1 BTW: you can attach the file from PETSC_DIR/PETSC_ARCH/lib/petsc/conf/configure.log Satish On Tue, 1 Nov 2022, Mohammad Ali Yaqteen wrote: > I am unable to attach the configure.log file. Hence,
I have copied the following text after executing the command (less configure.log) in the cygwin64 > > Executing: uname -s > stdout: CYGWIN_NT-10.0-19044 > ============================================================================================= > Configuring PETSc to compile on your system > ============================================================================================= > > ================================================================================ > ================================================================================ > Starting configure run at Tue, 01 Nov 2022 13:06:06 +0900 > Configure Options: --configModules=PETSc.Configure --optionsModule=config.compilerOptions --with-cc=mpicc --with-cxx=mpicxx --with-fc=mpif90 > Working directory: /home/SEJONG/petsc-3.18.1 > Machine platform: > uname_result(system='CYGWIN_NT-10.0-19044', node='DESKTOP-R1C768B', release='3.3.6-341.x86_64', version='2022-09-05 11:15 UTC', machine='x86_64') > Python version: > 3.9.10 (main, Jan 20 2022, 21:37:52) > [GCC 11.2.0] > ================================================================================ > Environmental variables > USERDOMAIN=DESKTOP-R1C768B > OS=Windows_NT > COMMONPROGRAMFILES=C:\Program Files\Common Files > PROCESSOR_LEVEL=6 > PSModulePath=C:\Users\SEJONG\Documents\WindowsPowerShell\Modules;C:\Program Files\WindowsPowerShell\Modules;C:\Windows\system32\WindowsPowerShell\v1.0\Modules > CommonProgramW6432=C:\Program Files\Common Files > CommonProgramFiles(x86)=C:\Program Files (x86)\Common Files > LANG=en_US.UTF-8 > TZ=Asia/Seoul > HOSTNAME=DESKTOP-R1C768B > PUBLIC=C:\Users\Public > OLDPWD=/home/SEJONG > USERNAME=SEJONG > LOGONSERVER=\\DESKTOP-R1C768B > PROCESSOR_ARCHITECTURE=AMD64 > LOCALAPPDATA=C:\Users\SEJONG\AppData\Local > COMPUTERNAME=DESKTOP-R1C768B > USER=SEJONG > !::=::\ > SYSTEMDRIVE=C: > USERPROFILE=C:\Users\SEJONG > PATHEXT=.COM;.EXE;.BAT;.CMD;.VBS;.VBE;.JS;.JSE;.WSF;.WSH;.MSC;.CPL > SYSTEMROOT=C:\Windows > 
USERDOMAIN_ROAMINGPROFILE=DESKTOP-R1C768B > OneDriveCommercial=C:\Users\SEJONG\OneDrive - Sejong University > PROCESSOR_IDENTIFIER=Intel64 Family 6 Model 165 Stepping 5, GenuineIntel > GNUPLOT_LIB=C:\Program Files\gnuplot\demo;C:\Program Files\gnuplot\demo\games;C:\Program Files\gnuplot\share > PWD=/home/SEJONG/petsc-3.18.1 > MSMPI_BIN=C:\Program Files\Microsoft MPI\Bin\ > HOME=/home/SEJONG > TMP=/tmp > OneDrive=C:\Users\SEJONG\OneDrive - Sejong University > ZES_ENABLE_SYSMAN=1 > !C:=C:\cygwin64\bin > PROCESSOR_REVISION=a505 > PROFILEREAD=true > PROMPT=$P$G > NUMBER_OF_PROCESSORS=16 > ProgramW6432=C:\Program Files > COMSPEC=C:\Windows\system32\cmd.exe > APPDATA=C:\Users\SEJONG\AppData\Roaming > SHELL=/bin/bash > TERM=xterm-256color > WINDIR=C:\Windows > ProgramData=C:\ProgramData > SHLVL=1 > PRINTER=\\210.107.220.119\HP Color LaserJet Pro MFP M377 PCL 6 > PROGRAMFILES=C:\Program Files > ALLUSERSPROFILE=C:\ProgramData > TEMP=/tmp > DriverData=C:\Windows\System32\Drivers\DriverData > SESSIONNAME=Console > ProgramFiles(x86)=C:\Program Files (x86) > PATH=/usr/local/bin:/usr/bin:/cygdrive/c/SIMULIA/Commands:/cygdrive/c/Program Files/Microsoft MPI/Bin:/cygdrive/c/Windows/system32:/cygdrive/c/Windows:/cygdrive/c/Windows/System32/Wbem:/cygdrive/c/Windows/System32/WindowsPowerShell/v1.0:/cygdrive/c/Windows/System32/OpenSSH:/cygdrive/c/Program Files/MATLAB/R2020b/bin:/cygdrive/c/Program Files/Microsoft SQL Server/130/Tools/Binn:/cygdrive/c/Program Files/Microsoft SQL Server/Client SDK/ODBC/170/Tools/Binn:/cygdrive/c/Program Files/Git/cmd:/cygdrive/c/msys64/mingw64/bin:/cygdrive/c/msys64/usr/bin:/cygdrive/c/Program Files (x86)/Microsoft Visual Studio/2019/Community/VC/Tools/MSVC/14.29.30133/bin/Hostx64/x64:/cygdrive/c/Program Files/dotnet:/:/cygdrive/c/Users/SEJONG/AppData/Local/Microsoft/WindowsApps:/cygdrive/c/Users/SEJONG/AppData/Local/Programs/Microsoft VS Code/bin:/cygdrive/c/Program Files (x86)/Microsoft Visual 
Studio/2019/Community/VC/Tools/MSVC/14.29.30133/bin/Hostx64/x64:/cygdrive/c/Users/SEJONG/.dotnet/tools:/usr/lib/lapack > PS1=\[\e]0;\w\a\]\n\[\e[32m\]\u@\h \[\e[33m\]\w\[\e[0m\]\n\$ > HOMEDRIVE=C: > INFOPATH=/usr/local/info:/usr/share/info:/usr/info > HOMEPATH=\Users\SEJONG > ORIGINAL_PATH=/cygdrive/c/SIMULIA/Commands:/cygdrive/c/Program Files/Microsoft MPI/Bin:/cygdrive/c/Windows/system32:/cygdrive/c/Windows:/cygdrive/c/Windows/System32/Wbem:/cygdrive/c/Windows/System32/WindowsPowerShell/v1.0:/cygdrive/c/Windows/System32/OpenSSH:/cygdrive/c/Program Files/MATLAB/R2020b/bin:/cygdrive/c/Program Files/Microsoft SQL Server/130/Tools/Binn:/cygdrive/c/Program Files/Microsoft SQL Server/Client SDK/ODBC/170/Tools/Binn:/cygdrive/c/Program Files/Git/cmd:/cygdrive/c/msys64/mingw64/bin:/cygdrive/c/msys64/usr/bin:/cygdrive/c/Program Files (x86)/Microsoft Visual Studio/2019/Community/VC/Tools/MSVC/14.29.30133/bin/Hostx64/x64:/cygdrive/c/Program Files/dotnet:/:/cygdrive/c/Users/SEJONG/AppData/Local/Microsoft/WindowsApps:/cygdrive/c/Users/SEJONG/AppData/Local/Programs/Microsoft VS Code/bin:/cygdrive/c/Program Files (x86)/Microsoft Visual Studio/2019/Community/VC/Tools/MSVC/14.29.30133/bin/Hostx64/x64:/cygdrive/c/Users/SEJONG/.dotnet/tools > EXECIGNORE=*.dll > _=./configure > Files in path provided by default path > /usr/local/bin: > /usr/bin: addftinfo.exe addr2line.exe apropos ar.exe arch.exe as.exe ash.exe awk b2sum.exe base32.exe base64.exe basename.exe basenc.exe bash.exe bashbug bomtool.exe bunzip2.exe bzcat.exe bzcmp bzdiff bzegrep bzfgrep bzgrep bzip2.exe bzip2recover.exe bzless bzmore c++.exe c++filt.exe c89 c99 ca-legacy cal.exe captoinfo cat.exe catman.exe cc ccmake.exe chattr.exe chcon.exe chgrp.exe chmod.exe chown.exe chroot.exe chrt.exe cksum.exe clear.exe cmake.exe cmp.exe col.exe colcrt.exe colrm.exe column.exe comm.exe cp.exe cpack.exe cpp.exe csplit.exe ctest.exe cut.exe cygarchive-13.dll cygargp-0.dll cygatomic-1.dll cygattr-1.dll cygblkid-1.dll 
cygbrotlicommon-1.dll cygbrotlidec-1.dll cygbz2-1.dll cygcheck.exe cygcom_err-2.dll cygcrypt-2.dll cygcrypto-1.1.dll cygcurl-4.dll cygdb-5.3.dll cygdb_cxx-5.3.dll cygdb_sql-5.3.dll cygedit-0.dll cygevent-2-1-7.dll cygevent_core-2-1-7.dll cygevent_extra-2-1-7.dll cygevent_openssl-2-1-7.dll cygevent_pthreads-2-1-7.dll cygexpat-1.dll cygfdisk-1.dll cygffi-6.dll cygfido2-1.dll cygformw-10.dll cyggc-1.dll cyggcc_s-seh-1.dll cyggdbm-6.dll cyggdbm_compat-4.dll cyggfortran-5.dll cyggmp-10.dll cyggomp-1.dll cyggsasl-7.dll cyggssapi_krb5-2.dll cygguile-2.2-1.dll cyghistory7.dll cyghwloc-15.dll cygiconv-2.dll cygidn-12.dll cygidn2-0.dll cygintl-8.dll cygisl-23.dll cygjsoncpp-25.dll cygk5crypto-3.dll cygkrb5-3.dll cygkrb5support-0.dll cyglber-2-4-2.dll cyglber-2.dll cygldap-2-4-2.dll cygldap-2.dll cygldap_r-2-4-2.dll cygltdl-7.dll cyglz4-1.dll cyglzma-5.dll cyglzo2-2.dll cygmagic-1.dll cygman-2-11-0.dll cygmandb-2-11-0.dll cygmenuw-10.dll cygmpc-3.dll cygmpfr-6.dll cygmpi-40.dll cygmpi_mpifh-40.dll cygmpi_usempif08-40.dll cygmpi_usempi_ignore_tkr-40.dll cygncursesw-10.dll cygnghttp2-14.dll cygntlm-0.dll cygopen-pal-40.dll cygopen-rte-40.dll cygp11-kit-0.dll cygpanelw-10.dll cygpath.exe cygpcre2-8-0.dll cygperl5_32.dll cygpipeline-1.dll cygpkgconf-4.dll cygpopt-0.dll cygpsl-5.dll cygquadmath-0.dll cygreadline7.dll cygrhash-0.dll cygrunsrv.exe cygsasl2-3.dll cygserver-config cygsigsegv-2.dll cygsmartcols-1.dll cygsqlite3-0.dll cygssh2-1.dll cygssl-1.1.dll cygstart.exe cygstdc++-6.dll cygtasn1-6.dll cygticw-10.dll cygunistring-2.dll cyguuid-1.dll cyguv-1.dll cygwin-console-helper.exe cygwin1.dll cygxml2-2.dll cygxxhash-0.dll cygz.dll cygzstd-1.dll dash.exe date.exe dd.exe df.exe diff.exe diff3.exe dir.exe dircolors.exe dirname.exe dlltool.exe dllwrap.exe dnsdomainname domainname du.exe dumper.exe echo.exe editrights.exe egrep elfedit.exe env.exe eqn.exe eqn2graph ex expand.exe expr.exe f95 factor.exe false.exe fgrep fido2-assert.exe fido2-cred.exe fido2-token.exe file.exe 
find.exe flock.exe fmt.exe fold.exe g++.exe gawk-5.1.1.exe gawk.exe gcc-ar.exe gcc-nm.exe gcc-ranlib.exe gcc.exe gcov-dump.exe gcov-tool.exe gcov.exe gdiffmk gencat.exe getconf.exe getent.exe getfacl.exe getopt.exe gfortran.exe git-receive-pack.exe git-shell.exe git-upload-archive.exe git-upload-pack.exe git.exe gkill.exe gmondump.exe gprof.exe grap2graph grep.exe grn.exe grodvi.exe groff.exe grolbp.exe grolj4.exe grops.exe grotty.exe groups.exe gunzip gzexe gzip.exe head.exe hexdump.exe hostid.exe hostname.exe hpftodit.exe i686-w64-mingw32-pkg-config id.exe indxbib.exe info.exe infocmp.exe infotocap install-info.exe install.exe ipcmk.exe ipcrm.exe ipcs.exe isosize.exe join.exe kill.exe lastlog.exe ld.bfd.exe ld.exe ldd.exe ldh.exe less.exe lessecho.exe lesskey.exe lexgrog.exe libpython3.9.dll link-cygin.exe lkbib.exe ln.exe locale.exe locate.exe logger.exe login.exe logname.exe look.exe lookbib.exe ls.exe lsattr.exe lto-dump.exe lzcat lzcmp lzdiff lzegrep lzfgrep lzgrep lzless lzma lzmadec.exe lzmainfo.exe lzmore make-dummy-cert make.exe man-recode.exe man.exe mandb.exe manpath.exe mcookie.exe md5sum.exe minidumper.exe mintheme mintty.exe mkdir.exe mkfifo.exe mkgroup.exe mknod.exe mkpasswd.exe mkshortcut.exe mktemp.exe more.exe mount.exe mpic++ mpicc mpicxx mpiexec mpif77 mpif90 mpifort mpirun mv.exe namei.exe neqn nice.exe nl.exe nm.exe nohup.exe nproc.exe nroff numfmt.exe objcopy.exe objdump.exe od.exe ompi-clean ompi-server ompi_info.exe opal_wrapper.exe openssl.exe orte-clean.exe orte-info.exe orte-server.exe ortecc orted.exe orterun.exe p11-kit.exe passwd.exe paste.exe pathchk.exe pdfroff peflags.exe peflagsall perl.exe perl5.32.1.exe pfbtops.exe pg.exe pic.exe pic2graph pinky.exe pip3 pip3.9 pkg-config pkgconf.exe pldd.exe post-grohtml.exe pr.exe pre-grohtml.exe preconv.exe printenv.exe printf.exe profiler.exe ps.exe ptx.exe pwd.exe pydoc3 pydoc3.9 python python3 python3.9.exe pzstd.exe ranlib.exe readelf.exe readlink.exe readshortcut.exe realpath.exe 
rebase-trigger rebase.exe rebaseall rebaselst refer.exe regtool.exe rename.exe renew-dummy-cert renice.exe reset rev.exe rm.exe rmdir.exe rsync-ssl rsync.exe run.exe runcon.exe rvi rview scalar.exe scp.exe script.exe scriptreplay.exe sdiff.exe sed.exe seq.exe setfacl.exe setmetamode.exe setsid.exe sftp.exe sh.exe sha1sum.exe sha224sum.exe sha256sum.exe sha384sum.exe sha512sum.exe shred.exe shuf.exe size.exe sleep.exe slogin soelim.exe sort.exe split.exe ssh-add.exe ssh-agent.exe ssh-copy-id ssh-host-config ssh-keygen.exe ssh-keyscan.exe ssh-user-config ssh.exe ssp.exe stat.exe stdbuf.exe strace.exe strings.exe strip.exe stty.exe sum.exe sync.exe tabs.exe tac.exe tail.exe tar.exe taskset.exe tbl.exe tee.exe test.exe tfmtodit.exe tic.exe timeout.exe toe.exe touch.exe tput.exe tr.exe troff.exe true.exe truncate.exe trust.exe tset.exe tsort.exe tty.exe tzselect tzset.exe ul.exe umount.exe uname.exe unexpand.exe uniq.exe unlink.exe unlzma unxz unzstd update-ca-trust update-crypto-policies updatedb users.exe uuidgen.exe uuidparse.exe vdir.exe vi.exe view wc.exe whatis.exe whereis.exe which.exe who.exe whoami.exe windmc.exe windres.exe x86_64-pc-cygwin-c++.exe x86_64-pc-cygwin-g++.exe x86_64-pc-cygwin-gcc-11.exe x86_64-pc-cygwin-gcc-ar.exe x86_64-pc-cygwin-gcc-nm.exe x86_64-pc-cygwin-gcc-ranlib.exe x86_64-pc-cygwin-gcc.exe x86_64-pc-cygwin-gfortran.exe x86_64-pc-cygwin-pkg-config x86_64-w64-mingw32-pkg-config xargs.exe xmlcatalog.exe xmllint.exe xz.exe xzcat xzcmp xzdec.exe xzdiff xzegrep xzfgrep xzgrep xzless xzmore yes.exe zcat zcmp zdiff zdump.exe zegrep zfgrep zforce zgrep zless zmore znew zstd.exe zstdcat zstdgrep zstdless zstdmt [.exe > /cygdrive/c/SIMULIA/Commands: abaqus.bat abq2018.bat abq_cae_open.bat abq_odb_open.bat > /cygdrive/c/Program Files/Microsoft MPI/Bin: mpiexec.exe mpitrace.man smpd.exe > [truncated listing of the Windows system directories on PATH omitted - several thousand unrelated system files] >
UserDataAccountApis.dll UserDataLanguageUtil.dll UserDataPlatformHelperUtil.dll UserDataSe > rvice.dll UserDataTimeUtil.dll UserDataTypeHelperUtil.dll UserDeviceRegistration.dll UserDeviceRegistration.Ngc.dll userenv.dll userinit.exe userinitext.dll UserLanguageProfileCallback.dll usermgr.dll usermgrcli.dll UserMgrProxy.dll usk.rs usoapi.dll UsoClient.exe us > ocoreps.dll usocoreworker.exe usosvc.dll usp10.dll ustprov.dll UtcDecoderHost.exe UtcManaged.dll utcutil.dll utildll.dll Utilman.exe uudf.dll UvcModel.dll uwfcfgmgmt.dll uwfcsp.dll uwfservicingapi.dll UXInit.dll uxlib.dll uxlibres.dll uxtheme.dll vac.dll VAN.dll Vaul > t.dll VaultCDS.dll vaultcli.dll VaultCmd.exe VaultRoaming.dll vaultsvc.dll VBICodec.ax vbisurf.ax vbsapi.dll vbscript.dll vbssysprep.dll vcamp120.dll vcamp140.dll vcamp140d.dll VCardParser.dll vccorlib110.dll vccorlib120.dll vccorlib140.dll vccorlib140d.dll vcomp100. > dll vcomp110.dll vcomp120.dll vcomp140.dll vcomp140d.dll vcruntime140.dll vcruntime140d.dll vcruntime140_1.dll vcruntime140_1d.dll vcruntime140_clr0400.dll vds.exe vdsbas.dll vdsdyn.dll vdsldr.exe vdsutil.dll vdsvd.dll vds_ps.dll verclsid.exe verifier.dll verifier.ex > e verifiergui.exe version.dll vertdll.dll vfbasics.dll vfcompat.dll vfcuzz.dll vfluapriv.dll vfnet.dll vfntlmless.dll vfnws.dll vfprint.dll vfprintpthelper.dll vfrdvcompat.dll vfuprov.dll vfwwdm32.dll VhfUm.dll vid.dll vidcap.ax VideoHandlers.dll VIDRESZR.DLL virtdis > k.dll VirtualMonitorManager.dll VmApplicationHealthMonitorProxy.dll vmbuspipe.dll vmdevicehost.dll vmictimeprovider.dll vmrdvcore.dll VocabRoamingHandler.dll VoiceActivationManager.dll VoipRT.dll vpnike.dll vpnikeapi.dll VpnSohDesktop.dll VPNv2CSP.dll vrfcore.dll Vsc > MgrPS.dll vscover160.dll VSD3DWARPDebug.dll VsGraphicsCapture.dll VsGraphicsDesktopEngine.exe VsGraphicsExperiment.dll VsGraphicsHelper.dll VsGraphicsProxyStub.dll VsGraphicsRemoteEngine.exe vsjitdebugger.exe VSPerf160.dll vssadmin.exe vssapi.dll vsstrace.dll VSSVC.e > xe vss_ps.dll 
vulkan-1-999-0-0-0.dll vulkan-1.dll vulkaninfo-1-999-0-0-0.exe vulkaninfo.exe w32time.dll w32tm.exe w32topl.dll WaaSAssessment.dll WaaSMedicAgent.exe WaaSMedicCapsule.dll WaaSMedicPS.dll WaaSMedicSvc.dll WABSyncProvider.dll waitfor.exe WalletBackgroundS > erviceProxy.dll WalletProxy.dll WalletService.dll WallpaperHost.exe wavemsp.dll wbadmin.exe wbem wbemcomn.dll wbengine.exe wbiosrvc.dll wci.dll wcimage.dll wcmapi.dll wcmcsp.dll wcmsvc.dll WCN WcnApi.dll wcncsvc.dll WcnEapAuthProxy.dll WcnEapPeerProxy.dll WcnNetsh.dl > l wcnwiz.dll wc_storage.dll wdc.dll WDI wdi.dll wdigest.dll wdmaud.drv wdscore.dll WdsUnattendTemplate.xml WEB.rs webauthn.dll WebcamUi.dll webcheck.dll WebClnt.dll webio.dll webplatstorageserver.dll WebRuntimeManager.dll webservices.dll Websocket.dll wecapi.dll wecs > vc.dll wecutil.exe wephostsvc.dll wer.dll werconcpl.dll wercplsupport.dll werdiagcontroller.dll WerEnc.dll weretw.dll WerFault.exe WerFaultSecure.exe wermgr.exe wersvc.dll werui.dll wevtapi.dll wevtfwd.dll wevtsvc.dll wevtutil.exe wextract.exe WF.msc wfapigp.dll wfdp > rov.dll WFDSConMgr.dll WFDSConMgrSvc.dll WfHC.dll WFS.exe WFSR.dll whealogr.dll where.exe whhelper.dll whoami.exe wiaacmgr.exe wiaaut.dll wiadefui.dll wiadss.dll WiaExtensionHost64.dll wiarpc.dll wiascanprofiles.dll wiaservc.dll wiashext.dll wiatrace.dll wiawow64.exe > WiFiCloudStore.dll WiFiConfigSP.dll wifidatacapabilityhandler.dll WiFiDisplay.dll wifinetworkmanager.dll wifitask.exe WimBootCompress.ini wimgapi.dll wimserv.exe win32appinventorycsp.dll Win32AppSettingsProvider.dll Win32CompatibilityAppraiserCSP.dll win32k.sys win3 > 2kbase.sys win32kfull.sys win32kns.sys win32spl.dll win32u.dll Win32_DeviceGuard.dll winbio.dll WinBioDatabase WinBioDataModel.dll WinBioDataModelOOBE.exe winbioext.dll WinBioPlugIns winbrand.dll wincorlib.dll wincredprovider.dll wincredui.dll WindowManagement.dll Wi > ndowManagementAPI.dll Windows.AccountsControl.dll Windows.AI.MachineLearning.dll Windows.AI.MachineLearning.Preview.dll 
Windows.ApplicationModel.Background.SystemEventsBroker.dll Windows.ApplicationModel.Background.TimeBroker.dll Windows.ApplicationModel.Conversation > alAgent.dll windows.applicationmodel.conversationalagent.internal.proxystub.dll windows.applicationmodel.conversationalagent.proxystub.dll Windows.ApplicationModel.Core.dll windows.applicationmodel.datatransfer.dll Windows.ApplicationModel.dll Windows.ApplicationMode > l.LockScreen.dll Windows.ApplicationModel.Store.dll Windows.ApplicationModel.Store.Preview.DOSettings.dll Windows.ApplicationModel.Store.TestingFramework.dll Windows.ApplicationModel.Wallet.dll Windows.CloudStore.dll Windows.CloudStore.Schema.DesktopShell.dll Windows > .CloudStore.Schema.Shell.dll Windows.Cortana.Desktop.dll Windows.Cortana.OneCore.dll Windows.Cortana.ProxyStub.dll Windows.Data.Activities.dll Windows.Data.Pdf.dll Windows.Devices.AllJoyn.dll Windows.Devices.Background.dll Windows.Devices.Background.ps.dll Windows.De > vices.Bluetooth.dll Windows.Devices.Custom.dll Windows.Devices.Custom.ps.dll Windows.Devices.Enumeration.dll Windows.Devices.Haptics.dll Windows.Devices.HumanInterfaceDevice.dll Windows.Devices.Lights.dll Windows.Devices.LowLevel.dll Windows.Devices.Midi.dll Windows. 
> Devices.Perception.dll Windows.Devices.Picker.dll Windows.Devices.PointOfService.dll Windows.Devices.Portable.dll Windows.Devices.Printers.dll Windows.Devices.Printers.Extensions.dll Windows.Devices.Radios.dll Windows.Devices.Scanners.dll Windows.Devices.Sensors.dll > Windows.Devices.SerialCommunication.dll Windows.Devices.SmartCards.dll Windows.Devices.SmartCards.Phone.dll Windows.Devices.Usb.dll Windows.Devices.WiFi.dll Windows.Devices.WiFiDirect.dll Windows.Energy.dll Windows.FileExplorer.Common.dll Windows.Gaming.Input.dll Win > dows.Gaming.Preview.dll Windows.Gaming.UI.GameBar.dll Windows.Gaming.XboxLive.Storage.dll Windows.Globalization.dll Windows.Globalization.Fontgroups.dll Windows.Globalization.PhoneNumberFormatting.dll Windows.Graphics.Display.BrightnessOverride.dll Windows.Graphics.D > isplay.DisplayEnhancementOverride.dll Windows.Graphics.dll Windows.Graphics.Printing.3D.dll Windows.Graphics.Printing.dll Windows.Graphics.Printing.Workflow.dll Windows.Graphics.Printing.Workflow.Native.dll Windows.Help.Runtime.dll windows.immersiveshell.serviceprovi > der.dll Windows.Internal.AdaptiveCards.XamlCardRenderer.dll Windows.Internal.Bluetooth.dll Windows.Internal.CapturePicker.Desktop.dll Windows.Internal.CapturePicker.dll Windows.Internal.Devices.Sensors.dll Windows.Internal.Feedback.Analog.dll Windows.Internal.Feedbac > k.Analog.ProxyStub.dll Windows.Internal.Graphics.Display.DisplayColorManagement.dll Windows.Internal.Graphics.Display.DisplayEnhancementManagement.dll Windows.Internal.Management.dll Windows.Internal.Management.SecureAssessment.dll Windows.Internal.PlatformExtension. 
> DevicePickerExperience.dll Windows.Internal.PlatformExtension.MiracastBannerExperience.dll Windows.Internal.PredictionUnit.dll Windows.Internal.Security.Attestation.DeviceAttestation.dll Windows.Internal.SecurityMitigationsBroker.dll Windows.Internal.Shell.Broker.dll > windows.internal.shellcommon.AccountsControlExperience.dll windows.internal.shellcommon.AppResolverModal.dll Windows.Internal.ShellCommon.Broker.dll windows.internal.shellcommon.FilePickerExperienceMEM.dll Windows.Internal.ShellCommon.PrintExperience.dll windows.int > ernal.shellcommon.shareexperience.dll windows.internal.shellcommon.TokenBrokerModal.dll Windows.Internal.Signals.dll Windows.Internal.System.UserProfile.dll Windows.Internal.Taskbar.dll Windows.Internal.UI.BioEnrollment.ProxyStub.dll Windows.Internal.UI.Logon.ProxySt > ub.dll Windows.Internal.UI.Shell.WindowTabManager.dll Windows.Management.EnrollmentStatusTracking.ConfigProvider.dll Windows.Management.InprocObjects.dll Windows.Management.ModernDeployment.ConfigProviders.dll Windows.Management.Provisioning.ProxyStub.dll Windows.Man > agement.SecureAssessment.CfgProvider.dll Windows.Management.SecureAssessment.Diagnostics.dll Windows.Management.Service.dll Windows.Management.Workplace.dll Windows.Management.Workplace.WorkplaceSettings.dll Windows.Media.Audio.dll Windows.Media.BackgroundMediaPlayba > ck.dll Windows.Media.BackgroundPlayback.exe Windows.Media.Devices.dll Windows.Media.dll Windows.Media.Editing.dll Windows.Media.FaceAnalysis.dll Windows.Media.Import.dll Windows.Media.MediaControl.dll Windows.Media.MixedRealityCapture.dll Windows.Media.Ocr.dll Window > s.Media.Playback.BackgroundMediaPlayer.dll Windows.Media.Playback.MediaPlayer.dll Windows.Media.Playback.ProxyStub.dll Windows.Media.Protection.PlayReady.dll Windows.Media.Renewal.dll Windows.Media.Speech.dll Windows.Media.Speech.UXRes.dll Windows.Media.Streaming.dll > Windows.Media.Streaming.ps.dll Windows.Mirage.dll Windows.Mirage.Internal.Capture.Pipeline.ProxyStub.dll 
Windows.Mirage.Internal.dll Windows.Networking.BackgroundTransfer.BackgroundManagerPolicy.dll Windows.Networking.BackgroundTransfer.ContentPrefetchTask.dll Windo > ws.Networking.BackgroundTransfer.dll Windows.Networking.Connectivity.dll Windows.Networking.dll Windows.Networking.HostName.dll Windows.Networking.NetworkOperators.ESim.dll Windows.Networking.NetworkOperators.HotspotAuthentication.dll Windows.Networking.Proximity.dll > Windows.Networking.ServiceDiscovery.Dnssd.dll Windows.Networking.Sockets.PushEnabledApplication.dll Windows.Networking.UX.EapRequestHandler.dll Windows.Networking.Vpn.dll Windows.Networking.XboxLive.ProxyStub.dll Windows.Payments.dll Windows.Perception.Stub.dll Wind > ows.Security.Authentication.Identity.Provider.dll Windows.Security.Authentication.OnlineId.dll Windows.Security.Authentication.Web.Core.dll Windows.Security.Credentials.UI.CredentialPicker.dll Windows.Security.Credentials.UI.UserConsentVerifier.dll Windows.Security.I > ntegrity.dll Windows.Services.TargetedContent.dll Windows.SharedPC.AccountManager.dll Windows.SharedPC.CredentialProvider.dll Windows.Shell.BlueLightReduction.dll Windows.Shell.ServiceHostBuilder.dll Windows.Shell.StartLayoutPopulationEvents.dll Windows.StateReposito > ry.dll Windows.StateRepositoryBroker.dll Windows.StateRepositoryClient.dll Windows.StateRepositoryCore.dll Windows.StateRepositoryPS.dll Windows.StateRepositoryUpgrade.dll Windows.Storage.ApplicationData.dll Windows.Storage.Compression.dll windows.storage.dll Windows > .Storage.OneCore.dll Windows.Storage.Search.dll Windows.System.Diagnostics.dll Windows.System.Diagnostics.Telemetry.PlatformTelemetryClient.dll Windows.System.Diagnostics.TraceReporting.PlatformDiagnosticActions.dll Windows.System.Launcher.dll Windows.System.Profile. 
> HardwareId.dll Windows.System.Profile.PlatformDiagnosticsAndUsageDataSettings.dll Windows.System.Profile.RetailInfo.dll Windows.System.Profile.SystemId.dll Windows.System.Profile.SystemManufacturers.dll Windows.System.RemoteDesktop.dll Windows.System.SystemManagement > .dll Windows.System.UserDeviceAssociation.dll Windows.System.UserProfile.DiagnosticsSettings.dll Windows.UI.Accessibility.dll Windows.UI.AppDefaults.dll Windows.UI.BioFeedback.dll Windows.UI.BlockedShutdown.dll Windows.UI.Core.TextInput.dll Windows.UI.Cred.dll Window > s.UI.CredDialogController.dll Windows.UI.dll Windows.UI.FileExplorer.dll Windows.UI.Immersive.dll Windows.UI.Input.Inking.Analysis.dll Windows.UI.Input.Inking.dll Windows.UI.Internal.Input.ExpressiveInput.dll Windows.UI.Internal.Input.ExpressiveInput.Resource.dll Win > dows.UI.Logon.dll Windows.UI.NetworkUXController.dll Windows.UI.PicturePassword.dll Windows.UI.Search.dll Windows.UI.Shell.dll Windows.UI.Shell.Internal.AdaptiveCards.dll Windows.UI.Storage.dll Windows.UI.Xaml.Controls.dll Windows.UI.Xaml.dll Windows.UI.Xaml.InkContr > ols.dll Windows.UI.Xaml.Maps.dll Windows.UI.Xaml.Phone.dll Windows.UI.Xaml.Resources.19h1.dll Windows.UI.Xaml.Resources.Common.dll Windows.UI.Xaml.Resources.rs1.dll Windows.UI.Xaml.Resources.rs2.dll Windows.UI.Xaml.Resources.rs3.dll Windows.UI.Xaml.Resources.rs4.dll > Windows.UI.Xaml.Resources.rs5.dll Windows.UI.Xaml.Resources.th.dll Windows.UI.Xaml.Resources.win81.dll Windows.UI.Xaml.Resources.win8rtm.dll Windows.UI.XamlHost.dll Windows.WARP.JITService.dll Windows.WARP.JITService.exe Windows.Web.Diagnostics.dll Windows.Web.dll Wi > ndows.Web.Http.dll WindowsActionDialog.exe WindowsCodecs.dll WindowsCodecsExt.dll WindowsCodecsRaw.dll WindowsCodecsRaw.txt WindowsDefaultHeatProcessor.dll windowsdefenderapplicationguardcsp.dll WindowsInternal.ComposableShell.ComposerFramework.dll WindowsInternal.Co > mposableShell.DesktopHosting.dll WindowsInternal.Shell.CompUiActivation.dll WindowsIoTCsp.dll 
windowslivelogin.dll WindowsManagementServiceWinRt.ProxyStub.dll windowsperformancerecordercontrol.dll WindowsPowerShell WindowsSecurityIcon.png windowsudk.shellcommon.dll W > indowsUpdateElevatedInstaller.exe winethc.dll winevt WinFax.dll winhttp.dll winhttpcom.dll WinHvEmulation.dll WinHvPlatform.dll wininet.dll wininetlui.dll wininit.exe wininitext.dll winipcfile.dll winipcsecproc.dll winipsec.dll winjson.dll Winlangdb.dll winload.efi w > inload.exe winlogon.exe winlogonext.dll winmde.dll WinMetadata winml.dll winmm.dll winmmbase.dll winmsipc.dll WinMsoIrmProtector.dll winnlsres.dll winnsi.dll WinOpcIrmProtector.dll WinREAgent.dll winresume.efi winresume.exe winrm winrm.cmd winrm.vbs winrnr.dll winrs. > exe winrscmd.dll winrshost.exe winrsmgr.dll winrssrv.dll WinRTNetMUAHostServer.exe WinRtTracing.dll WinSAT.exe WinSATAPI.dll WinSCard.dll WinSetupUI.dll winshfhc.dll winsku.dll winsockhc.dll winspool.drv winsqlite3.dll WINSRPC.DLL winsrv.dll winsrvext.dll winsta.dll > WinSync.dll WinSyncMetastore.dll WinSyncProviders.dll wintrust.dll WinTypes.dll winusb.dll winver.exe WiredNetworkCSP.dll wisp.dll witnesswmiv2provider.dll wkscli.dll wkspbroker.exe wkspbrokerAx.dll wksprt.exe wksprtPS.dll wkssvc.dll wlanapi.dll wlancfg.dll WLanConn. 
> dll wlandlg.dll wlanext.exe wlangpui.dll WLanHC.dll wlanhlp.dll WlanMediaManager.dll WlanMM.dll wlanmsm.dll wlanpref.dll WlanRadioManager.dll wlansec.dll wlansvc.dll wlansvcpal.dll wlanui.dll wlanutil.dll Wldap32.dll wldp.dll wlgpclnt.dll wlidcli.dll wlidcredprov.dll > wlidfdp.dll wlidnsp.dll wlidprov.dll wlidres.dll wlidsvc.dll wlrmdr.exe WMADMOD.DLL WMADMOE.DLL WMALFXGFXDSP.dll WMASF.DLL wmcodecdspps.dll wmdmlog.dll wmdmps.dll wmdrmsdk.dll wmerror.dll wmi.dll wmiclnt.dll wmicmiplugin.dll wmidcom.dll wmidx.dll WmiMgmt.msc wmiprop > .dll wmitomi.dll WMNetMgr.dll wmp.dll WMPDMC.exe WmpDui.dll wmpdxm.dll wmpeffects.dll WMPhoto.dll wmploc.DLL wmpps.dll wmpshell.dll wmsgapi.dll WMSPDMOD.DLL WMSPDMOE.DLL WMVCORE.DLL WMVDECOD.DLL wmvdspa.dll WMVENCOD.DLL WMVSDECD.DLL WMVSENCD.DLL WMVXENCD.DLL WofTasks > .dll WofUtil.dll WordBreakers.dll WorkFolders.exe WorkfoldersControl.dll WorkFoldersGPExt.dll WorkFoldersRes.dll WorkFoldersShell.dll workfolderssvc.dll wosc.dll wow64.dll wow64cpu.dll wow64win.dll wowreg32.exe WpAXHolder.dll wpbcreds.dll Wpc.dll WpcApi.dll wpcatltoa > st.png WpcDesktopMonSvc.dll WpcMon.exe wpcmon.png WpcProxyStubs.dll WpcRefreshTask.dll WpcTok.exe WpcWebFilter.dll wpdbusenum.dll WpdMtp.dll WpdMtpUS.dll wpdshext.dll WPDShextAutoplay.exe WPDShServiceObj.dll WPDSp.dll wpd_ci.dll wpnapps.dll wpnclient.dll wpncore.dll > wpninprc.dll wpnpinst.exe wpnprv.dll wpnservice.dll wpnsruprov.dll WpnUserService.dll WpPortingLibrary.dll WppRecorderUM.dll wpr.config.xml wpr.exe WPTaskScheduler.dll wpx.dll write.exe ws2help.dll ws2_32.dll wscadminui.exe wscapi.dll wscinterop.dll wscisvif.dll WSCl > ient.dll WSCollect.exe wscproxystub.dll wscript.exe wscsvc.dll wscui.cpl WSDApi.dll wsdchngr.dll WSDPrintProxy.DLL WsdProviderUtil.dll WSDScanProxy.dll wsecedit.dll wsepno.dll wshbth.dll wshcon.dll wshelper.dll wshext.dll wshhyperv.dll wship6.dll wshom.ocx wshqos.dll > wshrm.dll WSHTCPIP.DLL wshunix.dll wsl.exe wslapi.dll WsmAgent.dll wsmanconfig_schema.xml WSManHTTPConfig.exe 
WSManMigrationPlugin.dll WsmAuto.dll wsmplpxy.dll wsmprovhost.exe WsmPty.xsl WsmRes.dll WsmSvc.dll WsmTxt.xsl WsmWmiPl.dll wsnmp32.dll wsock32.dll wsplib.dl > l wsp_fs.dll wsp_health.dll wsp_sr.dll wsqmcons.exe WSReset.exe WSTPager.ax wtsapi32.dll wuapi.dll wuapihost.exe wuauclt.exe wuaueng.dll wuceffects.dll WUDFCoinstaller.dll WUDFCompanionHost.exe WUDFHost.exe WUDFPlatform.dll WudfSMCClassExt.dll WUDFx.dll WUDFx02000.dl > l wudriver.dll wups.dll wups2.dll wusa.exe wuuhext.dll wuuhosdeployment.dll wvc.dll WwaApi.dll WwaExt.dll WWAHost.exe WWanAPI.dll wwancfg.dll wwanconn.dll WWanHC.dll wwanmm.dll Wwanpref.dll wwanprotdim.dll WwanRadioManager.dll wwansvc.dll wwapi.dll XamlTileRender.dll XAudio2_8.dll XAudio2_9.dll XblAuthManager.dll XblAuthManagerProxy.dll XblAuthTokenBrokerExt.dll XblGameSave.dll XblGameSaveExt.dll XblGameSaveProxy.dll XblGameSaveTask.exe XboxGipRadioManager.dll xboxgipsvc.dll xboxgipsynthetic.dll XboxNetApiSvc.dll xcopy.exe XInput1_4.dll XInput9_1_0.dll XInputUap.dll xmlfilter.dll xmllite.dll xmlprovi.dll xolehlp.dll XpsDocumentTargetPrint.dll XpsGdiConverter.dll XpsPrint.dll xpspushlayer.dll XpsRasterService.dll xpsservices.dll XpsToPclmConverter.dll XpsToPwgrConverter.dll xwizard.dtd xwizard.exe xwizards.dll xwreg.dll xwtpdui.dll xwtpw32.dll X_80.contrast-black.png X_80.contrast-white.png X_80.png ze_loader.dll ze_tracing_layer.dll ze_validation_layer.dll zh-CN zh-TW zipcontainer.dll zipfldr.dll ztrace_maps.dll > /cygdrive/c/Windows: addins AhnInst.log appcompat Application Data apppatch AppReadiness assembly bcastdvr bfsvc.exe BitLockerDiscoveryVolumeContents Boot bootstat.dat Branding CbsTemp Containers CSC Cursors debug diagnostics DiagTrack DigitalLocker Downloaded > Program Files DtcInstall.log ELAMBKUP en-US explorer.exe Fonts GameBarPresenceWriter gethelp_audiotroubleshooter_latestpackage.zip Globalization Help HelpPane.exe hh.exe hipiw.dll IdentityCRL ImageSAFERSvc.exe IME IMGSF50Svc.exe ImmersiveControlPanel INF InputMethod > 
Installer ko-KR L2Schemas LanguageOverlayCache LiveKernelReports Logs lsasetup.log Media mib.bin Microsoft.NET Migration ModemLogs notepad.exe OCR Offline Web Pages Panther Performance PFRO.log PLA PolicyDefinitions Prefetch PrintDialog Professional.xml Provisioning > regedit.exe Registration RemotePackages rescache Resources RtlExUpd.dll SchCache schemas security ServiceProfiles ServiceState servicing Setup setupact.log setuperr.log ShellComponents ShellExperiences SHELLNEW SKB SoftwareDistribution Speech Speech_OneCore splwow64. > exe System system.ini System32 SystemApps SystemResources SystemTemp SysWOW64 TAPI Tasks Temp TempInst tracing twain_32 twain_32.dll Vss WaaS Web win.ini WindowsShell.Manifest WindowsUpdate.log winhlp32.exe WinSxS WMSysPr9.prx write.exe > /cygdrive/c/Windows/System32/Wbem: aeinv.mof AgentWmi.mof AgentWmiUninstall.mof appbackgroundtask.dll appbackgroundtask.mof appbackgroundtask_uninstall.mof AuditRsop.mof authfwcfg.mof AutoRecover bcd.mof BthMtpEnum.mof cimdmtf.mof cimwin32.dll cimwin32.mof CIWm > i.mof classlog.mof cli.mof cliegaliases.mof ddp.mof dimsjob.mof dimsroam.mof DMWmiBridgeProv.dll DMWmiBridgeProv.mof DMWmiBridgeProv1.dll DMWmiBridgeProv1.mof DMWmiBridgeProv1_Uninstall.mof DMWmiBridgeProv_Uninstall.mof dnsclientcim.dll dnsclientcim.mof dnsclientpspr > ovider.dll dnsclientpsprovider.mof dnsclientpsprovider_Uninstall.mof drvinst.mof DscCore.mof DscCoreConfProv.mof dscproxy.mof Dscpspluginwkr.dll DscTimer.mof dsprov.dll dsprov.mof eaimeapi.mof EmbeddedLockdownWmi.dll embeddedlockdownwmi.mof embeddedlockdownwmi_Uninst > all.mof en en-US esscli.dll EventTracingManagement.dll EventTracingManagement.mof fastprox.dll fdPHost.mof fdrespub.mof fdSSDP.mof fdWNet.mof fdWSD.mof filetrace.mof firewallapi.mof FolderRedirectionWMIProvider.mof FunDisc.mof fwcfg.mof hbaapi.mof hnetcfg.mof IMAPIv2 > -Base.mof IMAPIv2-FileSystemSupport.mof IMAPIv2-LegacyShim.mof interop.mof IpmiDTrc.mof ipmiprr.dll ipmiprv.dll ipmiprv.mof IpmiPTrc.mof 
ipsecsvc.mof iscsidsc.mof iscsihba.mof iscsiprf.mof iscsirem.mof iscsiwmiv2.mof iscsiwmiv2_uninstall.mof kerberos.mof ko ko-KR Krn > lProv.dll krnlprov.mof L2SecHC.mof lltdio.mof lltdsvc.mof Logs lsasrv.mof mblctr.mof MDMAppProv.dll MDMAppProv.mof MDMAppProv_Uninstall.mof MDMSettingsProv.dll MDMSettingsProv.mof MDMSettingsProv_Uninstall.mof Microsoft-Windows-OfflineFiles.mof Microsoft-Windows-Remo > te-FileSystem.mof Microsoft.AppV.AppVClientWmi.dll Microsoft.AppV.AppVClientWmi.mof Microsoft.Uev.AgentWmi.dll Microsoft.Uev.ManagedAgentWmi.mof Microsoft.Uev.ManagedAgentWmiUninstall.mof mispace.mof mispace_uninstall.mof mmc.mof MMFUtil.dll MOF mofcomp.exe mofd.dll > mofinstall.dll mountmgr.mof mpeval.mof mpsdrv.mof mpssvc.mof msdtcwmi.dll MsDtcWmi.mof msfeeds.mof msfeedsbs.mof msi.mof msiprov.dll msiscsi.mof MsNetImPlatform.mof mstsc.mof mstscax.mof msv1_0.mof mswmdm.mof NCProv.dll ncprov.mof ncsi.mof ndisimplatcim.dll ndistrace > .mof NetAdapterCim.dll NetAdapterCim.mof NetAdapterCimTrace.mof NetAdapterCimTraceUninstall.mof NetAdapterCim_uninstall.mof netdacim.dll netdacim.mof netdacim_uninstall.mof NetEventPacketCapture.dll NetEventPacketCapture.mof NetEventPacketCapture_uninstall.mof netncc > im.dll netnccim.mof netnccim_uninstall.mof NetPeerDistCim.dll NetPeerDistCim.mof NetPeerDistCim_uninstall.mof netprofm.mof NetSwitchTeam.mof netswitchteamcim.dll NetTCPIP.dll NetTCPIP.mof NetTCPIP_Uninstall.mof netttcim.dll netttcim.mof netttcim_uninstall.mof network > itemfactory.mof newdev.mof nlasvc.mof nlmcim.dll nlmcim.mof nlmcim_uninstall.mof nlsvc.mof npivwmi.mof nshipsec.mof ntevt.dll ntevt.mof ntfs.mof OfflineFilesConfigurationWmiProvider.mof OfflineFilesConfigurationWmiProvider_Uninstall.mof OfflineFilesWmiProvider.mof Of > flineFilesWmiProvider_Uninstall.mof p2p-mesh.mof p2p-pnrp.mof pcsvDevice.mof pcsvDevice_Uninstall.mof Performance PNPXAssoc.mof PolicMan.dll PolicMan.mof polproc.mof polprocl.mof polprou.mof polstore.mof portabledeviceapi.mof 
portabledeviceclassextension.mof portable > deviceconnectapi.mof portabledevicetypes.mof portabledevicewiacompat.mof powermeterprovider.mof PowerPolicyProvider.mof ppcRsopCompSchema.mof ppcRsopUserSchema.mof PrintFilterPipelineSvc.mof PrintManagementProvider.dll PrintManagementProvider.mof PrintManagementProvider_Uninstall.mof profileassociationprovider.mof PS_MMAgent.mof qmgr.mof qoswmi.dll qoswmi.mof qoswmitrc.mof qoswmitrc_uninstall.mof qoswmi_uninstall.mof RacWmiProv.dll RacWmiProv.mof rawxml.xsl rdpendp.mof rdpinit.mof rdpshell.mof refs.mof refsv1.mof regevent.mof Remove.Microsoft.AppV.AppvClientWmi.mof repdrvfs.dll Repository rsop.mof rspndr.mof samsrv.mof scersop.mof schannel.mof schedprov.dll SchedProv.mof scm.mof scrcons.exe scrcons.mof sdbus.mof secrcw32.mof SensorsClassExtension.mof ServDeps.dll ServiceModel.mof ServiceModel.mof.uninstall ServiceModel35.mof ServiceModel35.mof.uninstall services.mof setupapi.mof SmbWitnessWmiv2Provider.mof smbwmiv2.mof SMTPCons.dll smtpcons.mof sppwmi.mof sr.mof sstpsvc.mof stdprov.dll storagewmi.mof storagewmi_passthru.mof storagewmi_passthru_uninstall.mof storagewmi_uninstall.mof stortrace.mof subscrpt.mof system.mof tcpip.mof texttable.xsl textvaluelist.xsl tmf tsallow.mof tscfgwmi.mof tsmf.mof tspkg.mof umb.mof umbus.mof umpass.mof umpnpmgr.mof unsecapp.exe UserProfileConfigurationWmiProvider.mof UserProfileWmiProvider.mof UserStateWMIProvider.mof vds.mof vdswmi.dll viewprov.dll vpnclientpsprovider.dll vpnclientpsprovider.mof vpnclientpsprovider_Uninstall.mof vss.mof vsswmi.dll wbemcntl.dll wbemcons.dll WBEMCons.mof wbemcore.dll wbemdisp.dll wbemdisp.tlb wbemess.dll wbemprox.dll wbemsvc.dll wbemtest.exe wcncsvc.mof WdacEtwProv.mof WdacWmiProv.dll WdacWmiProv.mof WdacWmiProv_Uninstall.mof Wdf01000.mof Wdf01000Uninstall.mof wdigest.mof WFAPIGP.mof wfascim.dll wfascim.mof wfascim_uninstall.mof WFP.MOF wfs.mof whqlprov.mof Win32_DeviceGuard.mof Win32_EncryptableVolume.dll win32_encryptablevolume.mof 
Win32_EncryptableVolumeUninstall.mof win32_printer.mof Win32_Tpm.dll Win32_Tpm.mof wininit.mof winipsec.mof winlogon.mof WinMgmt.exe WinMgmtR.dll Winsat.mof WinsatUninstall.mof wlan.mof WLanHC.mof wmi.mof WMIADAP.exe WmiApRes.dll WmiApRpl.dll WmiApSrv.exe WMIC.exe WMICOOKR.dll WmiDcPrv.dll wmipcima.dll wmipcima.mof wmipdfs.dll wmipdfs.mof wmipdskq.dll wmipdskq.mof WmiPerfClass.dll WmiPerfClass.mof WmiPerfInst.dll WmiPerfInst.mof WMIPICMP.dll wmipicmp.mof WMIPIPRT.dll wmipiprt.mof WMIPJOBJ.dll wmipjobj.mof wmiprov.dll WmiPrvSD.dll WmiPrvSE.exe WMIPSESS.dll wmipsess.mof WMIsvc.dll wmitimep.dll wmitimep.mof wmiutils.dll WMI_Tracing.mof wmp.mof wmpnetwk.mof wpdbusenum.mof wpdcomp.mof wpdfs.mof wpdmtp.mof wpdshext.mof WPDShServiceObj.mof wpdsp.mof wpd_ci.mof wscenter.mof WsmAgent.mof WsmAgentUninstall.mof WsmAuto.mof wsp_fs.mof wsp_fs_uninstall.mof wsp_health.mof wsp_health_uninstall.mof wsp_sr.mof wsp_sr_uninstall.mof WUDFx.mof Wudfx02000.mof Wudfx02000Uninstall.mof WUDFxUninstall.mof xml xsl-mappings.xml xwizards.mof > /cygdrive/c/Windows/System32/WindowsPowerShell/v1.0: Certificate.format.ps1xml Diagnostics.Format.ps1xml DotNetTypes.format.ps1xml en en-US Event.Format.ps1xml Examples FileSystem.format.ps1xml getevent.types.ps1xml Help.format.ps1xml HelpV3.format.ps1xml ko ko-KR Modules powershell.exe powershell.exe.config PowerShellCore.format.ps1xml PowerShellTrace.format.ps1xml powershell_ise.exe powershell_ise.exe.config PSEvents.dll pspluginwkr.dll pwrshmsg.dll pwrshsip.dll Registry.format.ps1xml Schemas SessionConfig types.ps1xml typesv3.ps1xml WSMan.Format.ps1xml > /cygdrive/c/Windows/System32/OpenSSH: scp.exe sftp.exe ssh-add.exe ssh-agent.exe ssh-keygen.exe ssh-keyscan.exe ssh.exe > /cygdrive/c/Program Files/MATLAB/R2020b/bin: crash_analyzer.cfg icutzdata lcdata.xml lcdata.xsd lcdata_utf8.xml m3iregistry matlab.exe mex.bat mexext.bat util win32 win64 > /cygdrive/c/Program Files/Microsoft SQL Server/130/Tools/Binn: Resources SqlLocalDB.exe > 
/cygdrive/c/Program Files/Microsoft SQL Server/Client SDK/ODBC/170/Tools/Binn: batchparser.dll bcp.exe Resources SQLCMD.EXE xmlrw.dll > /cygdrive/c/Program Files/Git/cmd: git-gui.exe git-lfs.exe git.exe gitk.exe start-ssh-agent.cmd start-ssh-pageant.cmd > Warning accessing /cygdrive/c/msys64/mingw64/bin gives errors: [Errno 2] No such file or directory: '/cygdrive/c/msys64/mingw64/bin' > Warning accessing /cygdrive/c/msys64/usr/bin gives errors: [Errno 2] No such file or directory: '/cygdrive/c/msys64/usr/bin' > /cygdrive/c/Program Files (x86)/Microsoft Visual Studio/2019/Community/VC/Tools/MSVC/14.29.30133/bin/Hostx64/x64: 1033 asan_blacklist.txt atlprov.dll bscmake.exe c1.dll c1xx.dll c2.dll cfgpersist.dll cl.exe cl.exe.config clang_rt.asan_dbg_dynamic-x86_64.dll clang_rt.asan_dynamic-x86_64.dll ConcurrencyCheck.dll CppBuildInsights.dll CppBuildInsightsEtw.xml CppCoreCheck.dll cvtres.exe d3dcompiler_47.dll dpcmi.dll dumpbin.exe editbin.exe EnumIndex.dll EspXEngine.dll HResultCheck.dll KernelTraceControl.dll lib.exe link.exe link.exe.config llvm-symbolizer.exe LocalESPC.dll Microsoft.Diagnostics.Tracing.EventSource.dll Microsoft.VisualStudio.RemoteControl.dll Microsoft.VisualStudio.Telemetry.dll Microsoft.VisualStudio.Utilities.Internal.dll ml64.exe msobj140.dll mspdb140.dll mspdbcmf.exe mspdbcore.dll mspdbsrv.exe mspdbst.dll mspft140.dll msvcdis140.dll msvcp140.dll msvcp140_1.dll msvcp140_2.dll msvcp140_atomic_wait.dll msvcp140_codecvt_ids.dll Newtonsoft.Json.dll nmake.exe onecore perf_msvcbuildinsights.dll pgocvt.exe pgodb140.dll pgodriver.sys pgomgr.exe pgort140.dll pgosweep.exe System.Runtime.CompilerServices.Unsafe.dll tbbmalloc.dll undname.exe VariantClear.dll vcmeta.dll vcperf.exe vcruntime140.dll vcruntime140_1.dll vctip.exe xdcmake.exe xdcmake.exe.config > /cygdrive/c/Program Files/dotnet: dotnet.exe host LICENSE.txt packs sdk shared templates ThirdPartyNotices.txt > /: bin Cygwin-Terminal.ico Cygwin.bat Cygwin.ico dev etc home lib mpich-4.0.2 
mpich-4.0.2.tar.gz sbin tmp usr var proc cygdrive > /cygdrive/c/Users/SEJONG/AppData/Local/Microsoft/WindowsApps: Backup GameBarElevatedFT_Alias.exe Microsoft.DesktopAppInstaller_8wekyb3d8bbwe Microsoft.MicrosoftEdge_8wekyb3d8bbwe Microsoft.SkypeApp_kzf8qxf38zg5c Microsoft.XboxGamingOverlay_8wekyb3d8bbwe MicrosoftEdge.exe python.exe python3.exe Skype.exe winget.exe > /cygdrive/c/Users/SEJONG/AppData/Local/Programs/Microsoft VS Code/bin: code code.cmd > /cygdrive/c/Program Files (x86)/Microsoft Visual Studio/2019/Community/VC/Tools/MSVC/14.29.30133/bin/Hostx64/x64: 1033 asan_blacklist.txt atlprov.dll bscmake.exe c1.dll c1xx.dll c2.dll cfgpersist.dll cl.exe cl.exe.config clang_rt.asan_dbg_dynamic-x86_64.dll clang_rt.asan_dynamic-x86_64.dll ConcurrencyCheck.dll CppBuildInsights.dll CppBuildInsightsEtw.xml CppCoreCheck.dll cvtres.exe d3dcompiler_47.dll dpcmi.dll dumpbin.exe editbin.exe EnumIndex.dll EspXEngine.dll HResultCheck.dll KernelTraceControl.dll lib.exe link.exe link.exe.config llvm-symbolizer.exe LocalESPC.dll Microsoft.Diagnostics.Tracing.EventSource.dll Microsoft.VisualStudio.RemoteControl.dll Microsoft.VisualStudio.Telemetry.dll Microsoft.VisualStudio.Utilities.Internal.dll ml64.exe msobj140.dll mspdb140.dll mspdbcmf.exe mspdbcore.dll mspdbsrv.exe mspdbst.dll mspft140.dll msvcdis140.dll msvcp140.dll msvcp140_1.dll msvcp140_2.dll msvcp140_atomic_wait.dll msvcp140_codecvt_ids.dll Newtonsoft.Json.dll nmake.exe onecore perf_msvcbuildinsights.dll pgocvt.exe pgodb140.dll pgodriver.sys pgomgr.exe pgort140.dll pgosweep.exe System.Runtime.CompilerServices.Unsafe.dll tbbmalloc.dll undname.exe VariantClear.dll vcmeta.dll vcperf.exe vcruntime140.dll vcruntime140_1.dll vctip.exe xdcmake.exe xdcmake.exe.config > Warning accessing /cygdrive/c/Users/SEJONG/.dotnet/tools gives errors: [Errno 2] No such file or directory: '/cygdrive/c/Users/SEJONG/.dotnet/tools' > /usr/lib/lapack: cygblas-0.dll cyglapack-0.dll > 
============================================================================================= > TESTING: configureExternalPackagesDir from config.framework(/home/SEJONG/petsc-3.18.1/config/BuildSystem/config/framework.py:1045) > Set alternative directory external packages are built in > serialEvaluation: initial cxxDialectRanges ('c++11', 'c++17') > serialEvaluation: new cxxDialectRanges ('c++11', 'c++17') > child config.utilities.macosFirewall took 0.000005 seconds > ============================================================================================= > TESTING: configureDebuggers from config.utilities.debuggers(/home/SEJONG/petsc-3.18.1/config/BuildSystem/config/utilities/debuggers.py:20) > Find a default debugger and determine its arguments > Checking for program /usr/local/bin/gdb...not found > Checking for program /usr/bin/gdb...not found > Checking for program /cygdrive/c/SIMULIA/Commands/gdb...not found > Checking for program /cygdrive/c/Program Files/Microsoft MPI/Bin/gdb...not found > Checking for program /cygdrive/c/Windows/system32/gdb...not found > Checking for program /cygdrive/c/Windows/gdb...not found > Checking for program /cygdrive/c/Windows/System32/Wbem/gdb...not found > Checking for program /cygdrive/c/Windows/System32/WindowsPowerShell/v1.0/gdb...not found > Checking for program /cygdrive/c/Windows/System32/OpenSSH/gdb...not found > Checking for program /cygdrive/c/Program Files/MATLAB/R2020b/bin/gdb...not found > Checking for program /cygdrive/c/Program Files/Microsoft SQL Server/130/Tools/Binn/gdb...not found > Checking for program /cygdrive/c/Program Files/Microsoft SQL Server/Client SDK/ODBC/170/Tools/Binn/gdb...not found > Checking for program /cygdrive/c/Program Files/Git/cmd/gdb...not found > Checking for program /cygdrive/c/msys64/mingw64/bin/gdb...not found > Checking for program /cygdrive/c/msys64/usr/bin/gdb...not found > Checking for program /cygdrive/c/Program Files (x86)/Microsoft Visual 
Studio/2019/Community/VC/Tools/MSVC/14.29.30133/bin/Hostx64/x64/gdb...not found > Checking for program /cygdrive/c/Program Files/dotnet/gdb...not found > Checking for program /gdb...not found > Checking for program /cygdrive/c/Users/SEJONG/AppData/Local/Microsoft/WindowsApps/gdb...not found > Checking for program /cygdrive/c/Users/SEJONG/AppData/Local/Programs/Microsoft VS Code/bin/gdb...not found > Checking for program /cygdrive/c/Program Files (x86)/Microsoft Visual Studio/2019/Community/VC/Tools/MSVC/14.29.30133/bin/Hostx64/x64/gdb...not found > Checking for program /cygdrive/c/Users/SEJONG/.dotnet/tools/gdb...not found > Checking for program /usr/lib/lapack/gdb...not found > Checking for program /usr/local/bin/dbx...not found > Checking for program /usr/bin/dbx...not found > Checking for program /cygdrive/c/SIMULIA/Commands/dbx...not found > Checking for program /cygdrive/c/Program Files/Microsoft MPI/Bin/dbx...not found > Checking for program /cygdrive/c/Windows/system32/dbx...not found > Checking for program /cygdrive/c/Windows/dbx...not found > Checking for program /cygdrive/c/Windows/System32/Wbem/dbx...not found > Checking for program /cygdrive/c/Windows/System32/WindowsPowerShell/v1.0/dbx...not found > Checking for program /cygdrive/c/Windows/System32/OpenSSH/dbx...not found > Checking for program /cygdrive/c/Program Files/MATLAB/R2020b/bin/dbx...not found > Checking for program /cygdrive/c/Program Files/Microsoft SQL Server/130/Tools/Binn/dbx...not found > Checking for program /cygdrive/c/Program Files/Microsoft SQL Server/Client SDK/ODBC/170/Tools/Binn/dbx...not found > Checking for program /cygdrive/c/Program Files/Git/cmd/dbx...not found > Checking for program /cygdrive/c/msys64/mingw64/bin/dbx...not found > Checking for program /cygdrive/c/msys64/usr/bin/dbx...not found > Checking for program /cygdrive/c/Program Files (x86)/Microsoft Visual Studio/2019/Community/VC/Tools/MSVC/14.29.30133/bin/Hostx64/x64/dbx...not found > Checking for program 
/cygdrive/c/Program Files/dotnet/dbx...not found > Checking for program /dbx...not found > Checking for program /cygdrive/c/Users/SEJONG/AppData/Local/Microsoft/WindowsApps/dbx...not found > Checking for program /cygdrive/c/Users/SEJONG/AppData/Local/Programs/Microsoft VS Code/bin/dbx...not found > Checking for program /cygdrive/c/Program Files (x86)/Microsoft Visual Studio/2019/Community/VC/Tools/MSVC/14.29.30133/bin/Hostx64/x64/dbx...not found > Checking for program /cygdrive/c/Users/SEJONG/.dotnet/tools/dbx...not found > Checking for program /usr/lib/lapack/dbx...not found > Defined make macro "DSYMUTIL" to "true" > child config.utilities.debuggers took 0.014310 seconds > ============================================================================================= > TESTING: configureDirectories from PETSc.options.petscdir(/home/SEJONG/petsc-3.18.1/config/PETSc/options/petscdir.py:22) > Checks PETSC_DIR and sets if not set > PETSC_VERSION_RELEASE of 1 indicates the code is from a release branch or a branch created from a release branch. 
> Version Information: > #define PETSC_VERSION_RELEASE 1 > #define PETSC_VERSION_MAJOR 3 > #define PETSC_VERSION_MINOR 18 > #define PETSC_VERSION_SUBMINOR 1 > #define PETSC_VERSION_DATE "Oct 26, 2022" > #define PETSC_VERSION_GIT "v3.18.1" > #define PETSC_VERSION_DATE_GIT "2022-10-26 07:57:29 -0500" > #define PETSC_VERSION_EQ(MAJOR,MINOR,SUBMINOR) \ > #define PETSC_VERSION_ PETSC_VERSION_EQ > #define PETSC_VERSION_LT(MAJOR,MINOR,SUBMINOR) \ > #define PETSC_VERSION_LE(MAJOR,MINOR,SUBMINOR) \ > #define PETSC_VERSION_GT(MAJOR,MINOR,SUBMINOR) \ > #define PETSC_VERSION_GE(MAJOR,MINOR,SUBMINOR) \ > child PETSc.options.petscdir took 0.015510 seconds > ============================================================================================= > TESTING: getDatafilespath from PETSc.options.dataFilesPath(/home/SEJONG/petsc-3.18.1/config/PETSc/options/dataFilesPath.py:29) > Checks what DATAFILESPATH should be > child PETSc.options.dataFilesPath took 0.002462 seconds > ============================================================================================= > TESTING: configureGit from config.sourceControl(/home/SEJONG/petsc-3.18.1/config/BuildSystem/config/sourceControl.py:24) > Find the Git executable > Checking for program /usr/local/bin/git...not found > Checking for program /usr/bin/git...found > Defined make macro "GIT" to "git" > Executing: git --version > stdout: git version 2.38.1 > ============================================================================================= > TESTING: configureMercurial from config.sourceControl(/home/SEJONG/petsc-3.18.1/config/BuildSystem/config/sourceControl.py:35) > Find the Mercurial executable > Checking for program /usr/local/bin/hg...not found > Checking for program /usr/bin/hg...not found > Checking for program /cygdrive/c/SIMULIA/Commands/hg...not found > Checking for program /cygdrive/c/Program Files/Microsoft MPI/Bin/hg...not found > Checking for program /cygdrive/c/Windows/system32/hg...not found > Checking for 
program /cygdrive/c/Windows/hg...not found > Checking for program /cygdrive/c/Windows/System32/Wbem/hg...not found > Checking for program /cygdrive/c/Windows/System32/WindowsPowerShell/v1.0/hg...not found > Checking for program /cygdrive/c/Windows/System32/OpenSSH/hg...not found > Checking for program /cygdrive/c/Program Files/MATLAB/R2020b/bin/hg...not found > Checking for program /cygdrive/c/Program Files/Microsoft SQL Server/130/Tools/Binn/hg...not found > Checking for program /cygdrive/c/Program Files/Microsoft SQL Server/Client SDK/ODBC/170/Tools/Binn/hg...not found > Checking for program /cygdrive/c/Program Files/Git/cmd/hg...not found > Checking for program /cygdrive/c/msys64/mingw64/bin/hg...not found > Checking for program /cygdrive/c/msys64/usr/bin/hg...not found > Checking for program /cygdrive/c/Program Files (x86)/Microsoft Visual Studio/2019/Community/VC/Tools/MSVC/14.29.30133/bin/Hostx64/x64/hg...not found > Checking for program /cygdrive/c/Program Files/dotnet/hg...not found > Checking for program /hg...not found > Checking for program /cygdrive/c/Users/SEJONG/AppData/Local/Microsoft/WindowsApps/hg...not found > Checking for program /cygdrive/c/Users/SEJONG/AppData/Local/Programs/Microsoft VS Code/bin/hg...not found > Checking for program /cygdrive/c/Program Files (x86)/Microsoft Visual Studio/2019/Community/VC/Tools/MSVC/14.29.30133/bin/Hostx64/x64/hg...not found > Checking for program /cygdrive/c/Users/SEJONG/.dotnet/tools/hg...not found > Checking for program /usr/lib/lapack/hg...not found > Checking for program /home/SEJONG/petsc-3.18.1/lib/petsc/bin/win32fe/hg...not found > child config.sourceControl took 0.121914 seconds > ============================================================================================= > TESTING: configureInstallationMethod from PETSc.options.petscclone(/home/SEJONG/petsc-3.18.1/config/PETSc/options/petscclone.py:20) > Determine if PETSc was obtained via git or a tarball > This is a tarball installation > child 
PETSc.options.petscclone took 0.003125 seconds > ============================================================================================= > TESTING: setNativeArchitecture from PETSc.options.arch(/home/SEJONG/petsc-3.18.1/config/PETSc/options/arch.py:29) > Forms the arch as GNU's configure would form it > ============================================================================================= > TESTING: configureArchitecture from PETSc.options.arch(/home/SEJONG/petsc-3.18.1/config/PETSc/options/arch.py:42) > Checks if PETSC_ARCH is set and sets it if not set > No previous hashfile found > Setting hashfile: arch-mswin-c-debug/lib/petsc/conf/configure-hash > Deleting configure hash file: arch-mswin-c-debug/lib/petsc/conf/configure-hash > Unable to delete configure hash file: arch-mswin-c-debug/lib/petsc/conf/configure-hash > child PETSc.options.arch took 0.149094 seconds > ============================================================================================= > TESTING: setInstallDir from PETSc.options.installDir(/home/SEJONG/petsc-3.18.1/config/PETSc/options/installDir.py:31) > Set installDir to either prefix or if that is not set to PETSC_DIR/PETSC_ARCH > Defined make macro "PREFIXDIR" to "/home/SEJONG/petsc-3.18.1/arch-mswin-c-debug" > ============================================================================================= > TESTING: saveReconfigure from PETSc.options.installDir(/home/SEJONG/petsc-3.18.1/config/PETSc/options/installDir.py:76) > Save the configure options in a script in PETSC_ARCH/lib/petsc/conf so the same configure may be easily re-run > ============================================================================================= > TESTING: cleanConfDir from PETSc.options.installDir(/home/SEJONG/petsc-3.18.1/config/PETSc/options/installDir.py:68) > Remove all the files from configuration directory for this PETSC_ARCH, from --with-clean option > 
============================================================================================= > TESTING: configureInstallDir from PETSc.options.installDir(/home/SEJONG/petsc-3.18.1/config/PETSc/options/installDir.py:52) > Makes installDir subdirectories if it does not exist for both prefix install location and PETSc work install location > Changed persistence directory to /home/SEJONG/petsc-3.18.1/arch-mswin-c-debug/lib/petsc/conf > > TESTING: restoreReconfigure from PETSc.options.installDir(/home/SEJONG/petsc-3.18.1/config/PETSc/options/installDir.py:90) > If --with-clean was requested but restoring the reconfigure file was requested then restore it > child PETSc.options.installDir took 0.006476 seconds > ============================================================================================= > TESTING: setExternalPackagesDir from PETSc.options.externalpackagesdir(/home/SEJONG/petsc-3.18.1/config/PETSc/options/externalpackagesdir.py:15) > Set location where external packages will be downloaded to > ============================================================================================= > TESTING: cleanExternalpackagesDir from PETSc.options.externalpackagesdir(/home/SEJONG/petsc-3.18.1/config/PETSc/options/externalpackagesdir.py:23) > Remove all downloaded external packages, from --with-clean > child PETSc.options.externalpackagesdir took 0.000990 seconds > ============================================================================================= > TESTING: configureCLanguage from PETSc.options.languages(/home/SEJONG/petsc-3.18.1/config/PETSc/options/languages.py:28) > Choose whether to compile the PETSc library using a C or C++ compiler > C language is C > Defined "CLANGUAGE_C" to "1" > Defined make macro "CLANGUAGE" to "C" > child PETSc.options.languages took 0.003172 seconds > ============================================================================================= > TESTING: resetEnvCompilers from 
config.setCompilers(/home/SEJONG/petsc-3.18.1/config/BuildSystem/config/setCompilers.py:2652) > Remove compilers from the shell environment so they do not interfer with testing > ============================================================================================= > TESTING: checkEnvCompilers from config.setCompilers(/home/SEJONG/petsc-3.18.1/config/BuildSystem/config/setCompilers.py:2669) > Set configure compilers from the environment, from -with-environment-variables > ============================================================================================= > TESTING: checkMPICompilerOverride from config.setCompilers(/home/SEJONG/petsc-3.18.1/config/BuildSystem/config/setCompilers.py:2622) > Check if --with-mpi-dir is used along with CC CXX or FC compiler options. > This usually prevents mpi compilers from being used - so issue a warning > ============================================================================================= > TESTING: requireMpiLdPath from config.setCompilers(/home/SEJONG/petsc-3.18.1/config/BuildSystem/config/setCompilers.py:2643) > OpenMPI wrappers require LD_LIBRARY_PATH set > ============================================================================================= > TESTING: checkInitialFlags from config.setCompilers(/home/SEJONG/petsc-3.18.1/config/BuildSystem/config/setCompilers.py:723) > Initialize the compiler and linker flags > Initialized CFLAGS to > Initialized CFLAGS to > Initialized LDFLAGS to > Initialized CUDAFLAGS to > Initialized CUDAFLAGS to > Initialized LDFLAGS to > Initialized HIPFLAGS to > Initialized HIPFLAGS to > Initialized LDFLAGS to > Initialized SYCLFLAGS to > Initialized SYCLFLAGS to > Initialized LDFLAGS to > Initialized CXXFLAGS to > Initialized CXX_CXXFLAGS to > Initialized LDFLAGS to > Initialized FFLAGS to > Initialized FFLAGS to > Initialized LDFLAGS to > Initialized CPPFLAGS to > Initialized FPPFLAGS to > Initialized CUDAPPFLAGS to -Wno-deprecated-gpu-targets > Initialized CXXPPFLAGS to 
> Initialized HIPPPFLAGS to > Initialized SYCLPPFLAGS to > Initialized CC_LINKER_FLAGS to [] > Initialized CXX_LINKER_FLAGS to [] > Initialized FC_LINKER_FLAGS to [] > Initialized CUDAC_LINKER_FLAGS to [] > Initialized HIPC_LINKER_FLAGS to [] > Initialized SYCLC_LINKER_FLAGS to [] > > TESTING: checkCCompiler from config.setCompilers(/home/SEJONG/petsc-3.18.1/config/BuildSystem/config/setCompilers.py:1341) > Locate a functional C compiler > Checking for program /usr/local/bin/mpicc...not found > Checking for program /usr/bin/mpicc...found > Defined make macro "CC" to "mpicc" > Executing: mpicc -c -o /tmp/petsc-uqt11yqc/config.setCompilers/conftest.o -I/tmp/petsc-uqt11yqc/config.setCompilers /tmp/petsc-uqt11yqc/config.setCompilers/conftest.c > Successful compile: > Source: > #include "confdefs.h" > #include "conffix.h" > > int main() { > ; > return 0; > } > > Executing: mpicc -c -o /tmp/petsc-uqt11yqc/config.setCompilers/conftest.o -I/tmp/petsc-uqt11yqc/config.setCompilers /tmp/petsc-uqt11yqc/config.setCompilers/conftest.c > Successful compile: > Source: > #include "confdefs.h" > #include "conffix.h" > > int main() { > ; > return 0; > } > > Executing: mpicc -o /tmp/petsc-uqt11yqc/config.setCompilers/conftest.exe /tmp/petsc-uqt11yqc/config.setCompilers/conftest.o > Possible ERROR while running linker: exit code 1 > stderr: > /usr/lib/gcc/x86_64-pc-cygwin/11/../../../../x86_64-pc-cygwin/bin/ld: cannot find -lhwloc: No such file or directory > /usr/lib/gcc/x86_64-pc-cygwin/11/../../../../x86_64-pc-cygwin/bin/ld: cannot find -levent_core: No such file or directory > /usr/lib/gcc/x86_64-pc-cygwin/11/../../../../x86_64-pc-cygwin/bin/ld: cannot find -levent_pthreads: No such file or directory > /usr/lib/gcc/x86_64-pc-cygwin/11/../../../../x86_64-pc-cygwin/bin/ld: cannot find -lz: No such file or directory > collect2: error: ld returned 1 exit status > Linker output before filtering: > > /usr/lib/gcc/x86_64-pc-cygwin/11/../../../../x86_64-pc-cygwin/bin/ld: cannot find 
-lhwloc: No such file or directory > /usr/lib/gcc/x86_64-pc-cygwin/11/../../../../x86_64-pc-cygwin/bin/ld: cannot find -levent_core: No such file or directory > /usr/lib/gcc/x86_64-pc-cygwin/11/../../../../x86_64-pc-cygwin/bin/ld: cannot find -levent_pthreads: No such file or directory > /usr/lib/gcc/x86_64-pc-cygwin/11/../../../../x86_64-pc-cygwin/bin/ld: cannot find -lz: No such file or directory > collect2: error: ld returned 1 exit status > : > Linker output after filtering: > /usr/lib/gcc/x86_64-pc-cygwin/11/../../../../x86_64-pc-cygwin/bin/ld: cannot find -lhwloc: No such file or directory > /usr/lib/gcc/x86_64-pc-cygwin/11/../../../../x86_64-pc-cygwin/bin/ld: cannot find -levent_core: No such file or directory > /usr/lib/gcc/x86_64-pc-cygwin/11/../../../../x86_64-pc-cygwin/bin/ld: cannot find -levent_pthreads: No such file or directory > /usr/lib/gcc/x86_64-pc-cygwin/11/../../../../x86_64-pc-cygwin/bin/ld: cannot find -lz: No such file or directory > collect2: error: ld returned 1 exit status: > Error testing C compiler: Cannot compile/link C with mpicc. > MPI compiler wrapper mpicc failed to compile > Executing: mpicc -show > stdout: gcc -L/usr/lib -lmpi -lopen-rte -lopen-pal -lhwloc -levent_core -levent_pthreads -lz > MPI compiler wrapper mpicc is likely incorrect. > Use --with-mpi-dir to indicate an alternate MPI. > Deleting "CC" > ******************************************************************************* > UNABLE to CONFIGURE with GIVEN OPTIONS (see configure.log for details): > ------------------------------------------------------------------------------- > C compiler you provided with -with-cc=mpicc cannot be found or does not work. > Cannot compile/link C with mpicc. 
> ******************************************************************************* > File "/home/SEJONG/petsc-3.18.1/config/configure.py", line 461, in petsc_configure > framework.configure(out = sys.stdout) > File "/home/SEJONG/petsc-3.18.1/config/BuildSystem/config/framework.py", line 1412, in configure > self.processChildren() > File "/home/SEJONG/petsc-3.18.1/config/BuildSystem/config/framework.py", line 1400, in processChildren > self.serialEvaluation(self.childGraph) > File "/home/SEJONG/petsc-3.18.1/config/BuildSystem/config/framework.py", line 1375, in serialEvaluation > child.configure() > File "/home/SEJONG/petsc-3.18.1/config/BuildSystem/config/setCompilers.py", line 2712, in configure > self.executeTest(self.checkCCompiler) > File "/home/SEJONG/petsc-3.18.1/config/BuildSystem/config/base.py", line 138, in executeTest > ret = test(*args,**kargs) > File "/home/SEJONG/petsc-3.18.1/config/BuildSystem/config/setCompilers.py", line 1346, in checkCCompiler > for compiler in self.generateCCompilerGuesses(): > File "/home/SEJONG/petsc-3.18.1/config/BuildSystem/config/setCompilers.py", line 1274, in generateCCompilerGuesses > raise RuntimeError('C compiler you provided with -with-cc='+self.argDB['with-cc']+' cannot be found or does not work.'+'\n'+self.mesg) > ================================================================================ > Finishing configure run at Tue, 01 Nov 2022 13:06:09 +0900 > > -----Original Message----- > From: Satish Balay > Sent: Tuesday, November 1, 2022 11:36 AM > To: Mohammad Ali Yaqteen > Cc: petsc-users > Subject: RE: [petsc-users] PETSc Windows Installation > > you'll have to send configure.log for this failure > > Satish > > > On Tue, 1 Nov 2022, Mohammad Ali Yaqteen wrote: > > > I have checked the required Cygwin openmpi libraries and they are all installed. 
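The failure above can be reproduced outside of configure, which makes the broken wrapper easier to see. A minimal diagnostic sketch, assuming a Cygwin shell with the Open MPI mpicc wrapper on PATH (the conftest.c file name here is illustrative; this mirrors what configure itself attempted):

```shell
# Print the compile/link line the wrapper expands to; note the trailing
# -lhwloc -levent_core -levent_pthreads -lz that the linker cannot find:
mpicc -show

# Link a trivial MPI program directly; with the broken dependencies this
# fails with the same "cannot find -l..." errors configure reported:
cat > conftest.c <<'EOF'
#include <mpi.h>
int main(int argc, char **argv) {
    MPI_Init(&argc, &argv);
    MPI_Finalize();
    return 0;
}
EOF
mpicc -o conftest.exe conftest.c && ./conftest.exe

# Map each missing import library to the Cygwin package that provides it,
# then (re)install those packages from the Cygwin setup program:
cygcheck -f /usr/lib/libhwloc.dll.a /usr/lib/libevent_core.dll.a \
            /usr/lib/libevent_pthreads.dll.a /usr/lib/libz.dll.a
```

Later in this thread the same `cygcheck -f` query resolves to libhwloc-devel, libevent-devel, and zlib-devel; once those packages are installed, the link step succeeds and configure can proceed past the C compiler test.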
When I run ./configure --with-cc=mpicc --with-cxx=mpicxx --with-fc=mpif90, it returns: > > > > $ ./configure --with-cc=mpicc --with-cxx=mpicxx --with-fc=mpif90 > > ============================================================================================= > > Configuring PETSc to compile on your system > > ====================================================================== > > ======================= > > TESTING: checkCCompiler from config.setCompilers(config/BuildSystem/config/setCompilers.py:1341)******************************************************************************* > > UNABLE to CONFIGURE with GIVEN OPTIONS (see configure.log for details): > > ---------------------------------------------------------------------- > > --------- C compiler you provided with -with-cc=mpicc cannot be found > > or does not work. > > Cannot compile/link C with mpicc. > > > > As for the case of WSL2, I will try to install that on my PC. > > Meanwhile, could you please look into this issue > > > > Thank you > > > > Ali > > > > -----Original Message----- > > From: Satish Balay > > Sent: Monday, October 31, 2022 10:56 PM > > To: Satish Balay via petsc-users > > Cc: Matthew Knepley ; Mohammad Ali Yaqteen > > > > Subject: Re: [petsc-users] PETSc Windows Installation > > > > BTW: If you have WSL2 on windows - it might be easier to build/use PETSc. 
> > > > Satish > > > > On Mon, 31 Oct 2022, Satish Balay via petsc-users wrote: > > > > > Make sure you have cygwin openmpi installed [and cywin blas/lapack] > > > > > > $ cygcheck -cd |grep openmpi > > > libopenmpi-devel 4.1.2-1 > > > libopenmpi40 4.1.2-1 > > > libopenmpifh40 4.1.2-1 > > > libopenmpiusef08_40 4.1.2-1 > > > libopenmpiusetkr40 4.1.2-1 > > > openmpi 4.1.2-1 > > > $ cygcheck -cd |grep lapack > > > liblapack-devel 3.10.1-1 > > > liblapack0 3.10.1-1 > > > > > > > > > > ./configure --with-cc=gcc --with-cxx=0 --with-fc=0 > > > > --download-f2cblaslapack > > > > > > Should be: > > > > > > > > $ ./configure --download-scalapack --download-mumps > > > > > > i.e [default] --with-cc=mpicc --with-cxx=mpicxx --with-fc=mpif90 [an > > > default cygwin blas/lapack] > > > > > > Satish > > > > > > > > > On Mon, 31 Oct 2022, Matthew Knepley wrote: > > > > > > > On Mon, Oct 31, 2022 at 1:56 AM Mohammad Ali Yaqteen > > > > > > > > wrote: > > > > > > > > > Dear Satish > > > > > > > > > > When I configure PETSc with (./configure --with-cc=gcc > > > > > --with-cxx=0 > > > > > --with-fc=0 --download-f2cblaslapack) it runs as I shared > > > > > initially which you said is not an issue anymore. 
But when I add > > > > > (--download-scalapack > > > > > --download-mumps) or configure with these later, it gives the > > > > > following > > > > > error: > > > > > > > > > > $ ./configure --download-scalapack --download-mumps > > > > > > > > > > ============================================================================================= > > > > > Configuring PETSc to compile on your > > > > > system > > > > > > > > > > ================================================================ > > > > > == > > > > > =========================== > > > > > TESTING: FortranMPICheck from > > > > > config.packages.MPI(config/BuildSystem/config/packages/MPI.py:614)******************************************************************************* > > > > > UNABLE to CONFIGURE with GIVEN OPTIONS (see configure.log for > > > > > details): > > > > > > > > > > ---------------------------------------------------------------- > > > > > -- > > > > > ------------- Fortran error! mpi_init() could not be located! > > > > > > > > > > **************************************************************** > > > > > ** > > > > > ************* > > > > > > > > > > What could be the problem here? > > > > > > > > > > > > > Without configure.log we cannot tell what went wrong. However, > > > > from the error message, I would guess that your MPI was not built > > > > with Fortran bindings. You need these for those packages. > > > > > > > > Thanks, > > > > > > > > Matt > > > > > > > > > > > > > Your help is highly appreciated. > > > > > > > > > > Thank you > > > > > Ali > > > > > > > > > > -----Original Message----- > > > > > From: Satish Balay > > > > > Sent: Saturday, October 29, 2022 2:11 PM > > > > > To: Mohammad Ali Yaqteen > > > > > Cc: Matthew Knepley ; petsc-users at mcs.anl.gov > > > > > Subject: Re: [petsc-users] PETSc Windows Installation > > > > > > > > > > On Sat, 29 Oct 2022, Mohammad Ali Yaqteen wrote: > > > > > > > > > > > I haven?t accessed PETSC or given any command of my own. 
I was > > > > > > just > > > > > installing by following the instructions. I don't know why it is > > > > > attaching the debugger. Although it says "Possible error running > > > > > C/C++ > > > > > src/snes/tutorials/ex19 with 1 MPI process" which I think is > > > > > indicating missing MPI! > > > > > > > > > > The diff is not smart enough to detect the extra message from > > > > > cygwin/OpenMPI - hence it assumes there is a potential problem - > > > > > and prints the above message. > > > > > > > > > > But you can assume it's installed properly - and use it. > > > > > > > > > > Satish > > > > > > > > > > > > From: Matthew Knepley > > > > > > Sent: Friday, October 28, 2022 10:31 PM > > > > > > To: Mohammad Ali Yaqteen > > > > > > Cc: petsc-users at mcs.anl.gov > > > > > > Subject: Re: [petsc-users] PETSc Windows Installation > > > > > > > > > > > > On Fri, Oct 28, 2022 at 9:11 AM Mohammad Ali Yaqteen < > > > > > mhyaqteen at sju.ac.kr> wrote: > > > > > > Dear Sir, > > > > > > > > > > > > During the installation of PETSc in Windows, I installed > > > > > > Cygwin and the > > > > > required libraries as mentioned on your website: > > > > > > [cid:image001.png at 01D8EB93.7C17E410] > > > > > > However, when I install PETSc using the configure commands > > > > > > present on > > > > > the petsc website: > > > > > > > > > > > > ./configure --with-cc=gcc --with-cxx=0 --with-fc=0 > > > > > --download-f2cblaslapack --download-mpich > > > > > > > > > > > > it gives me the following error: > > > > > > > > > > > > [cid:image002.png at 01D8EB93.7C17E410] > > > > > > > > > > > > I already installed OpenMPI using the Cygwin installer but it > > > > > > still asks me > > > > > to. When I configure without "--download-mpich" and run "make check" > > > > > command, it gives me the following errors: > > > > > > > > > > > > [cid:image003.png at 01D8EB93.7C17E410] > > > > > > > > > > > > Could you kindly look into this and help me with this?
Your > > > > > > prompt > > > > > response will highly be appreciated. > > > > > > > > > > > > The runs look fine. > > > > > > > > > > > > The test should not try to attach the debugger. Do you have > > > > > > that in the > > > > > PETSC_OPTIONS env variable? > > > > > > > > > > > > Thanks, > > > > > > > > > > > > Matt > > > > > > > > > > > > Thank you! > > > > > > Mohammad Ali > > > > > > Researcher, Sejong University > > > > > > > > > > > > > > > > > > -- > > > > > > What most experimenters take for granted before they begin > > > > > > their > > > > > experiments is infinitely more interesting than any results to > > > > > which their experiments lead. > > > > > > -- Norbert Wiener > > > > > > > > > > > > https://www.cse.buffalo.edu/~knepley/< > > > > > http://www.cse.buffalo.edu/~knepley/> > > > > > > > > > > > > > > > > > > > > > > > > > > > > > From balay at mcs.anl.gov Tue Nov 1 10:12:52 2022 From: balay at mcs.anl.gov (Satish Balay) Date: Tue, 1 Nov 2022 10:12:52 -0500 (CDT) Subject: [petsc-users] PETSc Windows Installation In-Reply-To: References: <2db12320-25ab-7911-4bb6-ff0195f5ffdc@mcs.anl.gov> <461d2b54-173d-95fa-6ad5-9ce81849871e@mcs.anl.gov> <8c7b16a0-f933-92fe-f54a-337bcd88455a@mcs.anl.gov> Message-ID: If you need to use PETSc from Visual Studio - you need to follow instructions at https://petsc.org/release/install/windows/#installation-with-microsoft-intel-windows-compilers [i.e install with MS compilers/MPI - not cygwin compilers/MPI] Also check "Project Files" section on how to setup compiler env for visual studio. Note: Most external packages won't work with MS compilers. Satish On Tue, 1 Nov 2022, Mohammad Ali Yaqteen wrote: > The above commands worked but I get an error message when I include petsc.h in Visual Studio. 
The error message is "Cannot open include file: 'petscconf.h': No such file or directory" > > Thanks, > Ali > -----Original Message----- > From: Satish Balay > Sent: Tuesday, November 1, 2022 2:40 PM > To: Mohammad Ali Yaqteen > Cc: petsc-users > Subject: Re: [petsc-users] PETSc Windows Installation > > > /usr/lib/gcc/x86_64-pc-cygwin/11/../../../../x86_64-pc-cygwin/bin/ld: cannot find -lhwloc: No such file or directory > > /usr/lib/gcc/x86_64-pc-cygwin/11/../../../../x86_64-pc-cygwin/bin/ld: cannot find -levent_core: No such file or directory > > /usr/lib/gcc/x86_64-pc-cygwin/11/../../../../x86_64-pc-cygwin/bin/ld: cannot find -levent_pthreads: No such file or directory > > /usr/lib/gcc/x86_64-pc-cygwin/11/../../../../x86_64-pc-cygwin/bin/ld: cannot find -lz: No such file or directory > > For some reason cygwin has broken dependencies here. Run cygwin setup and install the following pkgs. > > $ cygcheck.exe -f /usr/lib/libhwloc.dll.a /usr/lib/libevent_core.dll.a /usr/lib/libevent_pthreads.dll.a /usr/lib/libz.dll.a > libevent-devel-2.1.12-1 > libevent-devel-2.1.12-1 > libhwloc-devel-2.6.0-2 > zlib-devel-1.2.12-1 > > BTW: you can attach the file from PETSC_DIR/PETSC_ARCH/lib/petsc/conf/configure.log > > Satish > > On Tue, 1 Nov 2022, Mohammad Ali Yaqteen wrote: > > > I am unable to attach the configure.log file. Hence,
I have copied the following text after executing the command (less configure.log) in the cygwin64 > > > > Executing: uname -s > > stdout: CYGWIN_NT-10.0-19044 > > ============================================================================================= > > Configuring PETSc to compile on your system > > ============================================================================================= > > > > ================================================================================ > > ================================================================================ > > Starting configure run at Tue, 01 Nov 2022 13:06:06 +0900 > > Configure Options: --configModules=PETSc.Configure --optionsModule=config.compilerOptions --with-cc=mpicc --with-cxx=mpicxx --with-fc=mpif90 > > Working directory: /home/SEJONG/petsc-3.18.1 > > Machine platform: > > uname_result(system='CYGWIN_NT-10.0-19044', node='DESKTOP-R1C768B', release='3.3.6-341.x86_64', version='2022-09-05 11:15 UTC', machine='x86_64') > > Python version: > > 3.9.10 (main, Jan 20 2022, 21:37:52) > > [GCC 11.2.0] > > ================================================================================ > > Environmental variables > > USERDOMAIN=DESKTOP-R1C768B > > OS=Windows_NT > > COMMONPROGRAMFILES=C:\Program Files\Common Files > > PROCESSOR_LEVEL=6 > > PSModulePath=C:\Users\SEJONG\Documents\WindowsPowerShell\Modules;C:\Program Files\WindowsPowerShell\Modules;C:\Windows\system32\WindowsPowerShell\v1.0\Modules > > CommonProgramW6432=C:\Program Files\Common Files > > CommonProgramFiles(x86)=C:\Program Files (x86)\Common Files > > LANG=en_US.UTF-8 > > TZ=Asia/Seoul > > HOSTNAME=DESKTOP-R1C768B > > PUBLIC=C:\Users\Public > > OLDPWD=/home/SEJONG > > USERNAME=SEJONG > > LOGONSERVER=\\DESKTOP-R1C768B > > PROCESSOR_ARCHITECTURE=AMD64 > > LOCALAPPDATA=C:\Users\SEJONG\AppData\Local > > COMPUTERNAME=DESKTOP-R1C768B > > USER=SEJONG > > !::=::\ > > SYSTEMDRIVE=C: > > USERPROFILE=C:\Users\SEJONG > > 
PATHEXT=.COM;.EXE;.BAT;.CMD;.VBS;.VBE;.JS;.JSE;.WSF;.WSH;.MSC;.CPL > > SYSTEMROOT=C:\Windows > > USERDOMAIN_ROAMINGPROFILE=DESKTOP-R1C768B > > OneDriveCommercial=C:\Users\SEJONG\OneDrive - Sejong University > > PROCESSOR_IDENTIFIER=Intel64 Family 6 Model 165 Stepping 5, GenuineIntel > > GNUPLOT_LIB=C:\Program Files\gnuplot\demo;C:\Program Files\gnuplot\demo\games;C:\Program Files\gnuplot\share > > PWD=/home/SEJONG/petsc-3.18.1 > > MSMPI_BIN=C:\Program Files\Microsoft MPI\Bin\ > > HOME=/home/SEJONG > > TMP=/tmp > > OneDrive=C:\Users\SEJONG\OneDrive - Sejong University > > ZES_ENABLE_SYSMAN=1 > > !C:=C:\cygwin64\bin > > PROCESSOR_REVISION=a505 > > PROFILEREAD=true > > PROMPT=$P$G > > NUMBER_OF_PROCESSORS=16 > > ProgramW6432=C:\Program Files > > COMSPEC=C:\Windows\system32\cmd.exe > > APPDATA=C:\Users\SEJONG\AppData\Roaming > > SHELL=/bin/bash > > TERM=xterm-256color > > WINDIR=C:\Windows > > ProgramData=C:\ProgramData > > SHLVL=1 > > PRINTER=\\210.107.220.119\HP Color LaserJet Pro MFP M377 PCL 6 > > PROGRAMFILES=C:\Program Files > > ALLUSERSPROFILE=C:\ProgramData > > TEMP=/tmp > > DriverData=C:\Windows\System32\Drivers\DriverData > > SESSIONNAME=Console > > ProgramFiles(x86)=C:\Program Files (x86) > > PATH=/usr/local/bin:/usr/bin:/cygdrive/c/SIMULIA/Commands:/cygdrive/c/Program Files/Microsoft MPI/Bin:/cygdrive/c/Windows/system32:/cygdrive/c/Windows:/cygdrive/c/Windows/System32/Wbem:/cygdrive/c/Windows/System32/WindowsPowerShell/v1.0:/cygdrive/c/Windows/System32/OpenSSH:/cygdrive/c/Program Files/MATLAB/R2020b/bin:/cygdrive/c/Program Files/Microsoft SQL Server/130/Tools/Binn:/cygdrive/c/Program Files/Microsoft SQL Server/Client SDK/ODBC/170/Tools/Binn:/cygdrive/c/Program Files/Git/cmd:/cygdrive/c/msys64/mingw64/bin:/cygdrive/c/msys64/usr/bin:/cygdrive/c/Program Files (x86)/Microsoft Visual Studio/2019/Community/VC/Tools/MSVC/14.29.30133/bin/Hostx64/x64:/cygdrive/c/Program 
Files/dotnet:/:/cygdrive/c/Users/SEJONG/AppData/Local/Microsoft/WindowsApps:/cygdrive/c/Users/SEJONG/AppData/Local/Programs/Microsoft VS Code/bin:/cygdrive/c/Program Files (x86)/Microsoft Visual Studio/2019/Community/VC/Tools/MSVC/14.29.30133/bin/Hostx64/x64:/cygdrive/c/Users/SEJONG/.dotnet/tools:/usr/lib/lapack > > PS1=\[\e]0;\w\a\]\n\[\e[32m\]\u@\h \[\e[33m\]\w\[\e[0m\]\n\$ > > HOMEDRIVE=C: > > INFOPATH=/usr/local/info:/usr/share/info:/usr/info > > HOMEPATH=\Users\SEJONG > > ORIGINAL_PATH=/cygdrive/c/SIMULIA/Commands:/cygdrive/c/Program Files/Microsoft MPI/Bin:/cygdrive/c/Windows/system32:/cygdrive/c/Windows:/cygdrive/c/Windows/System32/Wbem:/cygdrive/c/Windows/System32/WindowsPowerShell/v1.0:/cygdrive/c/Windows/System32/OpenSSH:/cygdrive/c/Program Files/MATLAB/R2020b/bin:/cygdrive/c/Program Files/Microsoft SQL Server/130/Tools/Binn:/cygdrive/c/Program Files/Microsoft SQL Server/Client SDK/ODBC/170/Tools/Binn:/cygdrive/c/Program Files/Git/cmd:/cygdrive/c/msys64/mingw64/bin:/cygdrive/c/msys64/usr/bin:/cygdrive/c/Program Files (x86)/Microsoft Visual Studio/2019/Community/VC/Tools/MSVC/14.29.30133/bin/Hostx64/x64:/cygdrive/c/Program Files/dotnet:/:/cygdrive/c/Users/SEJONG/AppData/Local/Microsoft/WindowsApps:/cygdrive/c/Users/SEJONG/AppData/Local/Programs/Microsoft VS Code/bin:/cygdrive/c/Program Files (x86)/Microsoft Visual Studio/2019/Community/VC/Tools/MSVC/14.29.30133/bin/Hostx64/x64:/cygdrive/c/Users/SEJONG/.dotnet/tools > > EXECIGNORE=*.dll > > _=./configure > > Files in path provided by default path > > /usr/local/bin: > > /usr/bin: addftinfo.exe addr2line.exe apropos ar.exe arch.exe as.exe ash.exe awk b2sum.exe base32.exe base64.exe basename.exe basenc.exe bash.exe bashbug bomtool.exe bunzip2.exe bzcat.exe bzcmp bzdiff bzegrep bzfgrep bzgrep bzip2.exe bzip2recover.exe bzless bzmore c++.exe c++filt.exe c89 c99 ca-legacy cal.exe captoinfo cat.exe catman.exe cc ccmake.exe chattr.exe chcon.exe chgrp.exe chmod.exe chown.exe chroot.exe chrt.exe cksum.exe
clear.exe cmake.exe cmp.exe col.exe colcrt.exe colrm.exe column.exe comm.exe cp.exe cpack.exe cpp.exe csplit.exe ctest.exe cut.exe cygarchive-13.dll cygargp-0.dll cygatomic-1.dll cygattr-1.dll cygblkid-1.dll cygbrotlicommon-1.dll cygbrotlidec-1.dll cygbz2-1.dll cygcheck.exe cygcom_err-2.dll cygcrypt-2.dll cygcrypto-1.1.dll cygcurl-4.dll cygdb-5.3.dll cygdb_cxx-5.3.dll cygdb_sql-5.3.dll cygedit-0.dll cygevent-2-1-7.dll cygevent_core-2-1-7.dll cygevent_extra-2-1-7.dll cygevent_openssl-2-1-7.dll cygevent_pthreads-2-1-7.dll cygexpat-1.dll cygfdisk-1.dll cygffi-6.dll cygfido2-1.dll cygformw-10.dll cyggc-1.dll cyggcc_s-seh-1.dll cyggdbm-6.dll cyggdbm_compat-4.dll cyggfortran-5.dll cyggmp-10.dll cyggomp-1.dll cyggsasl-7.dll cyggssapi_krb5-2.dll cygguile-2.2-1.dll cyghistory7.dll cyghwloc-15.dll cygiconv-2.dll cygidn-12.dll cygidn2-0.dll cygintl-8.dll cygisl-23.dll cygjsoncpp-25.dll cygk5crypto-3.dll cygkrb5-3.dll cygkrb5support-0.dll cyglber-2-4-2.dll cyglber-2.dll cygldap-2-4-2.dll cygldap-2.dll cygldap_r-2-4-2.dll cygltdl-7.dll cyglz4-1.dll cyglzma-5.dll cyglzo2-2.dll cygmagic-1.dll cygman-2-11-0.dll cygmandb-2-11-0.dll cygmenuw-10.dll cygmpc-3.dll cygmpfr-6.dll cygmpi-40.dll cygmpi_mpifh-40.dll cygmpi_usempif08-40.dll cygmpi_usempi_ignore_tkr-40.dll cygncursesw-10.dll cygnghttp2-14.dll cygntlm-0.dll cygopen-pal-40.dll cygopen-rte-40.dll cygp11-kit-0.dll cygpanelw-10.dll cygpath.exe cygpcre2-8-0.dll cygperl5_32.dll cygpipeline-1.dll cygpkgconf-4.dll cygpopt-0.dll cygpsl-5.dll cygquadmath-0.dll cygreadline7.dll cygrhash-0.dll cygrunsrv.exe cygsasl2-3.dll cygserver-config cygsigsegv-2.dll cygsmartcols-1.dll cygsqlite3-0.dll cygssh2-1.dll cygssl-1.1.dll cygstart.exe cygstdc++-6.dll cygtasn1-6.dll cygticw-10.dll cygunistring-2.dll cyguuid-1.dll cyguv-1.dll cygwin-console-helper.exe cygwin1.dll cygxml2-2.dll cygxxhash-0.dll cygz.dll cygzstd-1.dll dash.exe date.exe dd.exe df.exe diff.exe diff3.exe dir.exe dircolors.exe dirname.exe dlltool.exe dllwrap.exe dnsdomainname
domainname du.exe dumper.exe echo.exe editrights.exe egrep elfedit.exe env.exe eqn.exe eqn2graph ex expand.exe expr.exe f95 factor.exe false.exe fgrep fido2-assert.exe fido2-cred.exe fido2-token.exe file.exe find.exe flock.exe fmt.exe fold.exe g++.exe gawk-5.1.1.exe gawk.exe gcc-ar.exe gcc-nm.exe gcc-ranlib.exe gcc.exe gcov-dump.exe gcov-tool.exe gcov.exe gdiffmk gencat.exe getconf.exe getent.exe getfacl.exe getopt.exe gfortran.exe git-receive-pack.exe git-shell.exe git-upload-archive.exe git-upload-pack.exe git.exe gkill.exe gmondump.exe gprof.exe grap2graph grep.exe grn.exe grodvi.exe groff.exe grolbp.exe grolj4.exe grops.exe grotty.exe groups.exe gunzip gzexe gzip.exe head.exe hexdump.exe hostid.exe hostname.exe hpftodit.exe i686-w64-mingw32-pkg-config id.exe indxbib.exe info.exe infocmp.exe infotocap install-info.exe install.exe ipcmk.exe ipcrm.exe ipcs.exe isosize.exe join.exe kill.exe lastlog.exe ld.bfd.exe ld.exe ldd.exe ldh.exe less.exe lessecho.exe lesskey.exe lexgrog.exe libpython3.9.dll link-cygin.exe lkbib.exe ln.exe locale.exe locate.exe logger.exe login.exe logname.exe look.exe lookbib.exe ls.exe lsattr.exe lto-dump.exe lzcat lzcmp lzdiff lzegrep lzfgrep lzgrep lzless lzma lzmadec.exe lzmainfo.exe lzmore make-dummy-cert make.exe man-recode.exe man.exe mandb.exe manpath.exe mcookie.exe md5sum.exe minidumper.exe mintheme mintty.exe mkdir.exe mkfifo.exe mkgroup.exe mknod.exe mkpasswd.exe mkshortcut.exe mktemp.exe more.exe mount.exe mpic++ mpicc mpicxx mpiexec mpif77 mpif90 mpifort mpirun mv.exe namei.exe neqn
nice.exe nl.exe nm.exe nohup.exe nproc.exe nroff numfmt.exe objcopy.exe objdump.exe od.exe ompi-clean ompi-server ompi_info.exe opal_wrapper.exe openssl.exe orte-clean.exe orte-info.exe orte-server.exe ortecc orted.exe orterun.exe p11-kit.exe passwd.exe paste.exe pathchk.exe pdfroff peflags.exe peflagsall perl.exe perl5.32.1.exe pfbtops.exe pg.exe pic.exe pic2graph pinky.exe pip3 pip3.9 pkg-config pkgconf.exe pldd.exe post-grohtml.exe pr.exe pre-grohtml.exe preconv.exe printenv.exe printf.exe profiler.exe ps.exe ptx.exe pwd.exe pydoc3 pydoc3.9 python python3 python3.9.exe pzstd.exe ranlib.exe readelf.exe readlink.exe readshortcut.exe realpath.exe rebase-trigger rebase.exe rebaseall rebaselst refer.exe regtool.exe rename.exe renew-dummy-cert renice.exe reset rev.exe rm.exe rmdir.exe rsync-ssl rsync.exe run.exe runcon.exe rvi rview scalar.exe scp.exe script.exe scriptreplay.exe sdiff.exe sed.exe seq.exe setfacl.exe setmetamode.exe setsid.exe sftp.exe sh.exe sha1sum.exe sha224sum.exe sha256sum.exe sha384sum.exe sha512sum.exe shred.exe shuf.exe size.exe sleep.exe slogin soelim.exe sort.exe split.exe ssh-add.exe ssh-agent.exe ssh-copy-id ssh-host-config ssh-keygen.exe ssh-keyscan.exe ssh-user-config ssh.exe ssp.exe stat.exe stdbuf.exe strace.exe strings.exe strip.exe stty.exe sum.exe sync.exe tabs.exe tac.exe tail.exe tar.exe taskset.exe tbl.exe tee.exe test.exe tfmtodit.exe tic.exe timeout.exe toe.exe touch.exe tput.exe tr.exe troff.exe true.exe truncate.exe trust.exe tset.exe tsort.exe tty.exe tzselect tzset.exe ul.exe umount.exe uname.exe unexpand.exe uniq.exe unlink.exe unlzma unxz unzstd update-ca-trust update-crypto-policies updatedb users.exe uuidgen.exe uuidparse.exe vdir.exe vi.exe view wc.exe whatis.exe whereis.exe which.exe who.exe whoami.exe windmc.exe windres.exe x86_64-pc-cygwin-c++.exe x86_64-pc-cygwin-g++.exe x86_64-pc-cygwin-gcc-11.exe x86_64-pc-cygwin-gcc-ar.exe x86_64-pc-cygwin-gcc-nm.exe x86_64-pc-cygwin-gcc-ranlib.exe x86_64-pc-cygwin-gcc.exe x86_64
-pc-cygwin-gfortran.exe x86_64-pc-cygwin-pkg-config x86_64-w64-mingw32-pkg-config xargs.exe xmlcatalog.exe xmllint.exe xz.exe xzcat xzcmp xzdec.exe xzdiff xzegrep xzfgrep xzgrep xzless xzmore yes.exe zcat zcmp zdiff zdump.exe zegrep zfgrep zforce zgrep zless zmore znew zstd.exe zstdcat zstdgrep zstdless zstdmt [.exe > > /cygdrive/c/SIMULIA/Commands: abaqus.bat abq2018.bat abq_cae_open.bat abq_odb_open.bat > > /cygdrive/c/Program Files/Microsoft MPI/Bin: mpiexec.exe mpitrace.man smpd.exe > > [quoted configure.log continues with a several-thousand-entry listing of files in the Windows system directories, truncated in the archive]
> > HardwareId.dll Windows.System.Profile.PlatformDiagnosticsAndUsageDataSettings.dll Windows.System.Profile.RetailInfo.dll Windows.System.Profile.SystemId.dll Windows.System.Profile.SystemManufacturers.dll Windows.System.RemoteDesktop.dll Windows.System.SystemManagement > > .dll Windows.System.UserDeviceAssociation.dll Windows.System.UserProfile.DiagnosticsSettings.dll Windows.UI.Accessibility.dll Windows.UI.AppDefaults.dll Windows.UI.BioFeedback.dll Windows.UI.BlockedShutdown.dll Windows.UI.Core.TextInput.dll Windows.UI.Cred.dll Window > > s.UI.CredDialogController.dll Windows.UI.dll Windows.UI.FileExplorer.dll Windows.UI.Immersive.dll Windows.UI.Input.Inking.Analysis.dll Windows.UI.Input.Inking.dll Windows.UI.Internal.Input.ExpressiveInput.dll Windows.UI.Internal.Input.ExpressiveInput.Resource.dll Win > > dows.UI.Logon.dll Windows.UI.NetworkUXController.dll Windows.UI.PicturePassword.dll Windows.UI.Search.dll Windows.UI.Shell.dll Windows.UI.Shell.Internal.AdaptiveCards.dll Windows.UI.Storage.dll Windows.UI.Xaml.Controls.dll Windows.UI.Xaml.dll Windows.UI.Xaml.InkContr > > ols.dll Windows.UI.Xaml.Maps.dll Windows.UI.Xaml.Phone.dll Windows.UI.Xaml.Resources.19h1.dll Windows.UI.Xaml.Resources.Common.dll Windows.UI.Xaml.Resources.rs1.dll Windows.UI.Xaml.Resources.rs2.dll Windows.UI.Xaml.Resources.rs3.dll Windows.UI.Xaml.Resources.rs4.dll > > Windows.UI.Xaml.Resources.rs5.dll Windows.UI.Xaml.Resources.th.dll Windows.UI.Xaml.Resources.win81.dll Windows.UI.Xaml.Resources.win8rtm.dll Windows.UI.XamlHost.dll Windows.WARP.JITService.dll Windows.WARP.JITService.exe Windows.Web.Diagnostics.dll Windows.Web.dll Wi > > ndows.Web.Http.dll WindowsActionDialog.exe WindowsCodecs.dll WindowsCodecsExt.dll WindowsCodecsRaw.dll WindowsCodecsRaw.txt WindowsDefaultHeatProcessor.dll windowsdefenderapplicationguardcsp.dll WindowsInternal.ComposableShell.ComposerFramework.dll WindowsInternal.Co > > mposableShell.DesktopHosting.dll WindowsInternal.Shell.CompUiActivation.dll 
WindowsIoTCsp.dll windowslivelogin.dll WindowsManagementServiceWinRt.ProxyStub.dll windowsperformancerecordercontrol.dll WindowsPowerShell WindowsSecurityIcon.png windowsudk.shellcommon.dll W > > indowsUpdateElevatedInstaller.exe winethc.dll winevt WinFax.dll winhttp.dll winhttpcom.dll WinHvEmulation.dll WinHvPlatform.dll wininet.dll wininetlui.dll wininit.exe wininitext.dll winipcfile.dll winipcsecproc.dll winipsec.dll winjson.dll Winlangdb.dll winload.efi w > > inload.exe winlogon.exe winlogonext.dll winmde.dll WinMetadata winml.dll winmm.dll winmmbase.dll winmsipc.dll WinMsoIrmProtector.dll winnlsres.dll winnsi.dll WinOpcIrmProtector.dll WinREAgent.dll winresume.efi winresume.exe winrm winrm.cmd winrm.vbs winrnr.dll winrs. > > exe winrscmd.dll winrshost.exe winrsmgr.dll winrssrv.dll WinRTNetMUAHostServer.exe WinRtTracing.dll WinSAT.exe WinSATAPI.dll WinSCard.dll WinSetupUI.dll winshfhc.dll winsku.dll winsockhc.dll winspool.drv winsqlite3.dll WINSRPC.DLL winsrv.dll winsrvext.dll winsta.dll > > WinSync.dll WinSyncMetastore.dll WinSyncProviders.dll wintrust.dll WinTypes.dll winusb.dll winver.exe WiredNetworkCSP.dll wisp.dll witnesswmiv2provider.dll wkscli.dll wkspbroker.exe wkspbrokerAx.dll wksprt.exe wksprtPS.dll wkssvc.dll wlanapi.dll wlancfg.dll WLanConn. 
> > dll wlandlg.dll wlanext.exe wlangpui.dll WLanHC.dll wlanhlp.dll WlanMediaManager.dll WlanMM.dll wlanmsm.dll wlanpref.dll WlanRadioManager.dll wlansec.dll wlansvc.dll wlansvcpal.dll wlanui.dll wlanutil.dll Wldap32.dll wldp.dll wlgpclnt.dll wlidcli.dll wlidcredprov.dll > > wlidfdp.dll wlidnsp.dll wlidprov.dll wlidres.dll wlidsvc.dll wlrmdr.exe WMADMOD.DLL WMADMOE.DLL WMALFXGFXDSP.dll WMASF.DLL wmcodecdspps.dll wmdmlog.dll wmdmps.dll wmdrmsdk.dll wmerror.dll wmi.dll wmiclnt.dll wmicmiplugin.dll wmidcom.dll wmidx.dll WmiMgmt.msc wmiprop > > .dll wmitomi.dll WMNetMgr.dll wmp.dll WMPDMC.exe WmpDui.dll wmpdxm.dll wmpeffects.dll WMPhoto.dll wmploc.DLL wmpps.dll wmpshell.dll wmsgapi.dll WMSPDMOD.DLL WMSPDMOE.DLL WMVCORE.DLL WMVDECOD.DLL wmvdspa.dll WMVENCOD.DLL WMVSDECD.DLL WMVSENCD.DLL WMVXENCD.DLL WofTasks > > .dll WofUtil.dll WordBreakers.dll WorkFolders.exe WorkfoldersControl.dll WorkFoldersGPExt.dll WorkFoldersRes.dll WorkFoldersShell.dll workfolderssvc.dll wosc.dll wow64.dll wow64cpu.dll wow64win.dll wowreg32.exe WpAXHolder.dll wpbcreds.dll Wpc.dll WpcApi.dll wpcatltoa > > st.png WpcDesktopMonSvc.dll WpcMon.exe wpcmon.png WpcProxyStubs.dll WpcRefreshTask.dll WpcTok.exe WpcWebFilter.dll wpdbusenum.dll WpdMtp.dll WpdMtpUS.dll wpdshext.dll WPDShextAutoplay.exe WPDShServiceObj.dll WPDSp.dll wpd_ci.dll wpnapps.dll wpnclient.dll wpncore.dll > > wpninprc.dll wpnpinst.exe wpnprv.dll wpnservice.dll wpnsruprov.dll WpnUserService.dll WpPortingLibrary.dll WppRecorderUM.dll wpr.config.xml wpr.exe WPTaskScheduler.dll wpx.dll write.exe ws2help.dll ws2_32.dll wscadminui.exe wscapi.dll wscinterop.dll wscisvif.dll WSCl > > ient.dll WSCollect.exe wscproxystub.dll wscript.exe wscsvc.dll wscui.cpl WSDApi.dll wsdchngr.dll WSDPrintProxy.DLL WsdProviderUtil.dll WSDScanProxy.dll wsecedit.dll wsepno.dll wshbth.dll wshcon.dll wshelper.dll wshext.dll wshhyperv.dll wship6.dll wshom.ocx wshqos.dll > > wshrm.dll WSHTCPIP.DLL wshunix.dll wsl.exe wslapi.dll WsmAgent.dll wsmanconfig_schema.xml 
WSManHTTPConfig.exe WSManMigrationPlugin.dll WsmAuto.dll wsmplpxy.dll wsmprovhost.exe WsmPty.xsl WsmRes.dll WsmSvc.dll WsmTxt.xsl WsmWmiPl.dll wsnmp32.dll wsock32.dll wsplib.dl > > l wsp_fs.dll wsp_health.dll wsp_sr.dll wsqmcons.exe WSReset.exe WSTPager.ax wtsapi32.dll wuapi.dll wuapihost.exe wuauclt.exe wuaueng.dll wuceffects.dll WUDFCoinstaller.dll WUDFCompanionHost.exe WUDFHost.exe WUDFPlatform.dll WudfSMCClassExt.dll WUDFx.dll WUDFx02000.dl > > l wudriver.dll wups.dll wups2.dll wusa.exe wuuhext.dll wuuhosdeployment.dll wvc.dll WwaApi.dll WwaExt.dll WWAHost.exe WWanAPI.dll wwancfg.dll wwanconn.dll WWanHC.dll wwanmm.dll Wwanpref.dll wwanprotdim.dll WwanRadioManager.dll wwansvc.dll wwapi.dll XamlTileRender.dll XAudio2_8.dll XAudio2_9.dll XblAuthManager.dll XblAuthManagerProxy.dll XblAuthTokenBrokerExt.dll XblGameSave.dll XblGameSaveExt.dll XblGameSaveProxy.dll XblGameSaveTask.exe XboxGipRadioManager.dll xboxgipsvc.dll xboxgipsynthetic.dll XboxNetApiSvc.dll xcopy.exe XInput1_4.dll XInput9_1_0.dll XInputUap.dll xmlfilter.dll xmllite.dll xmlprovi.dll xolehlp.dll XpsDocumentTargetPrint.dll XpsGdiConverter.dll XpsPrint.dll xpspushlayer.dll XpsRasterService.dll xpsservices.dll XpsToPclmConverter.dll XpsToPwgrConverter.dll xwizard.dtd xwizard.exe xwizards.dll xwreg.dll xwtpdui.dll xwtpw32.dll X_80.contrast-black.png X_80.contrast-white.png X_80.png ze_loader.dll ze_tracing_layer.dll ze_validation_layer.dll zh-CN zh-TW zipcontainer.dll zipfldr.dll ztrace_maps.dll > > /cygdrive/c/Windows: addins AhnInst.log appcompat Application Data apppatch AppReadiness assembly bcastdvr bfsvc.exe BitLockerDiscoveryVolumeContents Boot bootstat.dat Branding CbsTemp Containers CSC Cursors debug diagnostics DiagTrack DigitalLocker Downloaded > > Program Files DtcInstall.log ELAMBKUP en-US explorer.exe Fonts GameBarPresenceWriter gethelp_audiotroubleshooter_latestpackage.zip Globalization Help HelpPane.exe hh.exe hipiw.dll IdentityCRL ImageSAFERSvc.exe IME IMGSF50Svc.exe
ImmersiveControlPanel INF InputMethod > > Installer ko-KR L2Schemas LanguageOverlayCache LiveKernelReports Logs lsasetup.log Media mib.bin Microsoft.NET Migration ModemLogs notepad.exe OCR Offline Web Pages Panther Performance PFRO.log PLA PolicyDefinitions Prefetch PrintDialog Professional.xml Provisioning > > regedit.exe Registration RemotePackages rescache Resources RtlExUpd.dll SchCache schemas security ServiceProfiles ServiceState servicing Setup setupact.log setuperr.log ShellComponents ShellExperiences SHELLNEW SKB SoftwareDistribution Speech Speech_OneCore splwow64. > > exe System system.ini System32 SystemApps SystemResources SystemTemp SysWOW64 TAPI Tasks Temp TempInst tracing twain_32 twain_32.dll Vss WaaS Web win.ini WindowsShell.Manifest WindowsUpdate.log winhlp32.exe WinSxS WMSysPr9.prx write.exe > > /cygdrive/c/Windows/System32/Wbem: aeinv.mof AgentWmi.mof AgentWmiUninstall.mof appbackgroundtask.dll appbackgroundtask.mof appbackgroundtask_uninstall.mof AuditRsop.mof authfwcfg.mof AutoRecover bcd.mof BthMtpEnum.mof cimdmtf.mof cimwin32.dll cimwin32.mof CIWm > > i.mof classlog.mof cli.mof cliegaliases.mof ddp.mof dimsjob.mof dimsroam.mof DMWmiBridgeProv.dll DMWmiBridgeProv.mof DMWmiBridgeProv1.dll DMWmiBridgeProv1.mof DMWmiBridgeProv1_Uninstall.mof DMWmiBridgeProv_Uninstall.mof dnsclientcim.dll dnsclientcim.mof dnsclientpspr > > ovider.dll dnsclientpsprovider.mof dnsclientpsprovider_Uninstall.mof drvinst.mof DscCore.mof DscCoreConfProv.mof dscproxy.mof Dscpspluginwkr.dll DscTimer.mof dsprov.dll dsprov.mof eaimeapi.mof EmbeddedLockdownWmi.dll embeddedlockdownwmi.mof embeddedlockdownwmi_Uninst > > all.mof en en-US esscli.dll EventTracingManagement.dll EventTracingManagement.mof fastprox.dll fdPHost.mof fdrespub.mof fdSSDP.mof fdWNet.mof fdWSD.mof filetrace.mof firewallapi.mof FolderRedirectionWMIProvider.mof FunDisc.mof fwcfg.mof hbaapi.mof hnetcfg.mof IMAPIv2 > > -Base.mof IMAPIv2-FileSystemSupport.mof IMAPIv2-LegacyShim.mof interop.mof IpmiDTrc.mof 
ipmiprr.dll ipmiprv.dll ipmiprv.mof IpmiPTrc.mof ipsecsvc.mof iscsidsc.mof iscsihba.mof iscsiprf.mof iscsirem.mof iscsiwmiv2.mof iscsiwmiv2_uninstall.mof kerberos.mof ko ko-KR Krn > > lProv.dll krnlprov.mof L2SecHC.mof lltdio.mof lltdsvc.mof Logs lsasrv.mof mblctr.mof MDMAppProv.dll MDMAppProv.mof MDMAppProv_Uninstall.mof MDMSettingsProv.dll MDMSettingsProv.mof MDMSettingsProv_Uninstall.mof Microsoft-Windows-OfflineFiles.mof Microsoft-Windows-Remo > > te-FileSystem.mof Microsoft.AppV.AppVClientWmi.dll Microsoft.AppV.AppVClientWmi.mof Microsoft.Uev.AgentWmi.dll Microsoft.Uev.ManagedAgentWmi.mof Microsoft.Uev.ManagedAgentWmiUninstall.mof mispace.mof mispace_uninstall.mof mmc.mof MMFUtil.dll MOF mofcomp.exe mofd.dll > > mofinstall.dll mountmgr.mof mpeval.mof mpsdrv.mof mpssvc.mof msdtcwmi.dll MsDtcWmi.mof msfeeds.mof msfeedsbs.mof msi.mof msiprov.dll msiscsi.mof MsNetImPlatform.mof mstsc.mof mstscax.mof msv1_0.mof mswmdm.mof NCProv.dll ncprov.mof ncsi.mof ndisimplatcim.dll ndistrace > > .mof NetAdapterCim.dll NetAdapterCim.mof NetAdapterCimTrace.mof NetAdapterCimTraceUninstall.mof NetAdapterCim_uninstall.mof netdacim.dll netdacim.mof netdacim_uninstall.mof NetEventPacketCapture.dll NetEventPacketCapture.mof NetEventPacketCapture_uninstall.mof netncc > > im.dll netnccim.mof netnccim_uninstall.mof NetPeerDistCim.dll NetPeerDistCim.mof NetPeerDistCim_uninstall.mof netprofm.mof NetSwitchTeam.mof netswitchteamcim.dll NetTCPIP.dll NetTCPIP.mof NetTCPIP_Uninstall.mof netttcim.dll netttcim.mof netttcim_uninstall.mof network > > itemfactory.mof newdev.mof nlasvc.mof nlmcim.dll nlmcim.mof nlmcim_uninstall.mof nlsvc.mof npivwmi.mof nshipsec.mof ntevt.dll ntevt.mof ntfs.mof OfflineFilesConfigurationWmiProvider.mof OfflineFilesConfigurationWmiProvider_Uninstall.mof OfflineFilesWmiProvider.mof Of > > flineFilesWmiProvider_Uninstall.mof p2p-mesh.mof p2p-pnrp.mof pcsvDevice.mof pcsvDevice_Uninstall.mof Performance PNPXAssoc.mof PolicMan.dll PolicMan.mof polproc.mof polprocl.mof 
polprou.mof polstore.mof portabledeviceapi.mof portabledeviceclassextension.mof portable > > deviceconnectapi.mof portabledevicetypes.mof portabledevicewiacompat.mof powermeterprovider.mof PowerPolicyProvider.mof ppcRsopCompSchema.mof ppcRsopUserSchema.mof PrintFilterPipelineSvc.mof PrintManagementProvider.dll PrintManagementProvider.mof PrintManagementProvider_Uninstall.mof profileassociationprovider.mof PS_MMAgent.mof qmgr.mof qoswmi.dll qoswmi.mof qoswmitrc.mof qoswmitrc_uninstall.mof qoswmi_uninstall.mof RacWmiProv.dll RacWmiProv.mof rawxml.xsl rdpendp.mof rdpinit.mof rdpshell.mof refs.mof refsv1.mof regevent.mof Remove.Microsoft.AppV.AppvClientWmi.mof repdrvfs.dll Repository rsop.mof rspndr.mof samsrv.mof scersop.mof schannel.mof schedprov.dll SchedProv.mof scm.mof scrcons.exe scrcons.mof sdbus.mof secrcw32.mof SensorsClassExtension.mof ServDeps.dll ServiceModel.mof ServiceModel.mof.uninstall ServiceModel35.mof ServiceModel35.mof.uninstall services.mof setupapi.mof SmbWitnessWmiv2Provider.mof smbwmiv2.mof SMTPCons.dll smtpcons.mof sppwmi.mof sr.mof sstpsvc.mof stdprov.dll storagewmi.mof storagewmi_passthru.mof storagewmi_passthru_uninstall.mof storagewmi_uninstall.mof stortrace.mof subscrpt.mof system.mof tcpip.mof texttable.xsl textvaluelist.xsl tmf tsallow.mof tscfgwmi.mof tsmf.mof tspkg.mof umb.mof umbus.mof umpass.mof umpnpmgr.mof unsecapp.exe UserProfileConfigurationWmiProvider.mof UserProfileWmiProvider.mof UserStateWMIProvider.mof vds.mof vdswmi.dll viewprov.dll vpnclientpsprovider.dll vpnclientpsprovider.mof vpnclientpsprovider_Uninstall.mof vss.mof vsswmi.dll wbemcntl.dll wbemcons.dll WBEMCons.mof wbemcore.dll wbemdisp.dll wbemdisp.tlb wbemess.dll wbemprox.dll wbemsvc.dll wbemtest.exe wcncsvc.mof WdacEtwProv.mof WdacWmiProv.dll WdacWmiProv.mof WdacWmiProv_Uninstall.mof Wdf01000.mof Wdf01000Uninstall.mof wdigest.mof WFAPIGP.mof wfascim.dll wfascim.mof wfascim_uninstall.mof WFP.MOF wfs.mof whqlprov.mof Win32_DeviceGuard.mof Win32_EncryptableVolume.dll
win32_encryptablevolume.mof Win32_EncryptableVolumeUninstall.mof win32_printer.mof Win32_Tpm.dll Win32_Tpm.mof wininit.mof winipsec.mof winlogon.mof WinMgmt.exe WinMgmtR.dll Winsat.mof WinsatUninstall.mof wlan.mof WLanHC.mof wmi.mof WMIADAP.exe WmiApRes.dll WmiApRpl.dll WmiApSrv.exe WMIC.exe WMICOOKR.dll WmiDcPrv.dll wmipcima.dll wmipcima.mof wmipdfs.dll wmipdfs.mof wmipdskq.dll wmipdskq.mof WmiPerfClass.dll WmiPerfClass.mof WmiPerfInst.dll WmiPerfInst.mof WMIPICMP.dll wmipicmp.mof WMIPIPRT.dll wmipiprt.mof WMIPJOBJ.dll wmipjobj.mof wmiprov.dll WmiPrvSD.dll WmiPrvSE.exe WMIPSESS.dll wmipsess.mof WMIsvc.dll wmitimep.dll wmitimep.mof wmiutils.dll WMI_Tracing.mof wmp.mof wmpnetwk.mof wpdbusenum.mof wpdcomp.mof wpdfs.mof wpdmtp.mof wpdshext.mof WPDShServiceObj.mof wpdsp.mof wpd_ci.mof wscenter.mof WsmAgent.mof WsmAgentUninstall.mof WsmAuto.mof wsp_fs.mof wsp_fs_uninstall.mof wsp_health.mof wsp_health_uninstall.mof wsp_sr.mof wsp_sr_uninstall.mof WUDFx.mof Wudfx02000.mof Wudfx02000Uninstall.mof WUDFxUninstall.mof xml xsl-mappings.xml xwizards.mof > > /cygdrive/c/Windows/System32/WindowsPowerShell/v1.0: Certificate.format.ps1xml Diagnostics.Format.ps1xml DotNetTypes.format.ps1xml en en-US Event.Format.ps1xml Examples FileSystem.format.ps1xml getevent.types.ps1xml Help.format.ps1xml HelpV3.format.ps1xml ko ko-KR Modules powershell.exe powershell.exe.config PowerShellCore.format.ps1xml PowerShellTrace.format.ps1xml powershell_ise.exe powershell_ise.exe.config PSEvents.dll pspluginwkr.dll pwrshmsg.dll pwrshsip.dll Registry.format.ps1xml Schemas SessionConfig types.ps1xml typesv3.ps1xml WSMan.Format.ps1xml > > /cygdrive/c/Windows/System32/OpenSSH: scp.exe sftp.exe ssh-add.exe ssh-agent.exe ssh-keygen.exe ssh-keyscan.exe ssh.exe > > /cygdrive/c/Program Files/MATLAB/R2020b/bin: crash_analyzer.cfg icutzdata lcdata.xml lcdata.xsd lcdata_utf8.xml m3iregistry matlab.exe mex.bat mexext.bat util win32 win64 > > /cygdrive/c/Program Files/Microsoft SQL Server/130/Tools/Binn:
Resources SqlLocalDB.exe > > /cygdrive/c/Program Files/Microsoft SQL Server/Client SDK/ODBC/170/Tools/Binn: batchparser.dll bcp.exe Resources SQLCMD.EXE xmlrw.dll > > /cygdrive/c/Program Files/Git/cmd: git-gui.exe git-lfs.exe git.exe gitk.exe start-ssh-agent.cmd start-ssh-pageant.cmd > > Warning accessing /cygdrive/c/msys64/mingw64/bin gives errors: [Errno 2] No such file or directory: '/cygdrive/c/msys64/mingw64/bin' > > Warning accessing /cygdrive/c/msys64/usr/bin gives errors: [Errno 2] No such file or directory: '/cygdrive/c/msys64/usr/bin' > > /cygdrive/c/Program Files (x86)/Microsoft Visual Studio/2019/Community/VC/Tools/MSVC/14.29.30133/bin/Hostx64/x64: 1033 asan_blacklist.txt atlprov.dll bscmake.exe c1.dll c1xx.dll c2.dll cfgpersist.dll cl.exe cl.exe.config clang_rt.asan_dbg_dynamic-x86_64.dll clang_rt.asan_dynamic-x86_64.dll ConcurrencyCheck.dll CppBuildInsights.dll CppBuildInsightsEtw.xml CppCoreCheck.dll cvtres.exe d3dcompiler_47.dll dpcmi.dll dumpbin.exe editbin.exe EnumIndex.dll EspXEngine.dll HResultCheck.dll KernelTraceControl.dll lib.exe link.exe link.exe.config llvm-symbolizer.exe LocalESPC.dll Microsoft.Diagnostics.Tracing.EventSource.dll Microsoft.VisualStudio.RemoteControl.dll Microsoft.VisualStudio.Telemetry.dll Microsoft.VisualStudio.Utilities.Internal.dll ml64.exe msobj140.dll mspdb140.dll mspdbcmf.exe mspdbcore.dll mspdbsrv.exe mspdbst.dll mspft140.dll msvcdis140.dll msvcp140.dll msvcp140_1.dll msvcp140_2.dll msvcp140_atomic_wait.dll msvcp140_codecvt_ids.dll Newtonsoft.Json.dll nmake.exe onecore perf_msvcbuildinsights.dll pgocvt.exe pgodb140.dll pgodriver.sys pgomgr.exe pgort140.dll pgosweep.exe System.Runtime.CompilerServices.Unsafe.dll tbbmalloc.dll undname.exe VariantClear.dll vcmeta.dll vcperf.exe vcruntime140.dll vcruntime140_1.dll vctip.exe xdcmake.exe xdcmake.exe.config > > /cygdrive/c/Program Files/dotnet: dotnet.exe host LICENSE.txt packs sdk shared templates ThirdPartyNotices.txt > > /: bin Cygwin-Terminal.ico Cygwin.bat
Cygwin.ico dev etc home lib mpich-4.0.2 mpich-4.0.2.tar.gz sbin tmp usr var proc cygdrive > > /cygdrive/c/Users/SEJONG/AppData/Local/Microsoft/WindowsApps: Backup GameBarElevatedFT_Alias.exe Microsoft.DesktopAppInstaller_8wekyb3d8bbwe Microsoft.MicrosoftEdge_8wekyb3d8bbwe Microsoft.SkypeApp_kzf8qxf38zg5c Microsoft.XboxGamingOverlay_8wekyb3d8bbwe MicrosoftEdge.exe python.exe python3.exe Skype.exe winget.exe > > /cygdrive/c/Users/SEJONG/AppData/Local/Programs/Microsoft VS Code/bin: code code.cmd > > /cygdrive/c/Program Files (x86)/Microsoft Visual Studio/2019/Community/VC/Tools/MSVC/14.29.30133/bin/Hostx64/x64: 1033 asan_blacklist.txt atlprov.dll bscmake.exe c1.dll c1xx.dll c2.dll cfgpersist.dll cl.exe cl.exe.config clang_rt.asan_dbg_dynamic-x86_64.dll clang_rt.asan_dynamic-x86_64.dll ConcurrencyCheck.dll CppBuildInsights.dll CppBuildInsightsEtw.xml CppCoreCheck.dll cvtres.exe d3dcompiler_47.dll dpcmi.dll dumpbin.exe editbin.exe EnumIndex.dll EspXEngine.dll HResultCheck.dll KernelTraceControl.dll lib.exe link.exe link.exe.config llvm-symbolizer.exe LocalESPC.dll Microsoft.Diagnostics.Tracing.EventSource.dll Microsoft.VisualStudio.RemoteControl.dll Microsoft.VisualStudio.Telemetry.dll Microsoft.VisualStudio.Utilities.Internal.dll ml64.exe msobj140.dll mspdb140.dll mspdbcmf.exe mspdbcore.dll mspdbsrv.exe mspdbst.dll mspft140.dll msvcdis140.dll msvcp140.dll msvcp140_1.dll msvcp140_2.dll msvcp140_atomic_wait.dll msvcp140_codecvt_ids.dll Newtonsoft.Json.dll nmake.exe onecore perf_msvcbuildinsights.dll pgocvt.exe pgodb140.dll pgodriver.sys pgomgr.exe pgort140.dll pgosweep.exe System.Runtime.CompilerServices.Unsafe.dll tbbmalloc.dll undname.exe VariantClear.dll vcmeta.dll vcperf.exe vcruntime140.dll vcruntime140_1.dll vctip.exe xdcmake.exe xdcmake.exe.config > > Warning accessing /cygdrive/c/Users/SEJONG/.dotnet/tools gives errors: [Errno 2] No such file or directory: '/cygdrive/c/Users/SEJONG/.dotnet/tools' > > /usr/lib/lapack: cygblas-0.dll cyglapack-0.dll > >
============================================================================================= > > TESTING: configureExternalPackagesDir from config.framework(/home/SEJONG/petsc-3.18.1/config/BuildSystem/config/framework.py:1045) > > Set alternative directory external packages are built in > > serialEvaluation: initial cxxDialectRanges ('c++11', 'c++17') > > serialEvaluation: new cxxDialectRanges ('c++11', 'c++17') > > child config.utilities.macosFirewall took 0.000005 seconds > > ============================================================================================= > > TESTING: configureDebuggers from config.utilities.debuggers(/home/SEJONG/petsc-3.18.1/config/BuildSystem/config/utilities/debuggers.py:20) > > Find a default debugger and determine its arguments > > Checking for program /usr/local/bin/gdb...not found > > Checking for program /usr/bin/gdb...not found > > Checking for program /cygdrive/c/SIMULIA/Commands/gdb...not found > > Checking for program /cygdrive/c/Program Files/Microsoft MPI/Bin/gdb...not found > > Checking for program /cygdrive/c/Windows/system32/gdb...not found > > Checking for program /cygdrive/c/Windows/gdb...not found > > Checking for program /cygdrive/c/Windows/System32/Wbem/gdb...not found > > Checking for program /cygdrive/c/Windows/System32/WindowsPowerShell/v1.0/gdb...not found > > Checking for program /cygdrive/c/Windows/System32/OpenSSH/gdb...not found > > Checking for program /cygdrive/c/Program Files/MATLAB/R2020b/bin/gdb...not found > > Checking for program /cygdrive/c/Program Files/Microsoft SQL Server/130/Tools/Binn/gdb...not found > > Checking for program /cygdrive/c/Program Files/Microsoft SQL Server/Client SDK/ODBC/170/Tools/Binn/gdb...not found > > Checking for program /cygdrive/c/Program Files/Git/cmd/gdb...not found > > Checking for program /cygdrive/c/msys64/mingw64/bin/gdb...not found > > Checking for program /cygdrive/c/msys64/usr/bin/gdb...not found > > Checking for program /cygdrive/c/Program Files 
(x86)/Microsoft Visual Studio/2019/Community/VC/Tools/MSVC/14.29.30133/bin/Hostx64/x64/gdb...not found > > Checking for program /cygdrive/c/Program Files/dotnet/gdb...not found > > Checking for program /gdb...not found > > Checking for program /cygdrive/c/Users/SEJONG/AppData/Local/Microsoft/WindowsApps/gdb...not found > > Checking for program /cygdrive/c/Users/SEJONG/AppData/Local/Programs/Microsoft VS Code/bin/gdb...not found > > Checking for program /cygdrive/c/Program Files (x86)/Microsoft Visual Studio/2019/Community/VC/Tools/MSVC/14.29.30133/bin/Hostx64/x64/gdb...not found > > Checking for program /cygdrive/c/Users/SEJONG/.dotnet/tools/gdb...not found > > Checking for program /usr/lib/lapack/gdb...not found > > Checking for program /usr/local/bin/dbx...not found > > Checking for program /usr/bin/dbx...not found > > Checking for program /cygdrive/c/SIMULIA/Commands/dbx...not found > > Checking for program /cygdrive/c/Program Files/Microsoft MPI/Bin/dbx...not found > > Checking for program /cygdrive/c/Windows/system32/dbx...not found > > Checking for program /cygdrive/c/Windows/dbx...not found > > Checking for program /cygdrive/c/Windows/System32/Wbem/dbx...not found > > Checking for program /cygdrive/c/Windows/System32/WindowsPowerShell/v1.0/dbx...not found > > Checking for program /cygdrive/c/Windows/System32/OpenSSH/dbx...not found > > Checking for program /cygdrive/c/Program Files/MATLAB/R2020b/bin/dbx...not found > > Checking for program /cygdrive/c/Program Files/Microsoft SQL Server/130/Tools/Binn/dbx...not found > > Checking for program /cygdrive/c/Program Files/Microsoft SQL Server/Client SDK/ODBC/170/Tools/Binn/dbx...not found > > Checking for program /cygdrive/c/Program Files/Git/cmd/dbx...not found > > Checking for program /cygdrive/c/msys64/mingw64/bin/dbx...not found > > Checking for program /cygdrive/c/msys64/usr/bin/dbx...not found > > Checking for program /cygdrive/c/Program Files (x86)/Microsoft Visual 
Studio/2019/Community/VC/Tools/MSVC/14.29.30133/bin/Hostx64/x64/dbx...not found > > Checking for program /cygdrive/c/Program Files/dotnet/dbx...not found > > Checking for program /dbx...not found > > Checking for program /cygdrive/c/Users/SEJONG/AppData/Local/Microsoft/WindowsApps/dbx...not found > > Checking for program /cygdrive/c/Users/SEJONG/AppData/Local/Programs/Microsoft VS Code/bin/dbx...not found > > Checking for program /cygdrive/c/Program Files (x86)/Microsoft Visual Studio/2019/Community/VC/Tools/MSVC/14.29.30133/bin/Hostx64/x64/dbx...not found > > Checking for program /cygdrive/c/Users/SEJONG/.dotnet/tools/dbx...not found > > Checking for program /usr/lib/lapack/dbx...not found > > Defined make macro "DSYMUTIL" to "true" > > child config.utilities.debuggers took 0.014310 seconds > > ============================================================================================= > > TESTING: configureDirectories from PETSc.options.petscdir(/home/SEJONG/petsc-3.18.1/config/PETSc/options/petscdir.py:22) > > Checks PETSC_DIR and sets if not set > > PETSC_VERSION_RELEASE of 1 indicates the code is from a release branch or a branch created from a release branch. 
> > Version Information: > > #define PETSC_VERSION_RELEASE 1 > > #define PETSC_VERSION_MAJOR 3 > > #define PETSC_VERSION_MINOR 18 > > #define PETSC_VERSION_SUBMINOR 1 > > #define PETSC_VERSION_DATE "Oct 26, 2022" > > #define PETSC_VERSION_GIT "v3.18.1" > > #define PETSC_VERSION_DATE_GIT "2022-10-26 07:57:29 -0500" > > #define PETSC_VERSION_EQ(MAJOR,MINOR,SUBMINOR) \ > > #define PETSC_VERSION_ PETSC_VERSION_EQ > > #define PETSC_VERSION_LT(MAJOR,MINOR,SUBMINOR) \ > > #define PETSC_VERSION_LE(MAJOR,MINOR,SUBMINOR) \ > > #define PETSC_VERSION_GT(MAJOR,MINOR,SUBMINOR) \ > > #define PETSC_VERSION_GE(MAJOR,MINOR,SUBMINOR) \ > > child PETSc.options.petscdir took 0.015510 seconds > > ============================================================================================= > > TESTING: getDatafilespath from PETSc.options.dataFilesPath(/home/SEJONG/petsc-3.18.1/config/PETSc/options/dataFilesPath.py:29) > > Checks what DATAFILESPATH should be > > child PETSc.options.dataFilesPath took 0.002462 seconds > > ============================================================================================= > > TESTING: configureGit from config.sourceControl(/home/SEJONG/petsc-3.18.1/config/BuildSystem/config/sourceControl.py:24) > > Find the Git executable > > Checking for program /usr/local/bin/git...not found > > Checking for program /usr/bin/git...found > > Defined make macro "GIT" to "git" > > Executing: git --version > > stdout: git version 2.38.1 > > ============================================================================================= > > TESTING: configureMercurial from config.sourceControl(/home/SEJONG/petsc-3.18.1/config/BuildSystem/config/sourceControl.py:35) > > Find the Mercurial executable > > Checking for program /usr/local/bin/hg...not found > > Checking for program /usr/bin/hg...not found > > Checking for program /cygdrive/c/SIMULIA/Commands/hg...not found > > Checking for program /cygdrive/c/Program Files/Microsoft MPI/Bin/hg...not found > > Checking for 
program /cygdrive/c/Windows/system32/hg...not found > > Checking for program /cygdrive/c/Windows/hg...not found > > Checking for program /cygdrive/c/Windows/System32/Wbem/hg...not found > > Checking for program /cygdrive/c/Windows/System32/WindowsPowerShell/v1.0/hg...not found > > Checking for program /cygdrive/c/Windows/System32/OpenSSH/hg...not found > > Checking for program /cygdrive/c/Program Files/MATLAB/R2020b/bin/hg...not found > > Checking for program /cygdrive/c/Program Files/Microsoft SQL Server/130/Tools/Binn/hg...not found > > Checking for program /cygdrive/c/Program Files/Microsoft SQL Server/Client SDK/ODBC/170/Tools/Binn/hg...not found > > Checking for program /cygdrive/c/Program Files/Git/cmd/hg...not found > > Checking for program /cygdrive/c/msys64/mingw64/bin/hg...not found > > Checking for program /cygdrive/c/msys64/usr/bin/hg...not found > > Checking for program /cygdrive/c/Program Files (x86)/Microsoft Visual Studio/2019/Community/VC/Tools/MSVC/14.29.30133/bin/Hostx64/x64/hg...not found > > Checking for program /cygdrive/c/Program Files/dotnet/hg...not found > > Checking for program /hg...not found > > Checking for program /cygdrive/c/Users/SEJONG/AppData/Local/Microsoft/WindowsApps/hg...not found > > Checking for program /cygdrive/c/Users/SEJONG/AppData/Local/Programs/Microsoft VS Code/bin/hg...not found > > Checking for program /cygdrive/c/Program Files (x86)/Microsoft Visual Studio/2019/Community/VC/Tools/MSVC/14.29.30133/bin/Hostx64/x64/hg...not found > > Checking for program /cygdrive/c/Users/SEJONG/.dotnet/tools/hg...not found > > Checking for program /usr/lib/lapack/hg...not found > > Checking for program /home/SEJONG/petsc-3.18.1/lib/petsc/bin/win32fe/hg...not found > > child config.sourceControl took 0.121914 seconds > > ============================================================================================= > > TESTING: configureInstallationMethod from 
PETSc.options.petscclone(/home/SEJONG/petsc-3.18.1/config/PETSc/options/petscclone.py:20) > > Determine if PETSc was obtained via git or a tarball > > This is a tarball installation > > child PETSc.options.petscclone took 0.003125 seconds > > ============================================================================================= > > TESTING: setNativeArchitecture from PETSc.options.arch(/home/SEJONG/petsc-3.18.1/config/PETSc/options/arch.py:29) > > Forms the arch as GNU's configure would form it > > ============================================================================================= > > TESTING: configureArchitecture from PETSc.options.arch(/home/SEJONG/petsc-3.18.1/config/PETSc/options/arch.py:42) > > Checks if PETSC_ARCH is set and sets it if not set > > No previous hashfile found > > Setting hashfile: arch-mswin-c-debug/lib/petsc/conf/configure-hash > > Deleting configure hash file: arch-mswin-c-debug/lib/petsc/conf/configure-hash > > Unable to delete configure hash file: arch-mswin-c-debug/lib/petsc/conf/configure-hash > > child PETSc.options.arch took 0.149094 seconds > > ============================================================================================= > > TESTING: setInstallDir from PETSc.options.installDir(/home/SEJONG/petsc-3.18.1/config/PETSc/options/installDir.py:31) > > Set installDir to either prefix or if that is not set to PETSC_DIR/PETSC_ARCH > > Defined make macro "PREFIXDIR" to "/home/SEJONG/petsc-3.18.1/arch-mswin-c-debug" > > ============================================================================================= > > TESTING: saveReconfigure from PETSc.options.installDir(/home/SEJONG/petsc-3.18.1/config/PETSc/options/installDir.py:76) > > Save the configure options in a script in PETSC_ARCH/lib/petsc/conf so the same configure may be easily re-run > > ============================================================================================= > > TESTING: cleanConfDir from 
PETSc.options.installDir(/home/SEJONG/petsc-3.18.1/config/PETSc/options/installDir.py:68) > > Remove all the files from configuration directory for this PETSC_ARCH, from --with-clean option > > ============================================================================================= > > TESTING: configureInstallDir from PETSc.options.installDir(/home/SEJONG/petsc-3.18.1/config/PETSc/options/installDir.py:52) > > Makes installDir subdirectories if it does not exist for both prefix install location and PETSc work install location > > Changed persistence directory to /home/SEJONG/petsc-3.18.1/arch-mswin-c-debug/lib/petsc/conf > > > > TESTING: restoreReconfigure from PETSc.options.installDir(/home/SEJONG/petsc-3.18.1/config/PETSc/options/installDir.py:90) > > If --with-clean was requested but restoring the reconfigure file was requested then restore it > > child PETSc.options.installDir took 0.006476 seconds > > ============================================================================================= > > TESTING: setExternalPackagesDir from PETSc.options.externalpackagesdir(/home/SEJONG/petsc-3.18.1/config/PETSc/options/externalpackagesdir.py:15) > > Set location where external packages will be downloaded to > > ============================================================================================= > > TESTING: cleanExternalpackagesDir from PETSc.options.externalpackagesdir(/home/SEJONG/petsc-3.18.1/config/PETSc/options/externalpackagesdir.py:23) > > Remove all downloaded external packages, from --with-clean > > child PETSc.options.externalpackagesdir took 0.000990 seconds > > ============================================================================================= > > TESTING: configureCLanguage from PETSc.options.languages(/home/SEJONG/petsc-3.18.1/config/PETSc/options/languages.py:28) > > Choose whether to compile the PETSc library using a C or C++ compiler > > C language is C > > Defined "CLANGUAGE_C" to "1" > > Defined make macro "CLANGUAGE" to 
"C" > > child PETSc.options.languages took 0.003172 seconds > > ============================================================================================= > > TESTING: resetEnvCompilers from config.setCompilers(/home/SEJONG/petsc-3.18.1/config/BuildSystem/config/setCompilers.py:2652) > > Remove compilers from the shell environment so they do not interfer with testing > > ============================================================================================= > > TESTING: checkEnvCompilers from config.setCompilers(/home/SEJONG/petsc-3.18.1/config/BuildSystem/config/setCompilers.py:2669) > > Set configure compilers from the environment, from -with-environment-variables > > ============================================================================================= > > TESTING: checkMPICompilerOverride from config.setCompilers(/home/SEJONG/petsc-3.18.1/config/BuildSystem/config/setCompilers.py:2622) > > Check if --with-mpi-dir is used along with CC CXX or FC compiler options. > > This usually prevents mpi compilers from being used - so issue a warning > > ============================================================================================= > > TESTING: requireMpiLdPath from config.setCompilers(/home/SEJONG/petsc-3.18.1/config/BuildSystem/config/setCompilers.py:2643) > > OpenMPI wrappers require LD_LIBRARY_PATH set > > ============================================================================================= > > TESTING: checkInitialFlags from config.setCompilers(/home/SEJONG/petsc-3.18.1/config/BuildSystem/config/setCompilers.py:723) > > Initialize the compiler and linker flags > > Initialized CFLAGS to > > Initialized CFLAGS to > > Initialized LDFLAGS to > > Initialized CUDAFLAGS to > > Initialized CUDAFLAGS to > > Initialized LDFLAGS to > > Initialized HIPFLAGS to > > Initialized HIPFLAGS to > > Initialized LDFLAGS to > > Initialized SYCLFLAGS to > > Initialized SYCLFLAGS to > > Initialized LDFLAGS to > > Initialized CXXFLAGS to > > Initialized 
CXX_CXXFLAGS to > > Initialized LDFLAGS to > > Initialized FFLAGS to > > Initialized FFLAGS to > > Initialized LDFLAGS to > > Initialized CPPFLAGS to > > Initialized FPPFLAGS to > > Initialized CUDAPPFLAGS to -Wno-deprecated-gpu-targets > > Initialized CXXPPFLAGS to > > Initialized HIPPPFLAGS to > > Initialized SYCLPPFLAGS to > > Initialized CC_LINKER_FLAGS to [] > > Initialized CXX_LINKER_FLAGS to [] > > Initialized FC_LINKER_FLAGS to [] > > Initialized CUDAC_LINKER_FLAGS to [] > > Initialized HIPC_LINKER_FLAGS to [] > > Initialized SYCLC_LINKER_FLAGS to [] > > > > TESTING: checkCCompiler from config.setCompilers(/home/SEJONG/petsc-3.18.1/config/BuildSystem/config/setCompilers.py:1341) > > Locate a functional C compiler > > Checking for program /usr/local/bin/mpicc...not found > > Checking for program /usr/bin/mpicc...found > > Defined make macro "CC" to "mpicc" > > Executing: mpicc -c -o /tmp/petsc-uqt11yqc/config.setCompilers/conftest.o -I/tmp/petsc-uqt11yqc/config.setCompilers /tmp/petsc-uqt11yqc/config.setCompilers/conftest.c > > Successful compile: > > Source: > > #include "confdefs.h" > > #include "conffix.h" > > > > int main() { > > ; > > return 0; > > } > > > > Executing: mpicc -c -o /tmp/petsc-uqt11yqc/config.setCompilers/conftest.o -I/tmp/petsc-uqt11yqc/config.setCompilers /tmp/petsc-uqt11yqc/config.setCompilers/conftest.c > > Successful compile: > > Source: > > #include "confdefs.h" > > #include "conffix.h" > > > > int main() { > > ; > > return 0; > > } > > > > Executing: mpicc -o /tmp/petsc-uqt11yqc/config.setCompilers/conftest.exe /tmp/petsc-uqt11yqc/config.setCompilers/conftest.o > > Possible ERROR while running linker: exit code 1 > > stderr: > > /usr/lib/gcc/x86_64-pc-cygwin/11/../../../../x86_64-pc-cygwin/bin/ld: cannot find -lhwloc: No such file or directory > > /usr/lib/gcc/x86_64-pc-cygwin/11/../../../../x86_64-pc-cygwin/bin/ld: cannot find -levent_core: No such file or directory > > 
/usr/lib/gcc/x86_64-pc-cygwin/11/../../../../x86_64-pc-cygwin/bin/ld: cannot find -levent_pthreads: No such file or directory > > /usr/lib/gcc/x86_64-pc-cygwin/11/../../../../x86_64-pc-cygwin/bin/ld: cannot find -lz: No such file or directory > > collect2: error: ld returned 1 exit status > > Linker output before filtering: > > > > /usr/lib/gcc/x86_64-pc-cygwin/11/../../../../x86_64-pc-cygwin/bin/ld: cannot find -lhwloc: No such file or directory > > /usr/lib/gcc/x86_64-pc-cygwin/11/../../../../x86_64-pc-cygwin/bin/ld: cannot find -levent_core: No such file or directory > > /usr/lib/gcc/x86_64-pc-cygwin/11/../../../../x86_64-pc-cygwin/bin/ld: cannot find -levent_pthreads: No such file or directory > > /usr/lib/gcc/x86_64-pc-cygwin/11/../../../../x86_64-pc-cygwin/bin/ld: cannot find -lz: No such file or directory > > collect2: error: ld returned 1 exit status > > : > > Linker output after filtering: > > /usr/lib/gcc/x86_64-pc-cygwin/11/../../../../x86_64-pc-cygwin/bin/ld: cannot find -lhwloc: No such file or directory > > /usr/lib/gcc/x86_64-pc-cygwin/11/../../../../x86_64-pc-cygwin/bin/ld: cannot find -levent_core: No such file or directory > > /usr/lib/gcc/x86_64-pc-cygwin/11/../../../../x86_64-pc-cygwin/bin/ld: cannot find -levent_pthreads: No such file or directory > > /usr/lib/gcc/x86_64-pc-cygwin/11/../../../../x86_64-pc-cygwin/bin/ld: cannot find -lz: No such file or directory > > collect2: error: ld returned 1 exit status: > > Error testing C compiler: Cannot compile/link C with mpicc. > > MPI compiler wrapper mpicc failed to compile > > Executing: mpicc -show > > stdout: gcc -L/usr/lib -lmpi -lopen-rte -lopen-pal -lhwloc -levent_core -levent_pthreads -lz > > MPI compiler wrapper mpicc is likely incorrect. > > Use --with-mpi-dir to indicate an alternate MPI. 
> > Deleting "CC" > > ******************************************************************************* > > UNABLE to CONFIGURE with GIVEN OPTIONS (see configure.log for details): > > ------------------------------------------------------------------------------- > > C compiler you provided with -with-cc=mpicc cannot be found or does not work. > > Cannot compile/link C with mpicc. > > ******************************************************************************* > > File "/home/SEJONG/petsc-3.18.1/config/configure.py", line 461, in petsc_configure > > framework.configure(out = sys.stdout) > > File "/home/SEJONG/petsc-3.18.1/config/BuildSystem/config/framework.py", line 1412, in configure > > self.processChildren() > > File "/home/SEJONG/petsc-3.18.1/config/BuildSystem/config/framework.py", line 1400, in processChildren > > self.serialEvaluation(self.childGraph) > > File "/home/SEJONG/petsc-3.18.1/config/BuildSystem/config/framework.py", line 1375, in serialEvaluation > > child.configure() > > File "/home/SEJONG/petsc-3.18.1/config/BuildSystem/config/setCompilers.py", line 2712, in configure > > self.executeTest(self.checkCCompiler) > > File "/home/SEJONG/petsc-3.18.1/config/BuildSystem/config/base.py", line 138, in executeTest > > ret = test(*args,**kargs) > > File "/home/SEJONG/petsc-3.18.1/config/BuildSystem/config/setCompilers.py", line 1346, in checkCCompiler > > for compiler in self.generateCCompilerGuesses(): > > File "/home/SEJONG/petsc-3.18.1/config/BuildSystem/config/setCompilers.py", line 1274, in generateCCompilerGuesses > > raise RuntimeError('C compiler you provided with -with-cc='+self.argDB['with-cc']+' cannot be found or does not work.'+'\n'+self.mesg) > > ================================================================================ > > Finishing configure run at Tue, 01 Nov 2022 13:06:09 +0900 > > > > -----Original Message----- > > From: Satish Balay > > Sent: Tuesday, November 1, 2022 11:36 AM > > To: Mohammad Ali Yaqteen > > Cc: petsc-users > > 
Subject: RE: [petsc-users] PETSc Windows Installation > > > > you'll have to send configure.log for this failure > > > > Satish > > > > > > On Tue, 1 Nov 2022, Mohammad Ali Yaqteen wrote: > > > > > I have checked the required Cygwin openmpi libraries and they are all installed. When I run ./configure --with-cc=mpicc --with-cxx=mpicxx --with-fc=mpif90, it returns: > > > > > > $ ./configure --with-cc=mpicc --with-cxx=mpicxx --with-fc=mpif90 > > > ============================================================================================= > > > Configuring PETSc to compile on your system > > > ====================================================================== > > > ======================= > > > TESTING: checkCCompiler from config.setCompilers(config/BuildSystem/config/setCompilers.py:1341)******************************************************************************* > > > UNABLE to CONFIGURE with GIVEN OPTIONS (see configure.log for details): > > > ---------------------------------------------------------------------- > > > --------- C compiler you provided with -with-cc=mpicc cannot be found > > > or does not work. > > > Cannot compile/link C with mpicc. > > > > > > As for the case of WSL2, I will try to install that on my PC. > > > Meanwhile, could you please look into this issue > > > > > > Thank you > > > > > > Ali > > > > > > -----Original Message----- > > > From: Satish Balay > > > Sent: Monday, October 31, 2022 10:56 PM > > > To: Satish Balay via petsc-users > > > Cc: Matthew Knepley ; Mohammad Ali Yaqteen > > > > > > Subject: Re: [petsc-users] PETSc Windows Installation > > > > > > BTW: If you have WSL2 on windows - it might be easier to build/use PETSc. 
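The suggestion above (and Matt's earlier question about compiling anything with mpicc) amounts to a quick smoke test of the Cygwin MPI toolchain. A minimal sketch, assuming a Cygwin shell with the OpenMPI packages installed; hello.c is a made-up scratch file, not a file from this thread:

```sh
# Sketch of a Cygwin MPI toolchain smoke test (assumes a Cygwin shell).
cygcheck -cd | grep -E 'openmpi|lapack'    # the packages listed in the next message should appear

cat > hello.c <<'EOF'
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
  MPI_Init(&argc, &argv);
  printf("mpicc works\n");
  MPI_Finalize();
  return 0;
}
EOF

# Linker errors such as "cannot find -lhwloc" at this step reproduce the
# failure seen in configure.log and point at missing/broken OpenMPI libraries.
mpicc hello.c -o hello && ./hello
```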
> > > > > > Satish > > > > > > On Mon, 31 Oct 2022, Satish Balay via petsc-users wrote: > > > > > > > Make sure you have cygwin openmpi installed [and cywin blas/lapack] > > > > > > > > $ cygcheck -cd |grep openmpi > > > > libopenmpi-devel 4.1.2-1 > > > > libopenmpi40 4.1.2-1 > > > > libopenmpifh40 4.1.2-1 > > > > libopenmpiusef08_40 4.1.2-1 > > > > libopenmpiusetkr40 4.1.2-1 > > > > openmpi 4.1.2-1 > > > > $ cygcheck -cd |grep lapack > > > > liblapack-devel 3.10.1-1 > > > > liblapack0 3.10.1-1 > > > > > > > > > > > > > ./configure --with-cc=gcc --with-cxx=0 --with-fc=0 > > > > > --download-f2cblaslapack > > > > > > > > Should be: > > > > > > > > > > $ ./configure --download-scalapack --download-mumps > > > > > > > > i.e [default] --with-cc=mpicc --with-cxx=mpicxx --with-fc=mpif90 [an > > > > default cygwin blas/lapack] > > > > > > > > Satish > > > > > > > > > > > > On Mon, 31 Oct 2022, Matthew Knepley wrote: > > > > > > > > > On Mon, Oct 31, 2022 at 1:56 AM Mohammad Ali Yaqteen > > > > > > > > > > wrote: > > > > > > > > > > > Dear Satish > > > > > > > > > > > > When I configure PETSc with (./configure --with-cc=gcc > > > > > > --with-cxx=0 > > > > > > --with-fc=0 --download-f2cblaslapack) it runs as I shared > > > > > > initially which you said is not an issue anymore. 
But when I add > > > > > > (--download-scalapack > > > > > > --download-mumps) or configure with these later, it gives the > > > > > > following > > > > > > error: > > > > > > > > > > > > $ ./configure --download-scalapack --download-mumps > > > > > > > > > > > > ============================================================================================= > > > > > > Configuring PETSc to compile on your > > > > > > system > > > > > > > > > > > > ================================================================ > > > > > > == > > > > > > =========================== > > > > > > TESTING: FortranMPICheck from > > > > > > config.packages.MPI(config/BuildSystem/config/packages/MPI.py:614)******************************************************************************* > > > > > > UNABLE to CONFIGURE with GIVEN OPTIONS (see configure.log for > > > > > > details): > > > > > > > > > > > > ---------------------------------------------------------------- > > > > > > -- > > > > > > ------------- Fortran error! mpi_init() could not be located! > > > > > > > > > > > > **************************************************************** > > > > > > ** > > > > > > ************* > > > > > > > > > > > > What could be the problem here? > > > > > > > > > > > > > > > > Without configure.log we cannot tell what went wrong. However, > > > > > from the error message, I would guess that your MPI was not built > > > > > with Fortran bindings. You need these for those packages. > > > > > > > > > > Thanks, > > > > > > > > > > Matt > > > > > > > > > > > > > > > > Your help is highly appreciated. 
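Matt's guess above, an MPI built without Fortran bindings, can be tested directly outside of PETSc's configure. A sketch, assuming a Cygwin shell; hello.f90 is a hypothetical scratch file:

```sh
# Sketch: verify the OpenMPI Fortran bindings independently of PETSc.
cygcheck -cd | grep openmpi    # libopenmpifh40 / libopenmpiusef08_40 provide the Fortran support

cat > hello.f90 <<'EOF'
program hello
  use mpi
  implicit none
  integer :: ierr
  call MPI_Init(ierr)
  call MPI_Finalize(ierr)
end program hello
EOF

# If this fails to compile or link, configure's "mpi_init() could not be
# located!" error has been reproduced without PETSc in the picture.
mpif90 hello.f90 -o hello_f && ./hello_f
```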
> > > > > > > > > > > > Thank you > > > > > > Ali > > > > > > > > > > > > -----Original Message----- > > > > > > From: Satish Balay > > > > > > Sent: Saturday, October 29, 2022 2:11 PM > > > > > > To: Mohammad Ali Yaqteen > > > > > > Cc: Matthew Knepley ; petsc-users at mcs.anl.gov > > > > > > Subject: Re: [petsc-users] PETSc Windows Installation > > > > > > > > > > > > On Sat, 29 Oct 2022, Mohammad Ali Yaqteen wrote: > > > > > > > > > > > > > I haven't accessed PETSc or given any command of my own. I was > > > > > > > just > > > > > > installing by following the instructions. I don't know why it is > > > > > > attaching the debugger. Although it says "Possible error running > > > > > > C/C++ > > > > > > src/snes/tutorials/ex19 with 1 MPI process" which I think is > > > > > > indicating missing MPI! > > > > > > > > > > > > The diff is not smart enough to detect the extra message from > > > > > > cygwin/OpenMPI - hence it assumes there is a potential problem - > > > > > > and prints the above message. > > > > > > > > > > > > But you can assume it's installed properly - and use it. 
> > > > > > > > > > > > Satish > > > > > > > > > > > > > > From: Matthew Knepley > > > > > > > Sent: Friday, October 28, 2022 10:31 PM > > > > > > > To: Mohammad Ali Yaqteen > > > > > > > Cc: petsc-users at mcs.anl.gov > > > > > > > Subject: Re: [petsc-users] PETSc Windows Installation > > > > > > > > > > > > > > On Fri, Oct 28, 2022 at 9:11 AM Mohammad Ali Yaqteen < > > > > > > mhyaqteen at sju.ac.kr> wrote: > > > > > > > Dear Sir, > > > > > > > > > > > > > > During the installation of PETSc in Windows, I installed > > > > > > > Cygwin and the > > > > > > required libraries as mentioned on your website: > > > > > > > [cid:image001.png at 01D8EB93.7C17E410] > > > > > > > However, when I install PETSc using the configure commands > > > > > > > present on > > > > > > the petsc website: > > > > > > > > > > > > > > ./configure --with-cc=gcc --with-cxx=0 --with-fc=0 > > > > > > --download-f2cblaslapack --download-mpich > > > > > > > > > > > > > > it gives me the following error: > > > > > > > > > > > > > > [cid:image002.png at 01D8EB93.7C17E410] > > > > > > > > > > > > > > I already installed OpenMPI using the Cygwin installer but it > > > > > > > still asks me > > > > > > to. When I configure without "--download-mpich" and run "make check" > > > > > > command, it gives me the following errors: > > > > > > > > > > > > > > [cid:image003.png at 01D8EB93.7C17E410] > > > > > > > > > > > > > > Could you kindly look into this and help me with this? Your > > > > > > > prompt > > > > > > response will be highly appreciated. > > > > > > > > > > > > > > The runs look fine. > > > > > > > > > > > > > > The test should not try to attach the debugger. Do you have > > > > > > > that in the > > > > > > PETSC_OPTIONS env variable? > > > > > > > > > > > > > > Thanks, > > > > > > > > > > > > > > Matt > > > > > > > > > > > > > > Thank you! 
> > > > > > > Mohammad Ali > > > > > > > Researcher, Sejong University > > > > > > > > > > > > > > > > > > > > > -- > > > > > > > What most experimenters take for granted before they begin > > > > > > > their > > > > > > experiments is infinitely more interesting than any results to > > > > > > which their experiments lead. > > > > > > > -- Norbert Wiener > > > > > > > > > > > > > > https://www.cse.buffalo.edu/~knepley/< > > > > > > http://www.cse.buffalo.edu/~knepley/> > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > From carl-johan.thore at liu.se Tue Nov 1 10:42:08 2022 From: carl-johan.thore at liu.se (Carl-Johan Thore) Date: Tue, 1 Nov 2022 15:42:08 +0000 Subject: [petsc-users] KSP on GPU In-Reply-To: References: Message-ID: Thanks for the tips! The suggested settings for GAMG did not yield better results, but hypre worked well right away, giving very good convergence! A follow-up question then (I hope that's ok; and it could be related to GAMG not working, I'll check that). Once everything was running I discovered that my gradient vector dfdx which I populate via an array df obtained from VecGetArray(dfdx, &df) doesn't get filled properly; it always contains only zeros. This is not the case when I run on the CPU, and df gets filled as it should even on the GPU, suggesting that either I'm not using VecGetArray properly, or I shouldn't use it at all for GPU computations? Kind regards, Carl-Johan From: Mark Adams Sent: den 31 oktober 2022 13:30 To: Carl-Johan Thore Cc: Matthew Knepley ; Barry Smith ; petsc-users at mcs.anl.gov Subject: Re: [petsc-users] KSP on GPU * You could try hypre or another preconditioner that you can afford, like LU or ASM, that works. * If this matrix is SPD, you want to use -fieldsplit_0_pc_gamg_esteig_ksp_type cg -fieldsplit_0_pc_gamg_esteig_ksp_max_it 10 These will give better eigen estimates, and that is important. The differences between these estimates is not too bad. 
There is a safety factor (1.05 is the default) that you could increase with: -fieldsplit_0_mg_levels_ksp_chebyshev_esteig 0,0.05,0,1.1 * Finally you could try -fieldsplit_0_pc_gamg_reuse_interpolation 1, if GAMG is still not working. Use -fieldsplit_0_ksp_converged_reason and check the iteration count. And it is a good idea to check with hypre to make sure something is not going badly in terms of performance anyway. AMG is hard and hypre is a good solver. Mark On Mon, Oct 31, 2022 at 1:56 AM Carl-Johan Thore via petsc-users > wrote: The GPU supports double precision and I didn't explicitly tell PETSc to use float when compiling, so I guess it uses double? What's the easiest way to check? Barry, running -ksp_view shows that the solver options are the same for CPU and GPU. The only difference is the coarse grid solver for gamg ("the package used to perform factorization:") which is petsc for CPU and cusparse for GPU. I tried forcing the GPU to use petsc via -fieldsplit_0_mg_coarse_sub_pc_factor_mat_solver_type, but then ksp failed to converge even on the first topology optimization iteration. -ksp_view also shows differences in the eigenvalues from the Chebyshev smoother. For example, GPU: Down solver (pre-smoother) on level 2 ------------------------------- KSP Object: (fieldsplit_0_mg_levels_2_) 1 MPI process type: chebyshev eigenvalue targets used: min 0.109245, max 1.2017 eigenvalues provided (min 0.889134, max 1.09245) with CPU: eigenvalue targets used: min 0.112623, max 1.23886 eigenvalues provided (min 0.879582, max 1.12623) But I guess such differences are expected? /Carl-Johan From: Matthew Knepley > Sent: den 30 oktober 2022 22:00 To: Barry Smith > Cc: Carl-Johan Thore >; petsc-users at mcs.anl.gov Subject: Re: [petsc-users] KSP on GPU On Sun, Oct 30, 2022 at 3:52 PM Barry Smith > wrote: In general you should expect similar but not identical convergence behavior. I suggest running with all the monitoring you can. 
-ksp_monitor_true_residual -fieldsplit_0_monitor_true_residual -fieldsplit_1_monitor_true_residual and compare the various convergence between the CPU and GPU. Also run with -ksp_view and check that the various solver options are the same (they should be). Is the GPU using float or double? Matt Barry On Oct 30, 2022, at 11:02 AM, Carl-Johan Thore via petsc-users > wrote: Hi, I'm solving a topology optimization problem with Stokes flow discretized by a stabilized Q1-Q0 finite element method and using BiCGStab with the fieldsplit preconditioner to solve the linear systems. The implementation is based on DMStag, runs on Ubuntu via WSL2, and works fine with PETSc-3.18.1 on multiple CPU cores and the following options for the preconditioner: -fieldsplit_0_ksp_type preonly \ -fieldsplit_0_pc_type gamg \ -fieldsplit_0_pc_gamg_reuse_interpolation 0 \ -fieldsplit_1_ksp_type preonly \ -fieldsplit_1_pc_type jacobi However, when I enable GPU computations by adding two options - ... -dm_vec_type cuda \ -dm_mat_type aijcusparse \ -fieldsplit_0_ksp_type preonly \ -fieldsplit_0_pc_type gamg \ -fieldsplit_0_pc_gamg_reuse_interpolation 0 \ -fieldsplit_1_ksp_type preonly \ -fieldsplit_1_pc_type jacobi - KSP still works fine the first couple of topology optimization iterations but then stops with "Linear solve did not converge due to DIVERGED_DTOL ..". My question is whether I should expect the GPU versions of the linear solvers and pre-conditioners to function exactly as their CPU counterparts (I got this impression from the documentation), in which case I've probably made some mistake in my own code, or whether there are other/additional settings or modifications I should use to run on the GPU (an NVIDIA Quadro T2000)? Kind regards, Carl-Johan -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. 
-- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From d.j.nolte at rug.nl Tue Nov 1 10:52:55 2022 From: d.j.nolte at rug.nl (D.J. Nolte) Date: Tue, 1 Nov 2022 16:52:55 +0100 Subject: [petsc-users] AMD vs Intel mobile CPU performance Message-ID: Hi all, I'm looking for a small laptop which I'll be using (also) for small scale PETSc (KSP & SNES) simulations. For this setting performance is not that important, but still, I wonder if the community has any experience with AMD Ryzen CPUs (specifically the 5 Pro 6650U) compared to Intel i7 12th gen. Do I have to expect significant performance differences? Thanks! David -------------- next part -------------- An HTML attachment was scrubbed... URL: From stefano.zampini at gmail.com Tue Nov 1 11:11:23 2022 From: stefano.zampini at gmail.com (Stefano Zampini) Date: Tue, 1 Nov 2022 19:11:23 +0300 Subject: [petsc-users] KSP on GPU In-Reply-To: References: Message-ID: Are you calling VecRestoreArray when you are done inserting the values? On Tue, Nov 1, 2022, 18:42 Carl-Johan Thore via petsc-users < petsc-users at mcs.anl.gov> wrote: > Thanks for the tips! > > > > The suggested settings for GAMG did not yield better results, > > but hypre worked well right away, giving very good convergence! > > > > A follow-up question then (I hope that's ok; and it could be related to > GAMG > > not working, I'll check that). Once everything was running I discovered > that my gradient vector > > dfdx which I populate via an array df obtained from VecGetArray(dfdx, &df) > doesn't get > > filled properly; it always contains only zeros. This is not the case when > I run on the CPU, > > and df gets filled as it should even on the GPU, suggesting that either > I'm not using > > VecGetArray properly, or I shouldn't use it at all for GPU computations? 
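The get/restore discipline being asked about can be sketched as follows. This is an illustrative fragment only: the names dfdx and df follow the message above, FillGradient is a made-up helper, and a configured PETSc (here 3.18-era, with CUDA support) is assumed. With GPU vector types, VecGetArray hands back a host-side pointer, and it is VecRestoreArray that marks the host copy as current so PETSc knows to synchronize it back to the device; exporting or viewing the Vec before the restore can therefore read stale device data.

```c
/* Sketch only; FillGradient is hypothetical and assumes PETSc headers/libs. */
#include <petscvec.h>

static PetscErrorCode FillGradient(Vec dfdx)
{
  PetscScalar *df;
  PetscInt     i, n;

  PetscFunctionBeginUser;
  PetscCall(VecGetLocalSize(dfdx, &n));
  PetscCall(VecGetArray(dfdx, &df));     /* host pointer, even for -dm_vec_type cuda */
  for (i = 0; i < n; i++) df[i] = 0.0;   /* ... fill gradient entries on the host ... */
  PetscCall(VecRestoreArray(dfdx, &df)); /* must come BEFORE any VecView/export: it
                                            flags the host data as current so it is
                                            synchronized back to the GPU */
  PetscFunctionReturn(0);
}
```

Any export of dfdx placed between the get and the restore belongs after the restore instead, which matches the resolution reported later in the thread.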
> > > > Kind regards, > > Carl-Johan > > > > *From:* Mark Adams > *Sent:* den 31 oktober 2022 13:30 > *To:* Carl-Johan Thore > *Cc:* Matthew Knepley ; Barry Smith ; > petsc-users at mcs.anl.gov > *Subject:* Re: [petsc-users] KSP on GPU > > > > * You could try hypre or another preconditioner that you can afford, > like LU or ASM, that works. > > * If this matrix is SPD, you want to use > -fieldsplit_0_pc_gamg_esteig_ksp_type cg > -fieldsplit_0_pc_gamg_esteig_ksp_max_it 10 > > These will give better eigen estimates, and that is important. > > The differences between these estimates is not too bad. > > There is a safety factor (1.05 is the default) that you could increase > with: -fieldsplit_0_mg_levels_ksp_chebyshev_esteig 0,0.05,0,1.1 > > * Finally you could try -fieldsplit_0_pc_gamg_reuse_interpolation 1, if > GAMG is still not working. > > > > Use -fieldsplit_0_ksp_converged_reason and check the iteration count. > > And it is a good idea to check with hypre to make sure something is not > going badly in terms of performance anyway. AMG is hard and hypre is a good > solver. > > > > Mark > > > > On Mon, Oct 31, 2022 at 1:56 AM Carl-Johan Thore via petsc-users < > petsc-users at mcs.anl.gov> wrote: > > The GPU supports double precision and I didn't explicitly tell PETSc to > use float when compiling, so > > I guess it uses double? What's the easiest way to check? > > > > Barry, running -ksp_view shows that the solver options are the same for > CPU and GPU. The only > > difference is the coarse grid solver for gamg ("the package used to > perform factorization:") which > > is petsc for CPU and cusparse for GPU. I tried forcing the GPU to use > petsc via > > -fieldsplit_0_mg_coarse_sub_pc_factor_mat_solver_type, but then ksp failed > to converge > > even on the first topology optimization iteration. > > > > -ksp_view also shows differences in the eigenvalues from the Chebyshev > smoother. 
For example, > > > > GPU: > > Down solver (pre-smoother) on level 2 ------------------------------- > > KSP Object: (fieldsplit_0_mg_levels_2_) 1 MPI process > > type: chebyshev > > eigenvalue targets used: min 0.109245, max 1.2017 > > eigenvalues provided (min 0.889134, max 1.09245) with > > > > CPU: > > eigenvalue targets used: min 0.112623, max 1.23886 > > eigenvalues provided (min 0.879582, max 1.12623) > > > > But I guess such differences are expected? > > > > /Carl-Johan > > > > *From:* Matthew Knepley > *Sent:* den 30 oktober 2022 22:00 > *To:* Barry Smith > *Cc:* Carl-Johan Thore ; petsc-users at mcs.anl.gov > *Subject:* Re: [petsc-users] KSP on GPU > > > > On Sun, Oct 30, 2022 at 3:52 PM Barry Smith wrote: > > > > In general you should expect similar but not identical convergence > behavior. > > > > I suggest running with all the monitoring you can. > -ksp_monitor_true_residual > -fieldsplit_0_monitor_true_residual -fieldsplit_1_monitor_true_residual and > compare the various convergence between the CPU and GPU. Also run with > -ksp_view and check that the various solver options are the same (they > should be). > > > > Is the GPU using float or double? > > > > Matt > > > > Barry > > > > > > On Oct 30, 2022, at 11:02 AM, Carl-Johan Thore via petsc-users < > petsc-users at mcs.anl.gov> wrote: > > > > Hi, > > > > I'm solving a topology optimization problem with Stokes flow discretized > by a stabilized Q1-Q0 finite element method > > and using BiCGStab with the fieldsplit preconditioner to solve the linear > systems. 
The implementation > > is based on DMStag, runs on Ubuntu via WSL2, and works fine with > PETSc-3.18.1 on multiple CPU cores and the following > > options for the preconditioner: > > > > -fieldsplit_0_ksp_type preonly \ > > -fieldsplit_0_pc_type gamg \ > > -fieldsplit_0_pc_gamg_reuse_interpolation 0 \ > > -fieldsplit_1_ksp_type preonly \ > > -fieldsplit_1_pc_type jacobi > > > > However, when I enable GPU computations by adding two options - > > > > ... > > -dm_vec_type cuda \ > > -dm_mat_type aijcusparse \ > > -fieldsplit_0_ksp_type preonly \ > > -fieldsplit_0_pc_type gamg \ > > -fieldsplit_0_pc_gamg_reuse_interpolation 0 \ > > -fieldsplit_1_ksp_type preonly \ > > -fieldsplit_1_pc_type jacobi > > > > - KSP still works fine the first couple of topology optimization > iterations but then > > stops with "Linear solve did not converge due to DIVERGED_DTOL ..". > > > > My question is whether I should expect the GPU versions of the linear > solvers and pre-conditioners > > to function exactly as their CPU counterparts (I got this impression from > the documentation), > > in which case I've probably made some mistake in my own code, or whether > there are other/additional > > settings or modifications I should use to run on the GPU (an NVIDIA Quadro > T2000)? > > > > Kind regards, > > > > Carl-Johan > > > > > > > -- > > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > > > > https://www.cse.buffalo.edu/~knepley/ > > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From carl-johan.thore at liu.se Tue Nov 1 13:08:49 2022 From: carl-johan.thore at liu.se (Carl-Johan Thore) Date: Tue, 1 Nov 2022 18:08:49 +0000 Subject: [petsc-users] KSP on GPU In-Reply-To: References: Message-ID: Yes, I'm calling VecRestoreArray, but I realized now that I exported the vectors to Matlab before doing that. 
Apparently that worked anyway for the CPU, but when using the GPU it didn't. If I call VecRestoreArray before exporting then everything works fine on the GPU as well. Thanks for pointing this out! From: Stefano Zampini Sent: den 1 november 2022 17:11 To: Carl-Johan Thore Cc: Mark Adams ; PETSc users list Subject: Re: [petsc-users] KSP on GPU Are you calling VecRestoreArray when you are done inserting the values? On Tue, Nov 1, 2022, 18:42 Carl-Johan Thore via petsc-users > wrote: Thanks for the tips! The suggested settings for GAMG did not yield better results, but hypre worked well right away, giving very good convergence! A follow-up question then (I hope that's ok; and it could be related to GAMG not working, I'll check that). Once everything was running I discovered that my gradient vector dfdx which I populate via an array df obtained from VecGetArray(dfdx, &df) doesn't get filled properly; it always contains only zeros. This is not the case when I run on the CPU, and df gets filled as it should even on the GPU, suggesting that either I'm not using VecGetArray properly, or I shouldn't use it at all for GPU computations? Kind regards, Carl-Johan From: Mark Adams > Sent: den 31 oktober 2022 13:30 To: Carl-Johan Thore > Cc: Matthew Knepley >; Barry Smith >; petsc-users at mcs.anl.gov Subject: Re: [petsc-users] KSP on GPU * You could try hypre or another preconditioner that you can afford, like LU or ASM, that works. * If this matrix is SPD, you want to use -fieldsplit_0_pc_gamg_esteig_ksp_type cg -fieldsplit_0_pc_gamg_esteig_ksp_max_it 10 These will give better eigen estimates, and that is important. The differences between these estimates are not too bad. There is a safety factor (1.05 is the default) that you could increase with: -fieldsplit_0_mg_levels_ksp_chebyshev_esteig 0,0.05,0,1.1 * Finally you could try -fieldsplit_0_pc_gamg_reuse_interpolation 1, if GAMG is still not working. Use -fieldsplit_0_ksp_converged_reason and check the iteration count. 
And it is a good idea to check with hypre to make sure something is not going badly in terms of performance anyway. AMG is hard and hypre is a good solver. Mark On Mon, Oct 31, 2022 at 1:56 AM Carl-Johan Thore via petsc-users > wrote: The GPU supports double precision and I didn't explicitly tell PETSc to use float when compiling, so I guess it uses double? What's the easiest way to check? Barry, running -ksp_view shows that the solver options are the same for CPU and GPU. The only difference is the coarse grid solver for gamg ("the package used to perform factorization:") which is petsc for CPU and cusparse for GPU. I tried forcing the GPU to use petsc via -fieldsplit_0_mg_coarse_sub_pc_factor_mat_solver_type, but then ksp failed to converge even on the first topology optimization iteration. -ksp_view also shows differences in the eigenvalues from the Chebyshev smoother. For example, GPU: Down solver (pre-smoother) on level 2 ------------------------------- KSP Object: (fieldsplit_0_mg_levels_2_) 1 MPI process type: chebyshev eigenvalue targets used: min 0.109245, max 1.2017 eigenvalues provided (min 0.889134, max 1.09245) with CPU: eigenvalue targets used: min 0.112623, max 1.23886 eigenvalues provided (min 0.879582, max 1.12623) But I guess such differences are expected? /Carl-Johan From: Matthew Knepley > Sent: den 30 oktober 2022 22:00 To: Barry Smith > Cc: Carl-Johan Thore >; petsc-users at mcs.anl.gov Subject: Re: [petsc-users] KSP on GPU On Sun, Oct 30, 2022 at 3:52 PM Barry Smith > wrote: In general you should expect similar but not identical convergence behavior. I suggest running with all the monitoring you can. -ksp_monitor_true_residual -fieldsplit_0_monitor_true_residual -fieldsplit_1_monitor_true_residual and compare the convergence histories between the CPU and GPU. Also run with -ksp_view and check that the various solver options are the same (they should be). Is the GPU using float or double? 
Matt Barry On Oct 30, 2022, at 11:02 AM, Carl-Johan Thore via petsc-users > wrote: [original message, quoted verbatim and in full earlier in this digest, omitted] -------------- next part -------------- An HTML attachment was scrubbed... 
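A minimal sketch of the get/restore ordering that resolved the thread above. This is a hypothetical helper, not Carl-Johan's actual code: the function name, the viewer argument, and the zero fill are placeholders, and it assumes PETSc 3.18-style error handling with PetscCall.

```c
#include <petscvec.h>

/* Hypothetical helper: fill a gradient vector through a host array, then
   export it. With -dm_vec_type cuda, VecGetArray hands back a host-side
   pointer; it is VecRestoreArray that tells PETSc the data was modified so
   the host and device copies stay coherent. */
static PetscErrorCode FillAndExportGradient(Vec dfdx, PetscViewer viewer)
{
  PetscScalar *df;
  PetscInt     i, n;

  PetscFunctionBeginUser;
  PetscCall(VecGetLocalSize(dfdx, &n));
  PetscCall(VecGetArray(dfdx, &df));
  for (i = 0; i < n; i++) df[i] = 0.0; /* placeholder for the sensitivities */
  /* Restore BEFORE handing dfdx to anything else (VecView, export, solves):
     in the thread above, exporting before the restore produced stale (zero)
     data with the CUDA vector type. */
  PetscCall(VecRestoreArray(dfdx, &df));
  PetscCall(VecView(dfdx, viewer)); /* e.g. a MATLAB-format viewer */
  PetscFunctionReturn(0);
}
```

The same pairing applies to VecGetArrayRead/VecRestoreArrayRead for read-only access, which avoids flagging the vector as modified at all.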
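On Matt's float-or-double question, one quick check (a sketch; it assumes the usual PETSC_DIR/PETSC_ARCH environment from the build):

```shell
# configure records the scalar precision in the generated petscconf.h:
# PETSC_USE_REAL_DOUBLE (the default) means double;
# PETSC_USE_REAL_SINGLE means the build was configured --with-precision=single.
grep "PETSC_USE_REAL" "$PETSC_DIR/$PETSC_ARCH/include/petscconf.h"
```

Precision is fixed at configure time, and the CUDA backend computes in whatever precision PETSc was built with, so a default build runs in double on the GPU as well.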
URL: From mhyaqteen at sju.ac.kr Tue Nov 1 18:15:15 2022 From: mhyaqteen at sju.ac.kr (Mohammad Ali Yaqteen) Date: Tue, 1 Nov 2022 23:15:15 +0000 Subject: [petsc-users] PETSc Windows Installation In-Reply-To: References: <2db12320-25ab-7911-4bb6-ff0195f5ffdc@mcs.anl.gov> <461d2b54-173d-95fa-6ad5-9ce81849871e@mcs.anl.gov> <8c7b16a0-f933-92fe-f54a-337bcd88455a@mcs.anl.gov> Message-ID: What if I use Codeblocks to run PETSc? Would I still need to reinstall PETSc, or will the Cygwin installation work? Thanks Ali -----Original Message----- From: Satish Balay Sent: Wednesday, November 2, 2022 12:13 AM To: Mohammad Ali Yaqteen Cc: petsc-users Subject: Re: [petsc-users] PETSc Windows Installation If you need to use PETSc from Visual Studio - you need to follow instructions at https://petsc.org/release/install/windows/#installation-with-microsoft-intel-windows-compilers [i.e. install with MS compilers/MPI - not cygwin compilers/MPI] Also check the "Project Files" section on how to set up the compiler env for Visual Studio. Note: Most external packages won't work with MS compilers. Satish On Tue, 1 Nov 2022, Mohammad Ali Yaqteen wrote: > The above commands worked but I get an error message when I include petsc.h in Visual Studio. 
The error message is "Cannot open include file: 'petscconf.h': No such file or directory" > > Thanks, > Ali > -----Original Message----- > From: Satish Balay > Sent: Tuesday, November 1, 2022 2:40 PM > To: Mohammad Ali Yaqteen > Cc: petsc-users > Subject: Re: [petsc-users] PETSc Windows Installation > > > /usr/lib/gcc/x86_64-pc-cygwin/11/../../../../x86_64-pc-cygwin/bin/ld: cannot find -lhwloc: No such file or directory > > /usr/lib/gcc/x86_64-pc-cygwin/11/../../../../x86_64-pc-cygwin/bin/ld: cannot find -levent_core: No such file or directory > > /usr/lib/gcc/x86_64-pc-cygwin/11/../../../../x86_64-pc-cygwin/bin/ld: cannot find -levent_pthreads: No such file or directory > > /usr/lib/gcc/x86_64-pc-cygwin/11/../../../../x86_64-pc-cygwin/bin/ld: cannot find -lz: No such file or directory > > For some reason cygwin has broken dependencies here. Run cygwin setup and install the following pkgs. > > $ cygcheck.exe -f /usr/lib/libhwloc.dll.a /usr/lib/libevent_core.dll.a /usr/lib/libevent_pthreads.dll.a /usr/lib/libz.dll.a > libevent-devel-2.1.12-1 > libevent-devel-2.1.12-1 > libhwloc-devel-2.6.0-2 > zlib-devel-1.2.12-1 > > BTW: you can attach the file from PETSC_DIR/PETSC_ARCH/lib/petsc/conf/configure.log > > Satish > > On Tue, 1 Nov 2022, Mohammad Ali Yaqteen wrote: > > > I am unable to attach the configure.log file. Hence, 
I have copied the following text after executing the command (less configure.log) in the cygwin64 > > > > Executing: uname -s > > stdout: CYGWIN_NT-10.0-19044 > > [remainder of the quoted configure.log omitted: it repeats the configure options and environment-variable listing quoted earlier in this digest, and then lists the several thousand executables and DLLs found in the PATH directories; the /usr/bin listing does include the mpicc, mpicxx, mpif90, and mpiexec wrappers] > >
xe vss_ps.dll vulkan-1-999-0-0-0.dll vulkan-1.dll vulkaninfo-1-999-0-0-0.exe vulkaninfo.exe w32time.dll w32tm.exe w32topl.dll WaaSAssessment.dll WaaSMedicAgent.exe WaaSMedicCapsule.dll WaaSMedicPS.dll WaaSMedicSvc.dll WABSyncProvider.dll waitfor.exe WalletBackgroundS > > erviceProxy.dll WalletProxy.dll WalletService.dll WallpaperHost.exe wavemsp.dll wbadmin.exe wbem wbemcomn.dll wbengine.exe wbiosrvc.dll wci.dll wcimage.dll wcmapi.dll wcmcsp.dll wcmsvc.dll WCN WcnApi.dll wcncsvc.dll WcnEapAuthProxy.dll WcnEapPeerProxy.dll WcnNetsh.dl > > l wcnwiz.dll wc_storage.dll wdc.dll WDI wdi.dll wdigest.dll wdmaud.drv wdscore.dll WdsUnattendTemplate.xml WEB.rs webauthn.dll WebcamUi.dll webcheck.dll WebClnt.dll webio.dll webplatstorageserver.dll WebRuntimeManager.dll webservices.dll Websocket.dll wecapi.dll wecs > > vc.dll wecutil.exe wephostsvc.dll wer.dll werconcpl.dll wercplsupport.dll werdiagcontroller.dll WerEnc.dll weretw.dll WerFault.exe WerFaultSecure.exe wermgr.exe wersvc.dll werui.dll wevtapi.dll wevtfwd.dll wevtsvc.dll wevtutil.exe wextract.exe WF.msc wfapigp.dll wfdp > > rov.dll WFDSConMgr.dll WFDSConMgrSvc.dll WfHC.dll WFS.exe WFSR.dll whealogr.dll where.exe whhelper.dll whoami.exe wiaacmgr.exe wiaaut.dll wiadefui.dll wiadss.dll WiaExtensionHost64.dll wiarpc.dll wiascanprofiles.dll wiaservc.dll wiashext.dll wiatrace.dll wiawow64.exe > > WiFiCloudStore.dll WiFiConfigSP.dll wifidatacapabilityhandler.dll WiFiDisplay.dll wifinetworkmanager.dll wifitask.exe WimBootCompress.ini wimgapi.dll wimserv.exe win32appinventorycsp.dll Win32AppSettingsProvider.dll Win32CompatibilityAppraiserCSP.dll win32k.sys win3 > > 2kbase.sys win32kfull.sys win32kns.sys win32spl.dll win32u.dll Win32_DeviceGuard.dll winbio.dll WinBioDatabase WinBioDataModel.dll WinBioDataModelOOBE.exe winbioext.dll WinBioPlugIns winbrand.dll wincorlib.dll wincredprovider.dll wincredui.dll WindowManagement.dll Wi > > ndowManagementAPI.dll Windows.AccountsControl.dll Windows.AI.MachineLearning.dll 
Windows.AI.MachineLearning.Preview.dll Windows.ApplicationModel.Background.SystemEventsBroker.dll Windows.ApplicationModel.Background.TimeBroker.dll Windows.ApplicationModel.Conversation > > alAgent.dll windows.applicationmodel.conversationalagent.internal.proxystub.dll windows.applicationmodel.conversationalagent.proxystub.dll Windows.ApplicationModel.Core.dll windows.applicationmodel.datatransfer.dll Windows.ApplicationModel.dll Windows.ApplicationMode > > l.LockScreen.dll Windows.ApplicationModel.Store.dll Windows.ApplicationModel.Store.Preview.DOSettings.dll Windows.ApplicationModel.Store.TestingFramework.dll Windows.ApplicationModel.Wallet.dll Windows.CloudStore.dll Windows.CloudStore.Schema.DesktopShell.dll Windows > > .CloudStore.Schema.Shell.dll Windows.Cortana.Desktop.dll Windows.Cortana.OneCore.dll Windows.Cortana.ProxyStub.dll Windows.Data.Activities.dll Windows.Data.Pdf.dll Windows.Devices.AllJoyn.dll Windows.Devices.Background.dll Windows.Devices.Background.ps.dll Windows.De > > vices.Bluetooth.dll Windows.Devices.Custom.dll Windows.Devices.Custom.ps.dll Windows.Devices.Enumeration.dll Windows.Devices.Haptics.dll Windows.Devices.HumanInterfaceDevice.dll Windows.Devices.Lights.dll Windows.Devices.LowLevel.dll Windows.Devices.Midi.dll Windows. 
> > Devices.Perception.dll Windows.Devices.Picker.dll Windows.Devices.PointOfService.dll Windows.Devices.Portable.dll Windows.Devices.Printers.dll Windows.Devices.Printers.Extensions.dll Windows.Devices.Radios.dll Windows.Devices.Scanners.dll Windows.Devices.Sensors.dll > > Windows.Devices.SerialCommunication.dll Windows.Devices.SmartCards.dll Windows.Devices.SmartCards.Phone.dll Windows.Devices.Usb.dll Windows.Devices.WiFi.dll Windows.Devices.WiFiDirect.dll Windows.Energy.dll Windows.FileExplorer.Common.dll Windows.Gaming.Input.dll Win > > dows.Gaming.Preview.dll Windows.Gaming.UI.GameBar.dll Windows.Gaming.XboxLive.Storage.dll Windows.Globalization.dll Windows.Globalization.Fontgroups.dll Windows.Globalization.PhoneNumberFormatting.dll Windows.Graphics.Display.BrightnessOverride.dll Windows.Graphics.D > > isplay.DisplayEnhancementOverride.dll Windows.Graphics.dll Windows.Graphics.Printing.3D.dll Windows.Graphics.Printing.dll Windows.Graphics.Printing.Workflow.dll Windows.Graphics.Printing.Workflow.Native.dll Windows.Help.Runtime.dll windows.immersiveshell.serviceprovi > > der.dll Windows.Internal.AdaptiveCards.XamlCardRenderer.dll Windows.Internal.Bluetooth.dll Windows.Internal.CapturePicker.Desktop.dll Windows.Internal.CapturePicker.dll Windows.Internal.Devices.Sensors.dll Windows.Internal.Feedback.Analog.dll Windows.Internal.Feedbac > > k.Analog.ProxyStub.dll Windows.Internal.Graphics.Display.DisplayColorManagement.dll Windows.Internal.Graphics.Display.DisplayEnhancementManagement.dll Windows.Internal.Management.dll Windows.Internal.Management.SecureAssessment.dll Windows.Internal.PlatformExtension. 
> > DevicePickerExperience.dll Windows.Internal.PlatformExtension.MiracastBannerExperience.dll Windows.Internal.PredictionUnit.dll Windows.Internal.Security.Attestation.DeviceAttestation.dll Windows.Internal.SecurityMitigationsBroker.dll Windows.Internal.Shell.Broker.dll > > windows.internal.shellcommon.AccountsControlExperience.dll windows.internal.shellcommon.AppResolverModal.dll Windows.Internal.ShellCommon.Broker.dll windows.internal.shellcommon.FilePickerExperienceMEM.dll Windows.Internal.ShellCommon.PrintExperience.dll windows.int > > ernal.shellcommon.shareexperience.dll windows.internal.shellcommon.TokenBrokerModal.dll Windows.Internal.Signals.dll Windows.Internal.System.UserProfile.dll Windows.Internal.Taskbar.dll Windows.Internal.UI.BioEnrollment.ProxyStub.dll Windows.Internal.UI.Logon.ProxySt > > ub.dll Windows.Internal.UI.Shell.WindowTabManager.dll Windows.Management.EnrollmentStatusTracking.ConfigProvider.dll Windows.Management.InprocObjects.dll Windows.Management.ModernDeployment.ConfigProviders.dll Windows.Management.Provisioning.ProxyStub.dll Windows.Man > > agement.SecureAssessment.CfgProvider.dll Windows.Management.SecureAssessment.Diagnostics.dll Windows.Management.Service.dll Windows.Management.Workplace.dll Windows.Management.Workplace.WorkplaceSettings.dll Windows.Media.Audio.dll Windows.Media.BackgroundMediaPlayba > > ck.dll Windows.Media.BackgroundPlayback.exe Windows.Media.Devices.dll Windows.Media.dll Windows.Media.Editing.dll Windows.Media.FaceAnalysis.dll Windows.Media.Import.dll Windows.Media.MediaControl.dll Windows.Media.MixedRealityCapture.dll Windows.Media.Ocr.dll Window > > s.Media.Playback.BackgroundMediaPlayer.dll Windows.Media.Playback.MediaPlayer.dll Windows.Media.Playback.ProxyStub.dll Windows.Media.Protection.PlayReady.dll Windows.Media.Renewal.dll Windows.Media.Speech.dll Windows.Media.Speech.UXRes.dll Windows.Media.Streaming.dll > > Windows.Media.Streaming.ps.dll Windows.Mirage.dll 
Windows.Mirage.Internal.Capture.Pipeline.ProxyStub.dll Windows.Mirage.Internal.dll Windows.Networking.BackgroundTransfer.BackgroundManagerPolicy.dll Windows.Networking.BackgroundTransfer.ContentPrefetchTask.dll Windo > > ws.Networking.BackgroundTransfer.dll Windows.Networking.Connectivity.dll Windows.Networking.dll Windows.Networking.HostName.dll Windows.Networking.NetworkOperators.ESim.dll Windows.Networking.NetworkOperators.HotspotAuthentication.dll Windows.Networking.Proximity.dll > > Windows.Networking.ServiceDiscovery.Dnssd.dll Windows.Networking.Sockets.PushEnabledApplication.dll Windows.Networking.UX.EapRequestHandler.dll Windows.Networking.Vpn.dll Windows.Networking.XboxLive.ProxyStub.dll Windows.Payments.dll Windows.Perception.Stub.dll Wind > > ows.Security.Authentication.Identity.Provider.dll Windows.Security.Authentication.OnlineId.dll Windows.Security.Authentication.Web.Core.dll Windows.Security.Credentials.UI.CredentialPicker.dll Windows.Security.Credentials.UI.UserConsentVerifier.dll Windows.Security.I > > ntegrity.dll Windows.Services.TargetedContent.dll Windows.SharedPC.AccountManager.dll Windows.SharedPC.CredentialProvider.dll Windows.Shell.BlueLightReduction.dll Windows.Shell.ServiceHostBuilder.dll Windows.Shell.StartLayoutPopulationEvents.dll Windows.StateReposito > > ry.dll Windows.StateRepositoryBroker.dll Windows.StateRepositoryClient.dll Windows.StateRepositoryCore.dll Windows.StateRepositoryPS.dll Windows.StateRepositoryUpgrade.dll Windows.Storage.ApplicationData.dll Windows.Storage.Compression.dll windows.storage.dll Windows > > .Storage.OneCore.dll Windows.Storage.Search.dll Windows.System.Diagnostics.dll Windows.System.Diagnostics.Telemetry.PlatformTelemetryClient.dll Windows.System.Diagnostics.TraceReporting.PlatformDiagnosticActions.dll Windows.System.Launcher.dll Windows.System.Profile. 
> > HardwareId.dll Windows.System.Profile.PlatformDiagnosticsAndUsageDataSettings.dll Windows.System.Profile.RetailInfo.dll Windows.System.Profile.SystemId.dll Windows.System.Profile.SystemManufacturers.dll Windows.System.RemoteDesktop.dll Windows.System.SystemManagement > > .dll Windows.System.UserDeviceAssociation.dll Windows.System.UserProfile.DiagnosticsSettings.dll Windows.UI.Accessibility.dll Windows.UI.AppDefaults.dll Windows.UI.BioFeedback.dll Windows.UI.BlockedShutdown.dll Windows.UI.Core.TextInput.dll Windows.UI.Cred.dll Window > > s.UI.CredDialogController.dll Windows.UI.dll Windows.UI.FileExplorer.dll Windows.UI.Immersive.dll Windows.UI.Input.Inking.Analysis.dll Windows.UI.Input.Inking.dll Windows.UI.Internal.Input.ExpressiveInput.dll Windows.UI.Internal.Input.ExpressiveInput.Resource.dll Win > > dows.UI.Logon.dll Windows.UI.NetworkUXController.dll Windows.UI.PicturePassword.dll Windows.UI.Search.dll Windows.UI.Shell.dll Windows.UI.Shell.Internal.AdaptiveCards.dll Windows.UI.Storage.dll Windows.UI.Xaml.Controls.dll Windows.UI.Xaml.dll Windows.UI.Xaml.InkContr > > ols.dll Windows.UI.Xaml.Maps.dll Windows.UI.Xaml.Phone.dll Windows.UI.Xaml.Resources.19h1.dll Windows.UI.Xaml.Resources.Common.dll Windows.UI.Xaml.Resources.rs1.dll Windows.UI.Xaml.Resources.rs2.dll Windows.UI.Xaml.Resources.rs3.dll Windows.UI.Xaml.Resources.rs4.dll > > Windows.UI.Xaml.Resources.rs5.dll Windows.UI.Xaml.Resources.th.dll Windows.UI.Xaml.Resources.win81.dll Windows.UI.Xaml.Resources.win8rtm.dll Windows.UI.XamlHost.dll Windows.WARP.JITService.dll Windows.WARP.JITService.exe Windows.Web.Diagnostics.dll Windows.Web.dll Wi > > ndows.Web.Http.dll WindowsActionDialog.exe WindowsCodecs.dll WindowsCodecsExt.dll WindowsCodecsRaw.dll WindowsCodecsRaw.txt WindowsDefaultHeatProcessor.dll windowsdefenderapplicationguardcsp.dll WindowsInternal.ComposableShell.ComposerFramework.dll WindowsInternal.Co > > mposableShell.DesktopHosting.dll WindowsInternal.Shell.CompUiActivation.dll 
WindowsIoTCsp.dll windowslivelogin.dll WindowsManagementServiceWinRt.ProxyStub.dll windowsperformancerecordercontrol.dll WindowsPowerShell WindowsSecurityIcon.png windowsudk.shellcommon.dll W > > indowsUpdateElevatedInstaller.exe winethc.dll winevt WinFax.dll winhttp.dll winhttpcom.dll WinHvEmulation.dll WinHvPlatform.dll wininet.dll wininetlui.dll wininit.exe wininitext.dll winipcfile.dll winipcsecproc.dll winipsec.dll winjson.dll Winlangdb.dll winload.efi w > > inload.exe winlogon.exe winlogonext.dll winmde.dll WinMetadata winml.dll winmm.dll winmmbase.dll winmsipc.dll WinMsoIrmProtector.dll winnlsres.dll winnsi.dll WinOpcIrmProtector.dll WinREAgent.dll winresume.efi winresume.exe winrm winrm.cmd winrm.vbs winrnr.dll winrs. > > exe winrscmd.dll winrshost.exe winrsmgr.dll winrssrv.dll WinRTNetMUAHostServer.exe WinRtTracing.dll WinSAT.exe WinSATAPI.dll WinSCard.dll WinSetupUI.dll winshfhc.dll winsku.dll winsockhc.dll winspool.drv winsqlite3.dll WINSRPC.DLL winsrv.dll winsrvext.dll winsta.dll > > WinSync.dll WinSyncMetastore.dll WinSyncProviders.dll wintrust.dll WinTypes.dll winusb.dll winver.exe WiredNetworkCSP.dll wisp.dll witnesswmiv2provider.dll wkscli.dll wkspbroker.exe wkspbrokerAx.dll wksprt.exe wksprtPS.dll wkssvc.dll wlanapi.dll wlancfg.dll WLanConn. 
> > dll wlandlg.dll wlanext.exe wlangpui.dll WLanHC.dll wlanhlp.dll WlanMediaManager.dll WlanMM.dll wlanmsm.dll wlanpref.dll WlanRadioManager.dll wlansec.dll wlansvc.dll wlansvcpal.dll wlanui.dll wlanutil.dll Wldap32.dll wldp.dll wlgpclnt.dll wlidcli.dll wlidcredprov.dll > > wlidfdp.dll wlidnsp.dll wlidprov.dll wlidres.dll wlidsvc.dll wlrmdr.exe WMADMOD.DLL WMADMOE.DLL WMALFXGFXDSP.dll WMASF.DLL wmcodecdspps.dll wmdmlog.dll wmdmps.dll wmdrmsdk.dll wmerror.dll wmi.dll wmiclnt.dll wmicmiplugin.dll wmidcom.dll wmidx.dll WmiMgmt.msc wmiprop > > .dll wmitomi.dll WMNetMgr.dll wmp.dll WMPDMC.exe WmpDui.dll wmpdxm.dll wmpeffects.dll WMPhoto.dll wmploc.DLL wmpps.dll wmpshell.dll wmsgapi.dll WMSPDMOD.DLL WMSPDMOE.DLL WMVCORE.DLL WMVDECOD.DLL wmvdspa.dll WMVENCOD.DLL WMVSDECD.DLL WMVSENCD.DLL WMVXENCD.DLL WofTasks > > .dll WofUtil.dll WordBreakers.dll WorkFolders.exe WorkfoldersControl.dll WorkFoldersGPExt.dll WorkFoldersRes.dll WorkFoldersShell.dll workfolderssvc.dll wosc.dll wow64.dll wow64cpu.dll wow64win.dll wowreg32.exe WpAXHolder.dll wpbcreds.dll Wpc.dll WpcApi.dll wpcatltoa > > st.png WpcDesktopMonSvc.dll WpcMon.exe wpcmon.png WpcProxyStubs.dll WpcRefreshTask.dll WpcTok.exe WpcWebFilter.dll wpdbusenum.dll WpdMtp.dll WpdMtpUS.dll wpdshext.dll WPDShextAutoplay.exe WPDShServiceObj.dll WPDSp.dll wpd_ci.dll wpnapps.dll wpnclient.dll wpncore.dll > > wpninprc.dll wpnpinst.exe wpnprv.dll wpnservice.dll wpnsruprov.dll WpnUserService.dll WpPortingLibrary.dll WppRecorderUM.dll wpr.config.xml wpr.exe WPTaskScheduler.dll wpx.dll write.exe ws2help.dll ws2_32.dll wscadminui.exe wscapi.dll wscinterop.dll wscisvif.dll WSCl > > ient.dll WSCollect.exe wscproxystub.dll wscript.exe wscsvc.dll wscui.cpl WSDApi.dll wsdchngr.dll WSDPrintProxy.DLL WsdProviderUtil.dll WSDScanProxy.dll wsecedit.dll wsepno.dll wshbth.dll wshcon.dll wshelper.dll wshext.dll wshhyperv.dll wship6.dll wshom.ocx wshqos.dll > > wshrm.dll WSHTCPIP.DLL wshunix.dll wsl.exe wslapi.dll WsmAgent.dll wsmanconfig_schema.xml 
WSManHTTPConfig.exe WSManMigrationPlugin.dll WsmAuto.dll wsmplpxy.dll wsmprovhost.exe WsmPty.xsl WsmRes.dll WsmSvc.dll WsmTxt.xsl WsmWmiPl.dll wsnmp32.dll wsock32.dll wsplib.dl > > l wsp_fs.dll wsp_health.dll wsp_sr.dll wsqmcons.exe WSReset.exe WSTPager.ax wtsapi32.dll wuapi.dll wuapihost.exe wuauclt.exe wuaueng.dll wuceffects.dll WUDFCoinstaller.dll WUDFCompanionHost.exe WUDFHost.exe WUDFPlatform.dll WudfSMCClassExt.dll WUDFx.dll WUDFx02000.dl > > l wudriver.dll wups.dll wups2.dll wusa.exe wuuhext.dll wuuhosdeployment.dll wvc.dll WwaApi.dll WwaExt.dll WWAHost.exe WWanAPI.dll wwancfg.dll wwanconn.dll WWanHC.dll wwanmm.dll Wwanpref.dll wwanprotdim.dll WwanRadioManager.dll wwansvc.dll wwapi.dll XamlTileRender.dll XAudio2_8.dll XAudio2_9.dll XblAuthManager.dll XblAuthManagerProxy.dll XblAuthTokenBrokerExt.dll XblGameSave.dll XblGameSaveExt.dll XblGameSaveProxy.dll XblGameSaveTask.exe XboxGipRadioManager.dll xboxgipsvc.dll xboxgipsynthetic.dll XboxNetApiSvc.dll xcopy.exe XInput1_4.dll XInput9_1_0.dll XInputUap.dll xmlfilter.dll xmllite.dll xmlprovi.dll xolehlp.dll XpsDocumentTargetPrint.dll XpsGdiConverter.dll XpsPrint.dll xpspushlayer.dll XpsRasterService.dll xpsservices.dll XpsToPclmConverter.dll XpsToPwgrConverter.dll xwizard.dtd xwizard.exe xwizards.dll xwreg.dll xwtpdui.dll xwtpw32.dll X_80.contrast-black.png X_80.contrast-white.png X_80.png ze_loader.dll ze_tracing_layer.dll ze_validation_layer.dll zh-CN zh-TW zipcontainer.dll zipfldr.dll ztrace_maps.dll > > /cygdrive/c/Windows: addins AhnInst.log appcompat Application Data apppatch AppReadiness assembly bcastdvr bfsvc.exe BitLockerDiscoveryVolumeContents Boot bootstat.dat Branding CbsTemp Containers CSC Cursors debug diagnostics DiagTrack DigitalLocker Downloaded > > Program Files DtcInstall.log ELAMBKUP en-US explorer.exe Fonts GameBarPresenceWriter gethelp_audiotroubleshooter_latestpackage.zip Globalization Help HelpPane.exe hh.exe hipiw.dll IdentityCRL ImageSAFERSvc.exe IME IMGSF50Svc.exe 
ImmersiveControlPanel INF InputMethod > > Installer ko-KR L2Schemas LanguageOverlayCache LiveKernelReports Logs lsasetup.log Media mib.bin Microsoft.NET Migration ModemLogs notepad.exe OCR Offline Web Pages Panther Performance PFRO.log PLA PolicyDefinitions Prefetch PrintDialog Professional.xml Provisioning > > regedit.exe Registration RemotePackages rescache Resources RtlExUpd.dll SchCache schemas security ServiceProfiles ServiceState servicing Setup setupact.log setuperr.log ShellComponents ShellExperiences SHELLNEW SKB SoftwareDistribution Speech Speech_OneCore splwow64. > > exe System system.ini System32 SystemApps SystemResources SystemTemp SysWOW64 TAPI Tasks Temp TempInst tracing twain_32 twain_32.dll Vss WaaS Web win.ini WindowsShell.Manifest WindowsUpdate.log winhlp32.exe WinSxS WMSysPr9.prx write.exe > > /cygdrive/c/Windows/System32/Wbem: aeinv.mof AgentWmi.mof AgentWmiUninstall.mof appbackgroundtask.dll appbackgroundtask.mof appbackgroundtask_uninstall.mof AuditRsop.mof authfwcfg.mof AutoRecover bcd.mof BthMtpEnum.mof cimdmtf.mof cimwin32.dll cimwin32.mof CIWm > > i.mof classlog.mof cli.mof cliegaliases.mof ddp.mof dimsjob.mof dimsroam.mof DMWmiBridgeProv.dll DMWmiBridgeProv.mof DMWmiBridgeProv1.dll DMWmiBridgeProv1.mof DMWmiBridgeProv1_Uninstall.mof DMWmiBridgeProv_Uninstall.mof dnsclientcim.dll dnsclientcim.mof dnsclientpspr > > ovider.dll dnsclientpsprovider.mof dnsclientpsprovider_Uninstall.mof drvinst.mof DscCore.mof DscCoreConfProv.mof dscproxy.mof Dscpspluginwkr.dll DscTimer.mof dsprov.dll dsprov.mof eaimeapi.mof EmbeddedLockdownWmi.dll embeddedlockdownwmi.mof embeddedlockdownwmi_Uninst > > all.mof en en-US esscli.dll EventTracingManagement.dll EventTracingManagement.mof fastprox.dll fdPHost.mof fdrespub.mof fdSSDP.mof fdWNet.mof fdWSD.mof filetrace.mof firewallapi.mof FolderRedirectionWMIProvider.mof FunDisc.mof fwcfg.mof hbaapi.mof hnetcfg.mof IMAPIv2 > > -Base.mof IMAPIv2-FileSystemSupport.mof IMAPIv2-LegacyShim.mof interop.mof IpmiDTrc.mof 
ipmiprr.dll ipmiprv.dll ipmiprv.mof IpmiPTrc.mof ipsecsvc.mof iscsidsc.mof iscsihba.mof iscsiprf.mof iscsirem.mof iscsiwmiv2.mof iscsiwmiv2_uninstall.mof kerberos.mof ko ko-KR Krn > > lProv.dll krnlprov.mof L2SecHC.mof lltdio.mof lltdsvc.mof Logs lsasrv.mof mblctr.mof MDMAppProv.dll MDMAppProv.mof MDMAppProv_Uninstall.mof MDMSettingsProv.dll MDMSettingsProv.mof MDMSettingsProv_Uninstall.mof Microsoft-Windows-OfflineFiles.mof Microsoft-Windows-Remo > > te-FileSystem.mof Microsoft.AppV.AppVClientWmi.dll Microsoft.AppV.AppVClientWmi.mof Microsoft.Uev.AgentWmi.dll Microsoft.Uev.ManagedAgentWmi.mof Microsoft.Uev.ManagedAgentWmiUninstall.mof mispace.mof mispace_uninstall.mof mmc.mof MMFUtil.dll MOF mofcomp.exe mofd.dll > > mofinstall.dll mountmgr.mof mpeval.mof mpsdrv.mof mpssvc.mof msdtcwmi.dll MsDtcWmi.mof msfeeds.mof msfeedsbs.mof msi.mof msiprov.dll msiscsi.mof MsNetImPlatform.mof mstsc.mof mstscax.mof msv1_0.mof mswmdm.mof NCProv.dll ncprov.mof ncsi.mof ndisimplatcim.dll ndistrace > > .mof NetAdapterCim.dll NetAdapterCim.mof NetAdapterCimTrace.mof NetAdapterCimTraceUninstall.mof NetAdapterCim_uninstall.mof netdacim.dll netdacim.mof netdacim_uninstall.mof NetEventPacketCapture.dll NetEventPacketCapture.mof NetEventPacketCapture_uninstall.mof netncc > > im.dll netnccim.mof netnccim_uninstall.mof NetPeerDistCim.dll NetPeerDistCim.mof NetPeerDistCim_uninstall.mof netprofm.mof NetSwitchTeam.mof netswitchteamcim.dll NetTCPIP.dll NetTCPIP.mof NetTCPIP_Uninstall.mof netttcim.dll netttcim.mof netttcim_uninstall.mof network > > itemfactory.mof newdev.mof nlasvc.mof nlmcim.dll nlmcim.mof nlmcim_uninstall.mof nlsvc.mof npivwmi.mof nshipsec.mof ntevt.dll ntevt.mof ntfs.mof OfflineFilesConfigurationWmiProvider.mof OfflineFilesConfigurationWmiProvider_Uninstall.mof OfflineFilesWmiProvider.mof Of > > flineFilesWmiProvider_Uninstall.mof p2p-mesh.mof p2p-pnrp.mof pcsvDevice.mof pcsvDevice_Uninstall.mof Performance PNPXAssoc.mof PolicMan.dll PolicMan.mof polproc.mof polprocl.mof 
polprou.mof polstore.mof portabledeviceapi.mof portabledeviceclassextension.mof portable > > deviceconnectapi.mof portabledevicetypes.mof portabledevicewiacompat.mof powermeterprovider.mof PowerPolicyProvider.mof ppcRsopCompSchema.mof ppcRsopUserSchema.mof PrintFilterPipelineSvc.mof PrintManagementProvider.dll PrintManagementProvider.mof PrintManagementProvider_Uninstall.mof profileassociationprovider.mof PS_MMAgent.mof qmgr.mof qoswmi.dll qoswmi.mof qoswmitrc.mof qoswmitrc_uninstall.mof qoswmi_uninstall.mof RacWmiProv.dll RacWmiProv.mof rawxml.xsl rdpendp.mof rdpinit.mof rdpshell.mof refs.mof refsv1.mof regevent.mof Remove.Microsoft.AppV.AppvClientWmi.mof repdrvfs.dll Repository rsop.mof rspndr.mof samsrv.mof scersop.mof schannel.mof schedprov.dll SchedProv.mof scm.mof scrcons.exe scrcons.mof sdbus.mof secrcw32.mof SensorsClassExtension.mof ServDeps.dll ServiceModel.mof ServiceModel.mof.uninstall ServiceModel35.mof ServiceModel35.mof.uninstall services.mof setupapi.mof SmbWitnessWmiv2Provider.mof smbwmiv2.mof SMTPCons.dll smtpcons.mof sppwmi.mof sr.mof sstpsvc.mof stdprov.dll storagewmi.mof storagewmi_passthru.mof storagewmi_passthru_uninstall.mof storagewmi_uninstall.mof stortrace.mof subscrpt.mof system.mof tcpip.mof texttable.xsl textvaluelist.xsl tmf tsallow.mof tscfgwmi.mof tsmf.mof tspkg.mof umb.mof umbus.mof umpass.mof umpnpmgr.mof unsecapp.exe UserProfileConfigurationWmiProvider.mof UserProfileWmiProvider.mof UserStateWMIProvider.mof vds.mof vdswmi.dll viewprov.dll vpnclientpsprovider.dll vpnclientpsprovider.mof vpnclientpsprovider_Uninstall.mof vss.mof vsswmi.dll wbemcntl.dll wbemcons.dll WBEMCons.mof wbemcore.dll wbemdisp.dll wbemdisp.tlb wbemess.dll wbemprox.dll wbemsvc.dll wbemtest.exe wcncsvc.mof WdacEtwProv.mof WdacWmiProv.dll WdacWmiProv.mof WdacWmiProv_Uninstall.mof Wdf01000.mof Wdf01000Uninstall.mof wdigest.mof WFAPIGP.mof wfascim.dll wfascim.mof wfascim_uninstall.mof WFP.MOF wfs.mof whqlprov.mof Win32_DeviceGuard.mof Win32_EncryptableVolume.dll 
win32_encryptablevolume.mof Win32_EncryptableVolumeUninstall.mof win32_printer.mof Win32_Tpm.dll Win32_Tpm.mof wininit.mof winipsec.mof winlogon.mof WinMgmt.exe WinMgmtR.dll Winsat.mof WinsatUninstall.mof wlan.mof WLanHC.mof wmi.mof WMIADAP.exe WmiApRes.dll WmiApRpl.dll WmiApSrv.exe WMIC.exe WMICOOKR.dll WmiDcPrv.dll wmipcima.dll wmipcima.mof wmipdfs.dll wmipdfs.mof wmipdskq.dll wmipdskq.mof WmiPerfClass.dll WmiPerfClass.mof WmiPerfInst.dll WmiPerfInst.mof WMIPICMP.dll wmipicmp.mof WMIPIPRT.dll wmipiprt.mof WMIPJOBJ.dll wmipjobj.mof wmiprov.dll WmiPrvSD.dll WmiPrvSE.exe WMIPSESS.dll wmipsess.mof WMIsvc.dll wmitimep.dll wmitimep.mof wmiutils.dll WMI_Tracing.mof wmp.mof wmpnetwk.mof wpdbusenum.mof wpdcomp.mof wpdfs.mof wpdmtp.mof wpdshext.mof WPDShServiceObj.mof wpdsp.mof wpd_ci.mof wscenter.mof WsmAgent.mof WsmAgentUninstall.mof WsmAuto.mof wsp_fs.mof wsp_fs_uninstall.mof wsp_health.mof wsp_health_uninstall.mof wsp_sr.mof wsp_sr_uninstall.mof WUDFx.mof Wudfx02000.mof Wudfx02000Uninstall.mof WUDFxUninstall.mof xml xsl-mappings.xml xwizards.mof > > /cygdrive/c/Windows/System32/WindowsPowerShell/v1.0: Certificate.format.ps1xml Diagnostics.Format.ps1xml DotNetTypes.format.ps1xml en en-US Event.Format.ps1xml Examples FileSystem.format.ps1xml getevent.types.ps1xml Help.format.ps1xml HelpV3.format.ps1xml ko ko-KR Modules powershell.exe powershell.exe.config PowerShellCore.format.ps1xml PowerShellTrace.format.ps1xml powershell_ise.exe powershell_ise.exe.config PSEvents.dll pspluginwkr.dll pwrshmsg.dll pwrshsip.dll Registry.format.ps1xml Schemas SessionConfig types.ps1xml typesv3.ps1xml WSMan.Format.ps1xml > > /cygdrive/c/Windows/System32/OpenSSH: scp.exe sftp.exe ssh-add.exe ssh-agent.exe ssh-keygen.exe ssh-keyscan.exe ssh.exe > > /cygdrive/c/Program Files/MATLAB/R2020b/bin: crash_analyzer.cfg icutzdata lcdata.xml lcdata.xsd lcdata_utf8.xml m3iregistry matlab.exe mex.bat mexext.bat util win32 win64 > > /cygdrive/c/Program Files/Microsoft SQL Server/130/Tools/Binn: Resources 
SqlLocalDB.exe > > /cygdrive/c/Program Files/Microsoft SQL Server/Client SDK/ODBC/170/Tools/Binn: batchparser.dll bcp.exe Resources SQLCMD.EXE xmlrw.dll > > /cygdrive/c/Program Files/Git/cmd: git-gui.exe git-lfs.exe git.exe gitk.exe start-ssh-agent.cmd start-ssh-pageant.cmd > > Warning accessing /cygdrive/c/msys64/mingw64/bin gives errors: [Errno 2] No such file or directory: '/cygdrive/c/msys64/mingw64/bin' > > Warning accessing /cygdrive/c/msys64/usr/bin gives errors: [Errno 2] No such file or directory: '/cygdrive/c/msys64/usr/bin' > > /cygdrive/c/Program Files (x86)/Microsoft Visual Studio/2019/Community/VC/Tools/MSVC/14.29.30133/bin/Hostx64/x64: 1033 asan_blacklist.txt atlprov.dll bscmake.exe c1.dll c1xx.dll c2.dll cfgpersist.dll cl.exe cl.exe.config clang_rt.asan_dbg_dynamic-x86_64.dll clang_rt.asan_dynamic-x86_64.dll ConcurrencyCheck.dll CppBuildInsights.dll CppBuildInsightsEtw.xml CppCoreCheck.dll cvtres.exe d3dcompiler_47.dll dpcmi.dll dumpbin.exe editbin.exe EnumIndex.dll EspXEngine.dll HResultCheck.dll KernelTraceControl.dll lib.exe link.exe link.exe.config llvm-symbolizer.exe LocalESPC.dll Microsoft.Diagnostics.Tracing.EventSource.dll Microsoft.VisualStudio.RemoteControl.dll Microsoft.VisualStudio.Telemetry.dll Microsoft.VisualStudio.Utilities.Internal.dll ml64.exe msobj140.dll mspdb140.dll mspdbcmf.exe mspdbcore.dll mspdbsrv.exe mspdbst.dll mspft140.dll msvcdis140.dll msvcp140.dll msvcp140_1.dll msvcp140_2.dll msvcp140_atomic_wait.dll msvcp140_codecvt_ids.dll Newtonsoft.Json.dll nmake.exe onecore perf_msvcbuildinsights.dll pgocvt.exe pgodb140.dll pgodriver.sys pgomgr.exe pgort140.dll pgosweep.exe System.Runtime.CompilerServices.Unsafe.dll tbbmalloc.dll undname.exe VariantClear.dll vcmeta.dll vcperf.exe vcruntime140.dll vcruntime140_1.dll vctip.exe xdcmake.exe xdcmake.exe.config > > /cygdrive/c/Program Files/dotnet: dotnet.exe host LICENSE.txt packs sdk shared templates ThirdPartyNotices.txt > > /: bin Cygwin-Terminal.ico Cygwin.bat Cygwin.ico dev etc 
home lib mpich-4.0.2 mpich-4.0.2.tar.gz sbin tmp usr var proc cygdrive > > /cygdrive/c/Users/SEJONG/AppData/Local/Microsoft/WindowsApps: Backup GameBarElevatedFT_Alias.exe Microsoft.DesktopAppInstaller_8wekyb3d8bbwe Microsoft.MicrosoftEdge_8wekyb3d8bbwe Microsoft.SkypeApp_kzf8qxf38zg5c Microsoft.XboxGamingOverlay_8wekyb3d8bbwe MicrosoftEdge.exe python.exe python3.exe Skype.exe winget.exe > > /cygdrive/c/Users/SEJONG/AppData/Local/Programs/Microsoft VS Code/bin: code code.cmd > > /cygdrive/c/Program Files (x86)/Microsoft Visual Studio/2019/Community/VC/Tools/MSVC/14.29.30133/bin/Hostx64/x64: 1033 asan_blacklist.txt atlprov.dll bscmake.exe c1.dll c1xx.dll c2.dll cfgpersist.dll cl.exe cl.exe.config clang_rt.asan_dbg_dynamic-x86_64.dll clang_rt.asan_dynamic-x86_64.dll ConcurrencyCheck.dll CppBuildInsights.dll CppBuildInsightsEtw.xml CppCoreCheck.dll cvtres.exe d3dcompiler_47.dll dpcmi.dll dumpbin.exe editbin.exe EnumIndex.dll EspXEngine.dll HResultCheck.dll KernelTraceControl.dll lib.exe link.exe link.exe.config llvm-symbolizer.exe LocalESPC.dll Microsoft.Diagnostics.Tracing.EventSource.dll Microsoft.VisualStudio.RemoteControl.dll Microsoft.VisualStudio.Telemetry.dll Microsoft.VisualStudio.Utilities.Internal.dll ml64.exe msobj140.dll mspdb140.dll mspdbcmf.exe mspdbcore.dll mspdbsrv.exe mspdbst.dll mspft140.dll msvcdis140.dll msvcp140.dll msvcp140_1.dll msvcp140_2.dll msvcp140_atomic_wait.dll msvcp140_codecvt_ids.dll Newtonsoft.Json.dll nmake.exe onecore perf_msvcbuildinsights.dll pgocvt.exe pgodb140.dll pgodriver.sys pgomgr.exe pgort140.dll pgosweep.exe System.Runtime.CompilerServices.Unsafe.dll tbbmalloc.dll undname.exe VariantClear.dll vcmeta.dll vcperf.exe vcruntime140.dll vcruntime140_1.dll vctip.exe xdcmake.exe xdcmake.exe.config > > Warning accessing /cygdrive/c/Users/SEJONG/.dotnet/tools gives errors: [Errno 2] No such file or directory: '/cygdrive/c/Users/SEJONG/.dotnet/tools' > > /usr/lib/lapack: cygblas-0.dll cyglapack-0.dll > > 
============================================================================================= > > TESTING: configureExternalPackagesDir from config.framework(/home/SEJONG/petsc-3.18.1/config/BuildSystem/config/framework.py:1045) > > Set alternative directory external packages are built in > > serialEvaluation: initial cxxDialectRanges ('c++11', 'c++17') > > serialEvaluation: new cxxDialectRanges ('c++11', 'c++17') > > child config.utilities.macosFirewall took 0.000005 seconds > > ============================================================================================= > > TESTING: configureDebuggers from config.utilities.debuggers(/home/SEJONG/petsc-3.18.1/config/BuildSystem/config/utilities/debuggers.py:20) > > Find a default debugger and determine its arguments > > Checking for program /usr/local/bin/gdb...not found > > Checking for program /usr/bin/gdb...not found > > Checking for program /cygdrive/c/SIMULIA/Commands/gdb...not found > > Checking for program /cygdrive/c/Program Files/Microsoft MPI/Bin/gdb...not found > > Checking for program /cygdrive/c/Windows/system32/gdb...not found > > Checking for program /cygdrive/c/Windows/gdb...not found > > Checking for program /cygdrive/c/Windows/System32/Wbem/gdb...not found > > Checking for program /cygdrive/c/Windows/System32/WindowsPowerShell/v1.0/gdb...not found > > Checking for program /cygdrive/c/Windows/System32/OpenSSH/gdb...not found > > Checking for program /cygdrive/c/Program Files/MATLAB/R2020b/bin/gdb...not found > > Checking for program /cygdrive/c/Program Files/Microsoft SQL Server/130/Tools/Binn/gdb...not found > > Checking for program /cygdrive/c/Program Files/Microsoft SQL Server/Client SDK/ODBC/170/Tools/Binn/gdb...not found > > Checking for program /cygdrive/c/Program Files/Git/cmd/gdb...not found > > Checking for program /cygdrive/c/msys64/mingw64/bin/gdb...not found > > Checking for program /cygdrive/c/msys64/usr/bin/gdb...not found > > Checking for program /cygdrive/c/Program Files 
(x86)/Microsoft Visual Studio/2019/Community/VC/Tools/MSVC/14.29.30133/bin/Hostx64/x64/gdb...not found > > Checking for program /cygdrive/c/Program Files/dotnet/gdb...not found > > Checking for program /gdb...not found > > Checking for program /cygdrive/c/Users/SEJONG/AppData/Local/Microsoft/WindowsApps/gdb...not found > > Checking for program /cygdrive/c/Users/SEJONG/AppData/Local/Programs/Microsoft VS Code/bin/gdb...not found > > Checking for program /cygdrive/c/Program Files (x86)/Microsoft Visual Studio/2019/Community/VC/Tools/MSVC/14.29.30133/bin/Hostx64/x64/gdb...not found > > Checking for program /cygdrive/c/Users/SEJONG/.dotnet/tools/gdb...not found > > Checking for program /usr/lib/lapack/gdb...not found > > Checking for program /usr/local/bin/dbx...not found > > Checking for program /usr/bin/dbx...not found > > Checking for program /cygdrive/c/SIMULIA/Commands/dbx...not found > > Checking for program /cygdrive/c/Program Files/Microsoft MPI/Bin/dbx...not found > > Checking for program /cygdrive/c/Windows/system32/dbx...not found > > Checking for program /cygdrive/c/Windows/dbx...not found > > Checking for program /cygdrive/c/Windows/System32/Wbem/dbx...not found > > Checking for program /cygdrive/c/Windows/System32/WindowsPowerShell/v1.0/dbx...not found > > Checking for program /cygdrive/c/Windows/System32/OpenSSH/dbx...not found > > Checking for program /cygdrive/c/Program Files/MATLAB/R2020b/bin/dbx...not found > > Checking for program /cygdrive/c/Program Files/Microsoft SQL Server/130/Tools/Binn/dbx...not found > > Checking for program /cygdrive/c/Program Files/Microsoft SQL Server/Client SDK/ODBC/170/Tools/Binn/dbx...not found > > Checking for program /cygdrive/c/Program Files/Git/cmd/dbx...not found > > Checking for program /cygdrive/c/msys64/mingw64/bin/dbx...not found > > Checking for program /cygdrive/c/msys64/usr/bin/dbx...not found > > Checking for program /cygdrive/c/Program Files (x86)/Microsoft Visual 
Studio/2019/Community/VC/Tools/MSVC/14.29.30133/bin/Hostx64/x64/dbx...not found > > Checking for program /cygdrive/c/Program Files/dotnet/dbx...not found > > Checking for program /dbx...not found > > Checking for program /cygdrive/c/Users/SEJONG/AppData/Local/Microsoft/WindowsApps/dbx...not found > > Checking for program /cygdrive/c/Users/SEJONG/AppData/Local/Programs/Microsoft VS Code/bin/dbx...not found > > Checking for program /cygdrive/c/Program Files (x86)/Microsoft Visual Studio/2019/Community/VC/Tools/MSVC/14.29.30133/bin/Hostx64/x64/dbx...not found > > Checking for program /cygdrive/c/Users/SEJONG/.dotnet/tools/dbx...not found > > Checking for program /usr/lib/lapack/dbx...not found > > Defined make macro "DSYMUTIL" to "true" > > child config.utilities.debuggers took 0.014310 seconds > > ============================================================================================= > > TESTING: configureDirectories from PETSc.options.petscdir(/home/SEJONG/petsc-3.18.1/config/PETSc/options/petscdir.py:22) > > Checks PETSC_DIR and sets if not set > > PETSC_VERSION_RELEASE of 1 indicates the code is from a release branch or a branch created from a release branch. 
> > Version Information: > > #define PETSC_VERSION_RELEASE 1 > > #define PETSC_VERSION_MAJOR 3 > > #define PETSC_VERSION_MINOR 18 > > #define PETSC_VERSION_SUBMINOR 1 > > #define PETSC_VERSION_DATE "Oct 26, 2022" > > #define PETSC_VERSION_GIT "v3.18.1" > > #define PETSC_VERSION_DATE_GIT "2022-10-26 07:57:29 -0500" > > #define PETSC_VERSION_EQ(MAJOR,MINOR,SUBMINOR) \ > > #define PETSC_VERSION_ PETSC_VERSION_EQ > > #define PETSC_VERSION_LT(MAJOR,MINOR,SUBMINOR) \ > > #define PETSC_VERSION_LE(MAJOR,MINOR,SUBMINOR) \ > > #define PETSC_VERSION_GT(MAJOR,MINOR,SUBMINOR) \ > > #define PETSC_VERSION_GE(MAJOR,MINOR,SUBMINOR) \ > > child PETSc.options.petscdir took 0.015510 seconds > > ============================================================================================= > > TESTING: getDatafilespath from PETSc.options.dataFilesPath(/home/SEJONG/petsc-3.18.1/config/PETSc/options/dataFilesPath.py:29) > > Checks what DATAFILESPATH should be > > child PETSc.options.dataFilesPath took 0.002462 seconds > > ============================================================================================= > > TESTING: configureGit from config.sourceControl(/home/SEJONG/petsc-3.18.1/config/BuildSystem/config/sourceControl.py:24) > > Find the Git executable > > Checking for program /usr/local/bin/git...not found > > Checking for program /usr/bin/git...found > > Defined make macro "GIT" to "git" > > Executing: git --version > > stdout: git version 2.38.1 > > ============================================================================================= > > TESTING: configureMercurial from config.sourceControl(/home/SEJONG/petsc-3.18.1/config/BuildSystem/config/sourceControl.py:35) > > Find the Mercurial executable > > Checking for program /usr/local/bin/hg...not found > > Checking for program /usr/bin/hg...not found > > Checking for program /cygdrive/c/SIMULIA/Commands/hg...not found > > Checking for program /cygdrive/c/Program Files/Microsoft MPI/Bin/hg...not found > > Checking for 
program /cygdrive/c/Windows/system32/hg...not found > > Checking for program /cygdrive/c/Windows/hg...not found > > Checking for program /cygdrive/c/Windows/System32/Wbem/hg...not found > > Checking for program /cygdrive/c/Windows/System32/WindowsPowerShell/v1.0/hg...not found > > Checking for program /cygdrive/c/Windows/System32/OpenSSH/hg...not found > > Checking for program /cygdrive/c/Program Files/MATLAB/R2020b/bin/hg...not found > > Checking for program /cygdrive/c/Program Files/Microsoft SQL Server/130/Tools/Binn/hg...not found > > Checking for program /cygdrive/c/Program Files/Microsoft SQL Server/Client SDK/ODBC/170/Tools/Binn/hg...not found > > Checking for program /cygdrive/c/Program Files/Git/cmd/hg...not found > > Checking for program /cygdrive/c/msys64/mingw64/bin/hg...not found > > Checking for program /cygdrive/c/msys64/usr/bin/hg...not found > > Checking for program /cygdrive/c/Program Files (x86)/Microsoft Visual Studio/2019/Community/VC/Tools/MSVC/14.29.30133/bin/Hostx64/x64/hg...not found > > Checking for program /cygdrive/c/Program Files/dotnet/hg...not found > > Checking for program /hg...not found > > Checking for program /cygdrive/c/Users/SEJONG/AppData/Local/Microsoft/WindowsApps/hg...not found > > Checking for program /cygdrive/c/Users/SEJONG/AppData/Local/Programs/Microsoft VS Code/bin/hg...not found > > Checking for program /cygdrive/c/Program Files (x86)/Microsoft Visual Studio/2019/Community/VC/Tools/MSVC/14.29.30133/bin/Hostx64/x64/hg...not found > > Checking for program /cygdrive/c/Users/SEJONG/.dotnet/tools/hg...not found > > Checking for program /usr/lib/lapack/hg...not found > > Checking for program /home/SEJONG/petsc-3.18.1/lib/petsc/bin/win32fe/hg...not found > > child config.sourceControl took 0.121914 seconds > > ============================================================================================= > > TESTING: configureInstallationMethod from 
PETSc.options.petscclone(/home/SEJONG/petsc-3.18.1/config/PETSc/options/petscclone.py:20) > > Determine if PETSc was obtained via git or a tarball > > This is a tarball installation > > child PETSc.options.petscclone took 0.003125 seconds > > ============================================================================================= > > TESTING: setNativeArchitecture from PETSc.options.arch(/home/SEJONG/petsc-3.18.1/config/PETSc/options/arch.py:29) > > Forms the arch as GNU's configure would form it > > ============================================================================================= > > TESTING: configureArchitecture from PETSc.options.arch(/home/SEJONG/petsc-3.18.1/config/PETSc/options/arch.py:42) > > Checks if PETSC_ARCH is set and sets it if not set > > No previous hashfile found > > Setting hashfile: arch-mswin-c-debug/lib/petsc/conf/configure-hash > > Deleting configure hash file: arch-mswin-c-debug/lib/petsc/conf/configure-hash > > Unable to delete configure hash file: arch-mswin-c-debug/lib/petsc/conf/configure-hash > > child PETSc.options.arch took 0.149094 seconds > > ============================================================================================= > > TESTING: setInstallDir from PETSc.options.installDir(/home/SEJONG/petsc-3.18.1/config/PETSc/options/installDir.py:31) > > Set installDir to either prefix or if that is not set to PETSC_DIR/PETSC_ARCH > > Defined make macro "PREFIXDIR" to "/home/SEJONG/petsc-3.18.1/arch-mswin-c-debug" > > ============================================================================================= > > TESTING: saveReconfigure from PETSc.options.installDir(/home/SEJONG/petsc-3.18.1/config/PETSc/options/installDir.py:76) > > Save the configure options in a script in PETSC_ARCH/lib/petsc/conf so the same configure may be easily re-run > > ============================================================================================= > > TESTING: cleanConfDir from 
PETSc.options.installDir(/home/SEJONG/petsc-3.18.1/config/PETSc/options/installDir.py:68) > > Remove all the files from configuration directory for this PETSC_ARCH, from --with-clean option > > ============================================================================================= > > TESTING: configureInstallDir from PETSc.options.installDir(/home/SEJONG/petsc-3.18.1/config/PETSc/options/installDir.py:52) > > Makes installDir subdirectories if it does not exist for both prefix install location and PETSc work install location > > Changed persistence directory to /home/SEJONG/petsc-3.18.1/arch-mswin-c-debug/lib/petsc/conf > > > > TESTING: restoreReconfigure from PETSc.options.installDir(/home/SEJONG/petsc-3.18.1/config/PETSc/options/installDir.py:90) > > If --with-clean was requested but restoring the reconfigure file was requested then restore it > > child PETSc.options.installDir took 0.006476 seconds > > ============================================================================================= > > TESTING: setExternalPackagesDir from PETSc.options.externalpackagesdir(/home/SEJONG/petsc-3.18.1/config/PETSc/options/externalpackagesdir.py:15) > > Set location where external packages will be downloaded to > > ============================================================================================= > > TESTING: cleanExternalpackagesDir from PETSc.options.externalpackagesdir(/home/SEJONG/petsc-3.18.1/config/PETSc/options/externalpackagesdir.py:23) > > Remove all downloaded external packages, from --with-clean > > child PETSc.options.externalpackagesdir took 0.000990 seconds > > ============================================================================================= > > TESTING: configureCLanguage from PETSc.options.languages(/home/SEJONG/petsc-3.18.1/config/PETSc/options/languages.py:28) > > Choose whether to compile the PETSc library using a C or C++ compiler > > C language is C > > Defined "CLANGUAGE_C" to "1" > > Defined make macro "CLANGUAGE" to 
"C" > > child PETSc.options.languages took 0.003172 seconds > > ============================================================================================= > > TESTING: resetEnvCompilers from config.setCompilers(/home/SEJONG/petsc-3.18.1/config/BuildSystem/config/setCompilers.py:2652) > > Remove compilers from the shell environment so they do not interfer with testing > > ============================================================================================= > > TESTING: checkEnvCompilers from config.setCompilers(/home/SEJONG/petsc-3.18.1/config/BuildSystem/config/setCompilers.py:2669) > > Set configure compilers from the environment, from -with-environment-variables > > ============================================================================================= > > TESTING: checkMPICompilerOverride from config.setCompilers(/home/SEJONG/petsc-3.18.1/config/BuildSystem/config/setCompilers.py:2622) > > Check if --with-mpi-dir is used along with CC CXX or FC compiler options. > > This usually prevents mpi compilers from being used - so issue a warning > > ============================================================================================= > > TESTING: requireMpiLdPath from config.setCompilers(/home/SEJONG/petsc-3.18.1/config/BuildSystem/config/setCompilers.py:2643) > > OpenMPI wrappers require LD_LIBRARY_PATH set > > ============================================================================================= > > TESTING: checkInitialFlags from config.setCompilers(/home/SEJONG/petsc-3.18.1/config/BuildSystem/config/setCompilers.py:723) > > Initialize the compiler and linker flags > > Initialized CFLAGS to > > Initialized CFLAGS to > > Initialized LDFLAGS to > > Initialized CUDAFLAGS to > > Initialized CUDAFLAGS to > > Initialized LDFLAGS to > > Initialized HIPFLAGS to > > Initialized HIPFLAGS to > > Initialized LDFLAGS to > > Initialized SYCLFLAGS to > > Initialized SYCLFLAGS to > > Initialized LDFLAGS to > > Initialized CXXFLAGS to > > Initialized 
CXX_CXXFLAGS to > > Initialized LDFLAGS to > > Initialized FFLAGS to > > Initialized FFLAGS to > > Initialized LDFLAGS to > > Initialized CPPFLAGS to > > Initialized FPPFLAGS to > > Initialized CUDAPPFLAGS to -Wno-deprecated-gpu-targets > > Initialized CXXPPFLAGS to > > Initialized HIPPPFLAGS to > > Initialized SYCLPPFLAGS to > > Initialized CC_LINKER_FLAGS to [] > > Initialized CXX_LINKER_FLAGS to [] > > Initialized FC_LINKER_FLAGS to [] > > Initialized CUDAC_LINKER_FLAGS to [] > > Initialized HIPC_LINKER_FLAGS to [] > > Initialized SYCLC_LINKER_FLAGS to [] > > > > TESTING: checkCCompiler from config.setCompilers(/home/SEJONG/petsc-3.18.1/config/BuildSystem/config/setCompilers.py:1341) > > Locate a functional C compiler > > Checking for program /usr/local/bin/mpicc...not found > > Checking for program /usr/bin/mpicc...found > > Defined make macro "CC" to "mpicc" > > Executing: mpicc -c -o /tmp/petsc-uqt11yqc/config.setCompilers/conftest.o -I/tmp/petsc-uqt11yqc/config.setCompilers /tmp/petsc-uqt11yqc/config.setCompilers/conftest.c > > Successful compile: > > Source: > > #include "confdefs.h" > > #include "conffix.h" > > > > int main() { > > ; > > return 0; > > } > > > > Executing: mpicc -c -o /tmp/petsc-uqt11yqc/config.setCompilers/conftest.o -I/tmp/petsc-uqt11yqc/config.setCompilers /tmp/petsc-uqt11yqc/config.setCompilers/conftest.c > > Successful compile: > > Source: > > #include "confdefs.h" > > #include "conffix.h" > > > > int main() { > > ; > > return 0; > > } > > > > Executing: mpicc -o /tmp/petsc-uqt11yqc/config.setCompilers/conftest.exe /tmp/petsc-uqt11yqc/config.setCompilers/conftest.o > > Possible ERROR while running linker: exit code 1 > > stderr: > > /usr/lib/gcc/x86_64-pc-cygwin/11/../../../../x86_64-pc-cygwin/bin/ld: cannot find -lhwloc: No such file or directory > > /usr/lib/gcc/x86_64-pc-cygwin/11/../../../../x86_64-pc-cygwin/bin/ld: cannot find -levent_core: No such file or directory > > 
/usr/lib/gcc/x86_64-pc-cygwin/11/../../../../x86_64-pc-cygwin/bin/ld: cannot find -levent_pthreads: No such file or directory > > /usr/lib/gcc/x86_64-pc-cygwin/11/../../../../x86_64-pc-cygwin/bin/ld: cannot find -lz: No such file or directory > > collect2: error: ld returned 1 exit status > > Linker output before filtering: > > > > /usr/lib/gcc/x86_64-pc-cygwin/11/../../../../x86_64-pc-cygwin/bin/ld: cannot find -lhwloc: No such file or directory > > /usr/lib/gcc/x86_64-pc-cygwin/11/../../../../x86_64-pc-cygwin/bin/ld: cannot find -levent_core: No such file or directory > > /usr/lib/gcc/x86_64-pc-cygwin/11/../../../../x86_64-pc-cygwin/bin/ld: cannot find -levent_pthreads: No such file or directory > > /usr/lib/gcc/x86_64-pc-cygwin/11/../../../../x86_64-pc-cygwin/bin/ld: cannot find -lz: No such file or directory > > collect2: error: ld returned 1 exit status > > : > > Linker output after filtering: > > /usr/lib/gcc/x86_64-pc-cygwin/11/../../../../x86_64-pc-cygwin/bin/ld: cannot find -lhwloc: No such file or directory > > /usr/lib/gcc/x86_64-pc-cygwin/11/../../../../x86_64-pc-cygwin/bin/ld: cannot find -levent_core: No such file or directory > > /usr/lib/gcc/x86_64-pc-cygwin/11/../../../../x86_64-pc-cygwin/bin/ld: cannot find -levent_pthreads: No such file or directory > > /usr/lib/gcc/x86_64-pc-cygwin/11/../../../../x86_64-pc-cygwin/bin/ld: cannot find -lz: No such file or directory > > collect2: error: ld returned 1 exit status: > > Error testing C compiler: Cannot compile/link C with mpicc. > > MPI compiler wrapper mpicc failed to compile > > Executing: mpicc -show > > stdout: gcc -L/usr/lib -lmpi -lopen-rte -lopen-pal -lhwloc -levent_core -levent_pthreads -lz > > MPI compiler wrapper mpicc is likely incorrect. > > Use --with-mpi-dir to indicate an alternate MPI. 
> > Deleting "CC" > > ******************************************************************************* > > UNABLE to CONFIGURE with GIVEN OPTIONS (see configure.log for details): > > ------------------------------------------------------------------------------- > > C compiler you provided with -with-cc=mpicc cannot be found or does not work. > > Cannot compile/link C with mpicc. > > ******************************************************************************* > > File "/home/SEJONG/petsc-3.18.1/config/configure.py", line 461, in petsc_configure > > framework.configure(out = sys.stdout) > > File "/home/SEJONG/petsc-3.18.1/config/BuildSystem/config/framework.py", line 1412, in configure > > self.processChildren() > > File "/home/SEJONG/petsc-3.18.1/config/BuildSystem/config/framework.py", line 1400, in processChildren > > self.serialEvaluation(self.childGraph) > > File "/home/SEJONG/petsc-3.18.1/config/BuildSystem/config/framework.py", line 1375, in serialEvaluation > > child.configure() > > File "/home/SEJONG/petsc-3.18.1/config/BuildSystem/config/setCompilers.py", line 2712, in configure > > self.executeTest(self.checkCCompiler) > > File "/home/SEJONG/petsc-3.18.1/config/BuildSystem/config/base.py", line 138, in executeTest > > ret = test(*args,**kargs) > > File "/home/SEJONG/petsc-3.18.1/config/BuildSystem/config/setCompilers.py", line 1346, in checkCCompiler > > for compiler in self.generateCCompilerGuesses(): > > File "/home/SEJONG/petsc-3.18.1/config/BuildSystem/config/setCompilers.py", line 1274, in generateCCompilerGuesses > > raise RuntimeError('C compiler you provided with -with-cc='+self.argDB['with-cc']+' cannot be found or does not work.'+'\n'+self.mesg) > > ================================================================================ > > Finishing configure run at Tue, 01 Nov 2022 13:06:09 +0900 > > > > -----Original Message----- > > From: Satish Balay > > Sent: Tuesday, November 1, 2022 11:36 AM > > To: Mohammad Ali Yaqteen > > Cc: petsc-users > > 
Subject: RE: [petsc-users] PETSc Windows Installation > > > > you'll have to send configure.log for this failure > > > > Satish > > > > > > On Tue, 1 Nov 2022, Mohammad Ali Yaqteen wrote: > > > > > I have checked the required Cygwin openmpi libraries and they are all installed. When I run ./configure --with-cc=mpicc --with-cxx=mpicxx --with-fc=mpif90, it returns: > > > > > > $ ./configure --with-cc=mpicc --with-cxx=mpicxx --with-fc=mpif90 > > > ============================================================================================= > > > Configuring PETSc to compile on your system > > > ====================================================================== > > > ======================= > > > TESTING: checkCCompiler from config.setCompilers(config/BuildSystem/config/setCompilers.py:1341)******************************************************************************* > > > UNABLE to CONFIGURE with GIVEN OPTIONS (see configure.log for details): > > > ---------------------------------------------------------------------- > > > --------- C compiler you provided with -with-cc=mpicc cannot be found > > > or does not work. > > > Cannot compile/link C with mpicc. > > > > > > As for the case of WSL2, I will try to install that on my PC. > > > Meanwhile, could you please look into this issue > > > > > > Thank you > > > > > > Ali > > > > > > -----Original Message----- > > > From: Satish Balay > > > Sent: Monday, October 31, 2022 10:56 PM > > > To: Satish Balay via petsc-users > > > Cc: Matthew Knepley ; Mohammad Ali Yaqteen > > > > > > Subject: Re: [petsc-users] PETSc Windows Installation > > > > > > BTW: If you have WSL2 on windows - it might be easier to build/use PETSc. 
> > > > > > > > Satish > > > On Mon, 31 Oct 2022, Satish Balay via petsc-users wrote: > > > > Make sure you have cygwin openmpi installed [and cygwin blas/lapack] > > > > > > > > $ cygcheck -cd |grep openmpi > > > > libopenmpi-devel 4.1.2-1 > > > > libopenmpi40 4.1.2-1 > > > > libopenmpifh40 4.1.2-1 > > > > libopenmpiusef08_40 4.1.2-1 > > > > libopenmpiusetkr40 4.1.2-1 > > > > openmpi 4.1.2-1 > > > > $ cygcheck -cd |grep lapack > > > > liblapack-devel 3.10.1-1 > > > > liblapack0 3.10.1-1 > > > > > > > > > > > > > ./configure --with-cc=gcc --with-cxx=0 --with-fc=0 > > > > > --download-f2cblaslapack > > > > > > > > Should be: > > > > > > > > > > $ ./configure --download-scalapack --download-mumps > > > > > > > > i.e [default] --with-cc=mpicc --with-cxx=mpicxx --with-fc=mpif90 [and > > > > default cygwin blas/lapack] > > > > > > > > Satish > > > > > > > > > > > > On Mon, 31 Oct 2022, Matthew Knepley wrote: > > > > > > > > > On Mon, Oct 31, 2022 at 1:56 AM Mohammad Ali Yaqteen > > > > > > > > > > wrote: > > > > > > > > > > > Dear Satish > > > > > > > > > > > > When I configure PETSc with (./configure --with-cc=gcc > > > > > > --with-cxx=0 > > > > > > --with-fc=0 --download-f2cblaslapack) it runs as I shared > > > > > > initially which you said is not an issue anymore.
But when I add > > > > > > (--download-scalapack > > > > > > --download-mumps) or configure with these later, it gives the > > > > > > following > > > > > > error: > > > > > > > > > > > > $ ./configure --download-scalapack --download-mumps > > > > > > > > > > > > ============================================================================================= > > > > > > Configuring PETSc to compile on your > > > > > > system > > > > > > > > > > > > ================================================================ > > > > > > == > > > > > > =========================== > > > > > > TESTING: FortranMPICheck from > > > > > > config.packages.MPI(config/BuildSystem/config/packages/MPI.py:614)******************************************************************************* > > > > > > UNABLE to CONFIGURE with GIVEN OPTIONS (see configure.log for > > > > > > details): > > > > > > > > > > > > ---------------------------------------------------------------- > > > > > > -- > > > > > > ------------- Fortran error! mpi_init() could not be located! > > > > > > > > > > > > **************************************************************** > > > > > > ** > > > > > > ************* > > > > > > > > > > > > What could be the problem here? > > > > > > > > > > > > > > > > Without configure.log we cannot tell what went wrong. However, > > > > > from the error message, I would guess that your MPI was not built > > > > > with Fortran bindings. You need these for those packages. > > > > > > > > > > Thanks, > > > > > > > > > > Matt > > > > > > > > > > > > > > > > Your help is highly appreciated. 
> > > > > > > > > > > > Thank you > > > > > > Ali > > > > > > -----Original Message----- > > > > > > From: Satish Balay > > > > > > Sent: Saturday, October 29, 2022 2:11 PM > > > > > > To: Mohammad Ali Yaqteen > > > > > > Cc: Matthew Knepley ; petsc-users at mcs.anl.gov > > > > > > Subject: Re: [petsc-users] PETSc Windows Installation > > > > > > > > > > > > On Sat, 29 Oct 2022, Mohammad Ali Yaqteen wrote: > > > > > > > > > > > > > I haven't accessed PETSC or given any command of my own. I was > > > > > > > just > > > > > > installing by following the instructions. I don't know why it is > > > > > > attaching the debugger. Although it says "Possible error running > > > > > > C/C++ > > > > > > src/snes/tutorials/ex19 with 1 MPI process" which I think is > > > > > > indicating of missing of MPI! > > > > > > > > > > > > The diff is not smart enough to detect the extra message from > > > > > > cygwin/OpenMPI - hence it assumes there is a potential problem - > > > > > > and prints the above message. > > > > > > > > > > > > But you can assume its installed properly - and use it.
> > > > > > > > > > > > Satish > > > > > > > > > > > > > > From: Matthew Knepley > > > > > > > Sent: Friday, October 28, 2022 10:31 PM > > > > > > > To: Mohammad Ali Yaqteen > > > > > > > Cc: petsc-users at mcs.anl.gov > > > > > > > Subject: Re: [petsc-users] PETSc Windows Installation > > > > > > > > > > > > > > On Fri, Oct 28, 2022 at 9:11 AM Mohammad Ali Yaqteen < > > > > > > mhyaqteen at sju.ac.kr> wrote: > > > > > > > Dear Sir, > > > > > > > > > > > > > > During the Installation of PETSc in windows, I installed > > > > > > > Cygwin and the > > > > > > required libraries as mentioned on your website: > > > > > > > [cid:image001.png at 01D8EB93.7C17E410] > > > > > > > However, when I install PETSc using the configure commands > > > > > > > present on > > > > > > the petsc website: > > > > > > > > > > > > > > ./configure --with-cc=gcc --with-cxx=0 --with-fc=0 > > > > > > --download-f2cblaslapack --download-mpich > > > > > > > > > > > > > > it gives me the following error: > > > > > > > > > > > > > > [cid:image002.png at 01D8EB93.7C17E410] > > > > > > > > > > > > > > I already installed OpenMPI using Cygwin installer but it > > > > > > > still asks me > > > > > > to. When I configure without "--download-mpich" and run "make check" > > > > > > command, it gives me the following errors: > > > > > > > > > > > > > > [cid:image003.png at 01D8EB93.7C17E410] > > > > > > > > > > > > > > Could you kindly look into this and help me with this? Your > > > > > > > prompt > > > > > > response will highly be appreciated. > > > > > > > > > > > > > > The runs look fine. > > > > > > > > > > > > > > The test should not try to attach the debugger. Do you have > > > > > > > that in the > > > > > > PETSC_OPTIONS env variable? > > > > > > > > > > > > > > Thanks, > > > > > > > > > > > > > > Matt > > > > > > > > > > > > > > Thank you!
> > > > > > > Mohammad Ali > > > > > > > Researcher, Sejong University > > > > > > > > > > > > > > > > > > > > > -- > > > > > > > What most experimenters take for granted before they begin > > > > > > > their > > > > > > experiments is infinitely more interesting than any results to > > > > > > which their experiments lead. > > > > > > > -- Norbert Wiener > > > > > > > > > > > > > > https://www.cse.buffalo.edu/~knepley/< > > > > > > http://www.cse.buffalo.edu/~knepley/> > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > From alexlindsay239 at gmail.com Tue Nov 1 23:19:04 2022 From: alexlindsay239 at gmail.com (Alexander Lindsay) Date: Tue, 1 Nov 2022 23:19:04 -0500 Subject: [petsc-users] Field split degree of freedom ordering Message-ID: In the block matrices documentation, it's stated: "Note that for interlaced storage the number of rows/columns of each block must be the same size" Is interlacing defined in a global sense, or a process-local sense? So explicitly, if I don't want the same size restriction, do I need to ensure that globally all of my block 1 dofs are numbered after my block 0 dofs? Or do I need to follow that on a process-local level? Essentially in libMesh we always follow rank-major ordering. I'm asking whether for unequal row sizes, in order to split, would we need to strictly follow variable-major ordering (splitting here meaning splitting by variable)? Alex -------------- next part -------------- An HTML attachment was scrubbed... URL: From jed at jedbrown.org Tue Nov 1 23:57:23 2022 From: jed at jedbrown.org (Jed Brown) Date: Tue, 01 Nov 2022 22:57:23 -0600 Subject: [petsc-users] Field split degree of freedom ordering In-Reply-To: References: Message-ID: <87r0ym9cng.fsf@jedbrown.org> In most circumstances, you can and should interlace in some form such that each block in fieldsplit is distributed across all ranks. If you interlace at scalar granularity as described, then each block needs to be able to do that. 
So for the Stokes equations with equal order elements (like P1-P1 stabilized), you can interlace (u,v,w,p), but for mixed elements (like Q2-P1^discontinuous) you can't interlace in that way. You can still distribute pressure and velocity over all processes, but will need index sets to identify the velocity-pressure splits. Alexander Lindsay writes: > In the block matrices documentation, it's stated: "Note that for interlaced > storage the number of rows/columns of each block must be the same size" Is > interlacing defined in a global sense, or a process-local sense? So > explicitly, if I don't want the same size restriction, do I need to ensure > that globally all of my block 1 dofs are numbered after my block 0 dofs? Or > do I need to follow that on a process-local level? Essentially in libMesh > we always follow rank-major ordering. I'm asking whether for unequal row > sizes, in order to split, would we need to strictly follow variable-major > ordering (splitting here meaning splitting by variable)? > > Alex From stephan.koehler at math.tu-freiberg.de Wed Nov 2 05:52:59 2022 From: stephan.koehler at math.tu-freiberg.de (Stephan Köhler) Date: Wed, 2 Nov 2022 11:52:59 +0100 Subject: [petsc-users] Bug report LMVM matrix class Message-ID: <6497cc1f-52ca-251b-280f-d320603aab2c@math.tu-freiberg.de> Dear PETSc/Tao team, it seems to be that there is a bug in the LMVM matrix class: In the function MatCreateVecs_LMVM, see, e.g., https://petsc.org/release/src/ksp/ksp/utils/lmvm/lmvmimpl.c.html at line 214. It is not checked if the vectors *L, or *R are NULL. This is, in particular, a problem if this matrix class is combined with the Schur complement matrix class, since MatMult_SchurComplement calls this function with NULL as *R, see, e.g. https://petsc.org/release/src/ksp/ksp/utils/schurm/schurm.c.html at line 66. I attach a minimal example. You need to modify the paths to the PETSc installation in the makefile.
Best regards Stephan Köhler -- Stephan Köhler TU Bergakademie Freiberg Institut für numerische Mathematik und Optimierung Akademiestraße 6 09599 Freiberg Gebäudeteil Mittelbau, Zimmer 2.07 Telefon: +49 (0)3731 39-3173 (Büro) -------------- next part -------------- A non-text attachment was scrubbed... Name: Minimal_example_schur_lmvm.tar.gz Type: application/gzip Size: 1547 bytes Desc: not available URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: OpenPGP_0xC9BF2C20DFE9F713.asc Type: application/pgp-keys Size: 758 bytes Desc: OpenPGP public key URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: OpenPGP_signature Type: application/pgp-signature Size: 236 bytes Desc: OpenPGP digital signature URL: From alexlindsay239 at gmail.com Wed Nov 2 07:52:01 2022 From: alexlindsay239 at gmail.com (Alexander Lindsay) Date: Wed, 2 Nov 2022 07:52:01 -0500 Subject: [petsc-users] Field split degree of freedom ordering In-Reply-To: <87r0ym9cng.fsf@jedbrown.org> References: <87r0ym9cng.fsf@jedbrown.org> Message-ID: So, in the latter case, IIUC we can maintain how we distribute data among the processes (partitioning of elements) such that with respect to a `-ksp_view_pmat` nothing changes and our velocity and pressure dofs are interlaced on a global scale (e.g. each process has some velocity and pressure dofs) ... but in order to leverage field split we need those index
You can still > distribute pressure and velocity over all processes, but will need index > sets to identify the velocity-pressure splits. > > Alexander Lindsay writes: > > In the block matrices documentation, it's stated: "Note that for > interlaced > storage the number of rows/columns of each block must be the same size" > Is > > interlacing defined in a global sense, or a process-local sense? So > > explicitly, if I don't want the same size restriction, do I need to > ensure > that globally all of my block 1 dofs are numbered after my block 0 dofs? > Or > > do I need to follow that on a process-local level? Essentially in libMesh > > we always follow rank-major ordering. I'm asking whether for unequal row > > sizes, in order to split, would we need to strictly follow variable-major > > ordering (splitting here meaning splitting by variable)? > > > > Alex > -------------- next part -------------- An HTML attachment was scrubbed... URL: From yc17470 at connect.um.edu.mo Wed Nov 2 07:56:50 2022 From: yc17470 at connect.um.edu.mo (Gong Yujie) Date: Wed, 2 Nov 2022 12:56:50 +0000 Subject: [petsc-users] Can I use PETSc DMPlex to output the surface mesh? Message-ID: Dear development team, Now I'm doing a project about visualization. In the process of visualization, the surface mesh is preferred. I have two questions about the DMPlex mesh. 1. Can I output the 3D volume mesh in DMPlex as a .obj or .fbx file? Both of these formats are surface meshes only. 2. If not, can I output just part of the mesh? Since I have some labels for this part of mesh, can I output this part of mesh (boundary surface mesh) separately? Best Regards, Jerry -------------- next part -------------- An HTML attachment was scrubbed...
URL: From jed at jedbrown.org Wed Nov 2 08:06:40 2022 From: jed at jedbrown.org (Jed Brown) Date: Wed, 02 Nov 2022 07:06:40 -0600 Subject: [petsc-users] Field split degree of freedom ordering In-Reply-To: References: <87r0ym9cng.fsf@jedbrown.org> Message-ID: <87edula4kf.fsf@jedbrown.org> Yes, the normal approach is to partition your mesh once, then for each field, resolve ownership of any interface dofs with respect to the element partition (so shared vertex velocity can land on any process that owns an adjacent element, though even this isn't strictly necessary). Alexander Lindsay writes: > So, in the latter case, IIUC we can maintain how we distribute data among > the processes (partitioning of elements) such that with respect to a > `-ksp_view_pmat` nothing changes and our velocity and pressure dofs are > interlaced on a global scale (e.g. each process has some velocity and > pressure dofs) ... but in order to leverage field split we need those index > sets in order to avoid the equal size constraint? > > On Tue, Nov 1, 2022 at 11:57 PM Jed Brown wrote: > >> In most circumstances, you can and should interlace in some form such that >> each block in fieldsplit is distributed across all ranks. If you interlace >> at scalar granularity as described, then each block needs to be able to do >> that. So for the Stokes equations with equal order elements (like P1-P1 >> stabilized), you can interlace (u,v,w,p), but for mixed elements (like >> Q2-P1^discontinuous) you can't interlace in that way. You can still >> distribute pressure and velocity over all processes, but will need index >> sets to identify the velocity-pressure splits. >> >> Alexander Lindsay writes: >> >> > In the block matrices documentation, it's stated: "Note that for >> interlaced >> > storage the number of rows/columns of each block must be the same size" >> Is >> > interlacing defined in a global sense, or a process-local sense? 
So >> > explicitly, if I don't want the same size restriction, do I need to >> ensure >> > that globally all of my block 1 dofs are numbered after my block 0 dofs? >> Or >> > do I need to follow that on a process-local level? Essentially in libMesh >> > we always follow rank-major ordering. I'm asking whether for unequal row >> > sizes, in order to split, would we need to strictly follow variable-major >> > ordering (splitting here meaning splitting by variable)? >> > >> > Alex >> From knepley at gmail.com Wed Nov 2 08:19:16 2022 From: knepley at gmail.com (Matthew Knepley) Date: Wed, 2 Nov 2022 09:19:16 -0400 Subject: [petsc-users] Can I use PETSc DMPlex to output the surface mesh? In-Reply-To: References: Message-ID: On Wed, Nov 2, 2022 at 8:57 AM Gong Yujie wrote: > Dear development team, > > Now I'm doing a project about visualization. In the process of > visualization, the surface mesh is preferred. I have two questions about > the DMPlex mesh. > > > 1. Can I output the 3D volume mesh in DMPlex as a .obj or .fbx file? > Both these two meshes are just surface mesh. > > 1) These are undocumented, proprietary formats. We would have to link Autodesk in order to write them. > > 1. If not, can I output part of the mesh out? Since I have some labels > for this part of mesh, can I output this part of mesh (boundary surface > mesh) separately? > > 2) Yes, you can output the surface or the volume to VTK or HDF5. To get just the surface, you can use https://petsc.org/main/docs/manualpages/DMPlex/DMPlexCreateSubmesh/ with the boundary label. Thanks, Matt > Best Regards, > Jerry > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From bsmith at petsc.dev Wed Nov 2 13:04:24 2022 From: bsmith at petsc.dev (Barry Smith) Date: Wed, 2 Nov 2022 14:04:24 -0400 Subject: [petsc-users] Report Bug TaoALMM class In-Reply-To: <4eec06f9-d534-7a02-9abe-6d1415f663f0@math.tu-freiberg.de> References: <4eec06f9-d534-7a02-9abe-6d1415f663f0@math.tu-freiberg.de> Message-ID: Stephan, I have located the troublesome line: TaoSetUp_ALMM() has the line auglag->Px = tao->solution; and in almm.h it has Vec Px, LgradX, Ce, Ci, G; /* aliased vectors (do not destroy!) */ Now auglag->Px in some situations aliases auglag->P, and in some cases auglag->Px serves to hold a portion of auglag->P. So then in TaoALMMSubsolverObjective_Private() the lines PetscCall(VecCopy(P, auglag->P)); PetscCall((*auglag->sub_obj)(auglag->parent)); cause, just as you said, tao->solution to be overwritten by the P at which the objective function is being computed. In other words, the solution of the outer Tao is aliased with the solution of the inner Tao, by design. You are definitely correct, the use of TaoALMMSubsolverObjective_Private and TaoALMMSubsolverObjectiveAndGradient_Private in a line search would be problematic. I am not an expert at these methods or their implementations. Could you point to an actual use case within Tao that triggers the problem? Is there a set of command line options or code calls to Tao that fail due to this "design feature"? Within the standard use of ALMM I do not see how the objective function would be used within a line search. The TaoSolve_ALMM() code is self-correcting in that if a trust region check fails it automatically rolls back the solution.
Barry > On Oct 28, 2022, at 4:27 AM, Stephan Köhler wrote: > > Dear PETSc/Tao team, > > it seems that there is a bug in the TaoALMM class: > > In the methods TaoALMMSubsolverObjective_Private and TaoALMMSubsolverObjectiveAndGradient_Private the vector where the function value for the augmented Lagrangian is evaluated > is copied into the current solution, see, e.g., https://petsc.org/release/src/tao/constrained/impls/almm/almm.c.html line 672 or 682. This causes the subsolver routine to not converge if the line search for the subsolver rejects the step length 1. for some > update. In detail: > > Suppose the current iterate is xk and the current update is dxk. The line search evaluates the augmented Lagrangian now at (xk + dxk). This causes the value (xk + dxk) to be copied into the current solution. If the point (xk + dxk) is rejected, the line search should > try the point (xk + alpha * dxk), where alpha < 1. But due to the copying, what happens is that the point ((xk + dxk) + alpha * dxk) is evaluated, see, e.g., https://petsc.org/release/src/tao/linesearch/impls/armijo/armijo.c.html line 191. > > Best regards > Stephan Köhler > > -- > Stephan Köhler > TU Bergakademie Freiberg > Institut für numerische Mathematik und Optimierung > > Akademiestraße 6 > 09599 Freiberg > Gebäudeteil Mittelbau, Zimmer 2.07 > > Telefon: +49 (0)3731 39-3173 (Büro) > > From bsmith at petsc.dev Wed Nov 2 16:15:29 2022 From: bsmith at petsc.dev (Barry Smith) Date: Wed, 2 Nov 2022 17:15:29 -0400 Subject: [petsc-users] Bug report LMVM matrix class In-Reply-To: <6497cc1f-52ca-251b-280f-d320603aab2c@math.tu-freiberg.de> References: <6497cc1f-52ca-251b-280f-d320603aab2c@math.tu-freiberg.de> Message-ID: Thanks for the bug report with the reproducing example.
I have a fix in https://gitlab.com/petsc/petsc/-/merge_requests/5797 Barry > On Nov 2, 2022, at 6:52 AM, Stephan Köhler wrote: > > Dear PETSc/Tao team, > > it seems that there is a bug in the LMVM matrix class: > > In the function MatCreateVecs_LMVM, see, e.g., https://petsc.org/release/src/ksp/ksp/utils/lmvm/lmvmimpl.c.html at line 214, > it is not checked whether the vectors *L or *R are NULL. This is, in particular, a problem if this matrix class is combined with the Schur complement matrix class, since MatMult_SchurComplement > calls this function with NULL as *R, see, e.g., https://petsc.org/release/src/ksp/ksp/utils/schurm/schurm.c.html at line 66. > > I attach a minimal example. You need to modify the paths to the PETSc installation in the makefile. > > Best regards > Stephan Köhler > > -- > Stephan Köhler > TU Bergakademie Freiberg > Institut für numerische Mathematik und Optimierung > > Akademiestraße 6 > 09599 Freiberg > Gebäudeteil Mittelbau, Zimmer 2.07 > > Telefon: +49 (0)3731 39-3173 (Büro) > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From eschnetter at perimeterinstitute.ca Wed Nov 2 18:28:55 2022 From: eschnetter at perimeterinstitute.ca (Erik Schnetter) Date: Wed, 2 Nov 2022 19:28:55 -0400 Subject: [petsc-users] Redefining MPI functions as macros can break C++ code Message-ID: PETSc redefines MPI functions as macros when logging is enabled. This breaks some C++ code; see e.g. <https://github.com/AMReX-Codes/amrex/pull/3005> for an example. The reason is that macros get confused about commas in template arguments. It would be convenient if PETSc used a different way to log MPI function calls, but I can't think of a good way. Alternatively, logging could be disabled by default, or MPI logging could be disabled by default, or there could be a simple way to opt out (e.g. use `#define PETSC_LOG_MPI` after `#include ` to enable it for a source file).
-erik -- Erik Schnetter http://www.perimeterinstitute.ca/personal/eschnetter/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From balay at mcs.anl.gov Wed Nov 2 22:48:20 2022 From: balay at mcs.anl.gov (Satish Balay) Date: Wed, 2 Nov 2022 22:48:20 -0500 (CDT) Subject: [petsc-users] Redefining MPI functions as macros can break C++ code In-Reply-To: References: Message-ID: You can define 'PETSC_HAVE_BROKEN_RECURSIVE_MACRO' and then include petsc.h in your sources to avoid these macros in amrex/application codes. PETSc logging is one of the important features - it's best not to disable it (globally for all) due to this issue. Satish On Wed, 2 Nov 2022, Erik Schnetter wrote: > PETSc redefines MPI functions as macros when logging is enabled. This > breaks some C++ code; see e.g. > <https://github.com/AMReX-Codes/amrex/pull/3005> for an example. The reason > is that macros get confused about commas in template arguments. > > It would be convenient if PETSc used a different way to log MPI function > calls, but I can't think of a good way. Alternatively, logging could be > disabled by default, or MPI logging could be disabled by default, or there > could be a simple way to opt out (e.g. use `#define PETSC_LOG_MPI` after > `#include ` to enable it for a source file). > > -erik > > From ahmedlp9 at gmail.com Thu Nov 3 08:49:52 2022 From: ahmedlp9 at gmail.com (Ahmed Mansur) Date: Thu, 3 Nov 2022 09:49:52 -0400 Subject: [petsc-users] Doubt about PCILU Message-ID: Hi, I'm trying to use ILU as a GMRES preconditioner (using PCILU); my question is how to use ILU as the 'ilutp' type like MATLAB (ILU factorization with threshold and pivoting). Thanks. Regards -------------- next part -------------- An HTML attachment was scrubbed...
URL: From hzhang at mcs.anl.gov Thu Nov 3 09:22:03 2022 From: hzhang at mcs.anl.gov (Zhang, Hong) Date: Thu, 3 Nov 2022 14:22:03 +0000 Subject: [petsc-users] Doubt about PCILU In-Reply-To: References: Message-ID: PETSc does not support 'ilutp'. Sequential superlu supports it. You can install petsc with superlu, then use runtime options to activate 'ilutp', e.g., petsc/src/ksp/ksp/tutorials ./ex2 -ksp_view -pc_type ilu -pc_factor_mat_solver_type superlu -help |grep superlu ... -mat_superlu_replacetinypivot: ReplaceTinyPivot (None) ... -mat_superlu_ilu_droptol <0.0001 : 0.0001>: ILU_DropTol (None) -mat_superlu_ilu_filltol <0.01 : 0.01>: ILU_FillTol (None) -mat_superlu_ilu_fillfactor <10. : 10.>: ILU_FillFactor (None) -mat_superlu_ilu_droprull : ILU_DropRule (None) Hong ________________________________ From: petsc-users on behalf of Ahmed Mansur Sent: Thursday, November 3, 2022 8:49 AM To: petsc-users Subject: [petsc-users] Doubt about PCILU Hi, I'm trying to use ILU as a GMRES preconditioner (using PCILU); my question is how to use ILU as the 'ilutp' type like MATLAB (ILU factorization with threshold and pivoting). Thanks. Regards -------------- next part -------------- An HTML attachment was scrubbed... URL: From kavousi at mines.edu Thu Nov 3 09:56:35 2022 From: kavousi at mines.edu (Sepideh Kavousi) Date: Thu, 3 Nov 2022 14:56:35 +0000 Subject: [petsc-users] [External] Periodic boundary condition In-Reply-To: References: Message-ID: Barry, Even for the case that I am not solving any PDE equations in the FormFunction (by setting: aF[j][i].vx=aY[j][i].vx; aF[j][i].vy=aY[j][i].vy; aF[j][i].pp=aY[j][i].pp; aF[j][i].U=aY[j][i].U; aF[j][i].p=aY[j][i].p; ) I run into a segmentation error.
Let me just follow what you suggested in the following link: https://lists.mcs.anl.gov/pipermail/petsc-users/2012-May/013476.html src/ksp/ksp/tutorials/ex45.c runs perfectly, but when I change bc along x direction from DM_BOUNDARY_NONE to DM_BOUNDARY_PERIODIC and delete (i==0 || i==mx-1) from if (i==0 || j==0 || k==0 || i==mx-1 || j==my-1 || k==mz-1), I run into the following error. I am not sure how else I should implement periodic bc in a problem. [0]PETSC ERROR: --------------------- Error Message -------------------------------------------------------------- [0]PETSC ERROR: Object is in wrong state [0]PETSC ERROR: Matrix is missing diagonal entry 5 [0]PETSC ERROR: See https://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting. [1]PETSC ERROR: --------------------- Error Message -------------------------------------------------------------- [1]PETSC ERROR: Object is in wrong state [1]PETSC ERROR: Matrix is missing diagonal entry 5 [1]PETSC ERROR: See https://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting. [1]PETSC ERROR: [2]PETSC ERROR: --------------------- Error Message -------------------------------------------------------------- [2]PETSC ERROR: Object is in wrong state [2]PETSC ERROR: Matrix is missing diagonal entry 5 [2]PETSC ERROR: See https://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting. [2]PETSC ERROR: Petsc Release Version 3.14.2, Dec 03, 2020 [2]PETSC ERROR: /scratch/07065/tg863649/convection/periodic_test/one.out on a skylake named c402-092.stampede2.tacc.utexas.edu by tg863649 Thu Nov 3 09:53:00 2022 [2]PETSC ERROR: [3]PETSC ERROR: --------------------- Error Message -------------------------------------------------------------- [3]PETSC ERROR: Object is in wrong state [3]PETSC ERROR: Matrix is missing diagonal entry 5 [3]PETSC ERROR: See https://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
[3]PETSC ERROR: Petsc Release Version 3.14.2, Dec 03, 2020 [3]PETSC ERROR: /scratch/07065/tg863649/convection/periodic_test/one.out on a skylake named c402-092.stampede2.tacc.utexas.edu by tg863649 Thu Nov 3 09:53:00 2022 [3]PETSC ERROR: [0]PETSC ERROR: Petsc Release Version 3.14.2, Dec 03, 2020 [0]PETSC ERROR: /scratch/07065/tg863649/convection/periodic_test/one.out on a skylake named c402-092.stampede2.tacc.utexas.edu by tg863649 Thu Nov 3 09:53:00 2022 [0]PETSC ERROR: Configure options --with-x=0 -with-pic --with-make-np=12 --download-petsc4py=1 --with-python-exec=/opt/apps/intel18/python2/2.7.16/bin/python2 --with-packages-build-dir=/tmp/petsc-3.14/skylake --with-mpi=1 --with-mpi-dir=/opt/intel/compilers_and_libraries_2018.2.199/linux/mpi/intel64 --with-scalar-type=real --with-shared-libraries=1 --with-precision=double --with-chaco=1 --download-chaco --with-hypre=1 --download-hypre --with-parmetis=1 --download-parmetis --with-metis=1 --download-metis --with-plapack=1 --download-plapack --with-spai=1 --download-spai --with-sundials=1 --download-sundials --with-elemental=1 --download-elemental --with-cxx-dialect=C++11 --with-parmetis=1 --download-parmetis --with-metis=1 --download-metis --with-fftw=1 --with-fftw-dir=/opt/apps/intel18/impi18_0/fftw3/3.3.8 --with-hdf5=1 --with-hdf5-dir=/opt/apps/intel18/impi18_0/phdf5/1.10.4/x86_64 --download-hpddm --download-slepc --with-mumps=1 --download-mumps --with-parmetis=1 --download-parmetis --with-metis=1 --download-metis --with-scalapack=1 --download-scalapack --with-blacs=1 --download-blacs --with-spooles=1 --download-spooles --with-suitesparse=1 --download-suitesparse --with-superlu_dist=1 --download-superlu_dist --with-superlu=1 --download-superlu --with-parmetis=1 --download-parmetis --with-metis=1 --download-metis --with-zoltan=1 --download-zoltan=1 --download-ptscotch=1 --with-debugging=no --LIBS= --with-blaslapack-dir=/opt/intel/compilers_and_libraries_2018.2.199/linux/mkl COPTFLAGS="-xCORE-AVX2 
-axMIC-AVX512,CORE-AVX512 -O2 -g" FOPTFLAGS="-xCORE-AVX2 -axMIC-AVX512,CORE-AVX512 -O2 -g" CXXOPTFLAGS="-xCORE-AVX2 -axMIC-AVX512,CORE-AVX512 -O2 -g" Petsc Release Version 3.14.2, Dec 03, 2020 [1]PETSC ERROR: /scratch/07065/tg863649/convection/periodic_test/one.out on a skylake named c402-092.stampede2.tacc.utexas.edu by tg863649 Thu Nov 3 09:53:00 2022 [1]PETSC ERROR: [0]PETSC ERROR: #1 MatILUFactorSymbolic_SeqAIJ() line 1686 in /home1/apps/intel18/impi18_0/petsc/3.14/src/mat/impls/aij/seq/aijfact.c Configure options --with-x=0 -with-pic --with-make-np=12 --download-petsc4py=1 --with-python-exec=/opt/apps/intel18/python2/2.7.16/bin/python2 --with-packages-build-dir=/tmp/petsc-3.14/skylake --with-mpi=1 --with-mpi-dir=/opt/intel/compilers_and_libraries_2018.2.199/linux/mpi/intel64 --with-scalar-type=real --with-shared-libraries=1 --with-precision=double --with-chaco=1 --download-chaco --with-hypre=1 --download-hypre --with-parmetis=1 --download-parmetis --with-metis=1 --download-metis --with-plapack=1 --download-plapack --with-spai=1 --download-spai --with-sundials=1 --download-sundials --with-elemental=1 --download-elemental --with-cxx-dialect=C++11 --with-parmetis=1 --download-parmetis --with-metis=1 --download-metis --with-fftw=1 --with-fftw-dir=/opt/apps/intel18/impi18_0/fftw3/3.3.8 --with-hdf5=1 --with-hdf5-dir=/opt/apps/intel18/impi18_0/phdf5/1.10.4/x86_64 --download-hpddm --download-slepc --with-mumps=1 --download-mumps --with-parmetis=1 --download-parmetis --with-metis=1 --download-metis --with-scalapack=1 --download-scalapack --with-blacs=1 --download-blacs --with-spooles=1 --download-spooles --with-suitesparse=1 --download-suitesparse --with-superlu_dist=1 --download-superlu_dist --with-superlu=1 --download-superlu --with-parmetis=1 --download-parmetis --with-metis=1 --download-metis --with-zoltan=1 --download-zoltan=1 --download-ptscotch=1 --with-debugging=no --LIBS= --with-blaslapack-dir=/opt/intel/compilers_and_libraries_2018.2.199/linux/mkl 
COPTFLAGS="-xCORE-AVX2 -axMIC-AVX512,CORE-AVX512 -O2 -g" FOPTFLAGS="-xCORE-AVX2 -axMIC-AVX512,CORE-AVX512 -O2 -g" CXXOPTFLAGS="-xCORE-AVX2 -axMIC-AVX512,CORE-AVX512 -O2 -g" [2]PETSC ERROR: #1 MatILUFactorSymbolic_SeqAIJ() line 1686 in /home1/apps/intel18/impi18_0/petsc/3.14/src/mat/impls/aij/seq/aijfact.c Configure options --with-x=0 -with-pic --with-make-np=12 --download-petsc4py=1 --with-python-exec=/opt/apps/intel18/python2/2.7.16/bin/python2 --with-packages-build-dir=/tmp/petsc-3.14/skylake --with-mpi=1 --with-mpi-dir=/opt/intel/compilers_and_libraries_2018.2.199/linux/mpi/intel64 --with-scalar-type=real --with-shared-libraries=1 --with-precision=double --with-chaco=1 --download-chaco --with-hypre=1 --download-hypre --with-parmetis=1 --download-parmetis --with-metis=1 --download-metis --with-plapack=1 --download-plapack --with-spai=1 --download-spai --with-sundials=1 --download-sundials --with-elemental=1 --download-elemental --with-cxx-dialect=C++11 --with-parmetis=1 --download-parmetis --with-metis=1 --download-metis --with-fftw=1 --with-fftw-dir=/opt/apps/intel18/impi18_0/fftw3/3.3.8 --with-hdf5=1 --with-hdf5-dir=/opt/apps/intel18/impi18_0/phdf5/1.10.4/x86_64 --download-hpddm --download-slepc --with-mumps=1 --download-mumps --with-parmetis=1 --download-parmetis --with-metis=1 --download-metis --with-scalapack=1 --download-scalapack --with-blacs=1 --download-blacs --with-spooles=1 --download-spooles --with-suitesparse=1 --download-suitesparse --with-superlu_dist=1 --download-superlu_dist --with-superlu=1 --download-superlu --with-parmetis=1 --download-parmetis --with-metis=1 --download-metis --with-zoltan=1 --download-zoltan=1 --download-ptscotch=1 --with-debugging=no --LIBS= --with-blaslapack-dir=/opt/intel/compilers_and_libraries_2018.2.199/linux/mkl COPTFLAGS="-xCORE-AVX2 -axMIC-AVX512,CORE-AVX512 -O2 -g" FOPTFLAGS="-xCORE-AVX2 -axMIC-AVX512,CORE-AVX512 -O2 -g" CXXOPTFLAGS="-xCORE-AVX2 -axMIC-AVX512,CORE-AVX512 -O2 -g" [3]PETSC ERROR: #1 
MatILUFactorSymbolic_SeqAIJ() line 1686 in /home1/apps/intel18/impi18_0/petsc/3.14/src/mat/impls/aij/seq/aijfact.c Configure options --with-x=0 -with-pic --with-make-np=12 --download-petsc4py=1 --with-python-exec=/opt/apps/intel18/python2/2.7.16/bin/python2 --with-packages-build-dir=/tmp/petsc-3.14/skylake --with-mpi=1 --with-mpi-dir=/opt/intel/compilers_and_libraries_2018.2.199/linux/mpi/intel64 --with-scalar-type=real --with-shared-libraries=1 --with-precision=double --with-chaco=1 --download-chaco --with-hypre=1 --download-hypre --with-parmetis=1 --download-parmetis --with-metis=1 --download-metis --with-plapack=1 --download-plapack --with-spai=1 --download-spai --with-sundials=1 --download-sundials --with-elemental=1 --download-elemental --with-cxx-dialect=C++11 --with-parmetis=1 --download-parmetis --with-metis=1 --download-metis --with-fftw=1 --with-fftw-dir=/opt/apps/intel18/impi18_0/fftw3/3.3.8 --with-hdf5=1 --with-hdf5-dir=/opt/apps/intel18/impi18_0/phdf5/1.10.4/x86_64 --download-hpddm --download-slepc --with-mumps=1 --download-mumps --with-parmetis=1 --download-parmetis --with-metis=1 --download-metis --with-scalapack=1 --download-scalapack --with-blacs=1 --download-blacs --with-spooles=1 --download-spooles --with-suitesparse=1 --download-suitesparse --with-superlu_dist=1 --download-superlu_dist --with-superlu=1 --download-superlu --with-parmetis=1 --download-parmetis --with-metis=1 --download-metis --with-zoltan=1 --download-zoltan=1 --download-ptscotch=1 --with-debugging=no --LIBS= --with-blaslapack-dir=/opt/intel/compilers_and_libraries_2018.2.199/linux/mkl COPTFLAGS="-xCORE-AVX2 -axMIC-AVX512,CORE-AVX512 -O2 -g" FOPTFLAGS="-xCORE-AVX2 -axMIC-AVX512,CORE-AVX512 -O2 -g" CXXOPTFLAGS="-xCORE-AVX2 -axMIC-AVX512,CORE-AVX512 -O2 -g" [1]PETSC ERROR: #1 MatILUFactorSymbolic_SeqAIJ() line 1686 in /home1/apps/intel18/impi18_0/petsc/3.14/src/mat/impls/aij/seq/aijfact.c [0]PETSC ERROR: #2 MatILUFactorSymbolic() line 6710 in 
/home1/apps/intel18/impi18_0/petsc/3.14/src/mat/interface/matrix.c [1]PETSC ERROR: #2 MatILUFactorSymbolic() line 6710 in /home1/apps/intel18/impi18_0/petsc/3.14/src/mat/interface/matrix.c [1]PETSC ERROR: #3 PCSetUp_ILU() line 141 in /home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/pc/impls/factor/ilu/ilu.c [2]PETSC ERROR: #2 MatILUFactorSymbolic() line 6710 in /home1/apps/intel18/impi18_0/petsc/3.14/src/mat/interface/matrix.c [2]PETSC ERROR: #3 PCSetUp_ILU() line 141 in /home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/pc/impls/factor/ilu/ilu.c [2]PETSC ERROR: [3]PETSC ERROR: #2 MatILUFactorSymbolic() line 6710 in /home1/apps/intel18/impi18_0/petsc/3.14/src/mat/interface/matrix.c [3]PETSC ERROR: #3 PCSetUp_ILU() line 141 in /home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/pc/impls/factor/ilu/ilu.c [3]PETSC ERROR: #4 PCSetUp() line 1009 in /home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/pc/interface/precon.c [3]PETSC ERROR: [0]PETSC ERROR: #3 PCSetUp_ILU() line 141 in /home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/pc/impls/factor/ilu/ilu.c [0]PETSC ERROR: #4 PCSetUp() line 1009 in /home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/pc/interface/precon.c [0]PETSC ERROR: #5 KSPSetUp() line 406 in /home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/ksp/interface/itfunc.c [1]PETSC ERROR: #4 PCSetUp() line 1009 in /home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/pc/interface/precon.c [1]PETSC ERROR: #5 KSPSetUp() line 406 in /home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/ksp/interface/itfunc.c #4 PCSetUp() line 1009 in /home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/pc/interface/precon.c [2]PETSC ERROR: #5 KSPSetUp() line 406 in /home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/ksp/interface/itfunc.c #5 KSPSetUp() line 406 in /home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/ksp/interface/itfunc.c [0]PETSC ERROR: [2]PETSC ERROR: #6 KSPSolve_Private() line 658 in /home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/ksp/interface/itfunc.c [2]PETSC ERROR: [3]PETSC ERROR: #6 KSPSolve_Private() 
line 658 in /home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/ksp/interface/itfunc.c [3]PETSC ERROR: #7 KSPSolve() line 889 in /home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/ksp/interface/itfunc.c #6 KSPSolve_Private() line 658 in /home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/ksp/interface/itfunc.c [0]PETSC ERROR: #7 KSPSolve() line 889 in /home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/ksp/interface/itfunc.c [1]PETSC ERROR: #6 KSPSolve_Private() line 658 in /home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/ksp/interface/itfunc.c [1]PETSC ERROR: #7 KSPSolve() line 889 in /home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/ksp/interface/itfunc.c #7 KSPSolve() line 889 in /home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/ksp/interface/itfunc.c [1]PETSC ERROR: [2]PETSC ERROR: #8 DMDAGetFaceInterpolation() line 493 in /home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/pc/impls/wb/wb.c [2]PETSC ERROR: [0]PETSC ERROR: #8 DMDAGetFaceInterpolation() line 493 in /home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/pc/impls/wb/wb.c #8 DMDAGetFaceInterpolation() line 493 in /home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/pc/impls/wb/wb.c [1]PETSC ERROR: #9 PCSetUp_Exotic() line 667 in /home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/pc/impls/wb/wb.c [1]PETSC ERROR: #10 PCSetUp() line 1009 in /home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/pc/interface/precon.c #9 PCSetUp_Exotic() line 667 in /home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/pc/impls/wb/wb.c [2]PETSC ERROR: #10 PCSetUp() line 1009 in /home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/pc/interface/precon.c [2]PETSC ERROR: #11 KSPSetUp() line 406 in /home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/ksp/interface/itfunc.c [0]PETSC ERROR: #9 PCSetUp_Exotic() line 667 in /home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/pc/impls/wb/wb.c [0]PETSC ERROR: #10 PCSetUp() line 1009 in /home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/pc/interface/precon.c [0]PETSC ERROR: [1]PETSC ERROR: #11 KSPSetUp() line 406 in 
/home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/ksp/interface/itfunc.c [1]PETSC ERROR: #12 KSPSolve_Private() line 658 in /home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/ksp/interface/itfunc.c [1]PETSC ERROR: #13 KSPSolve() line 889 in /home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/ksp/interface/itfunc.c [2]PETSC ERROR: #12 KSPSolve_Private() line 658 in /home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/ksp/interface/itfunc.c [2]PETSC ERROR: #13 KSPSolve() line 889 in /home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/ksp/interface/itfunc.c [3]PETSC ERROR: #8 DMDAGetFaceInterpolation() line 493 in /home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/pc/impls/wb/wb.c [3]PETSC ERROR: #9 PCSetUp_Exotic() line 667 in /home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/pc/impls/wb/wb.c [3]PETSC ERROR: #10 PCSetUp() line 1009 in /home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/pc/interface/precon.c [3]PETSC ERROR: #11 KSPSetUp() line 406 in /home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/ksp/interface/itfunc.c [3]PETSC ERROR: #12 KSPSolve_Private() line 658 in /home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/ksp/interface/itfunc.c #11 KSPSetUp() line 406 in /home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/ksp/interface/itfunc.c [0]PETSC ERROR: #12 KSPSolve_Private() line 658 in /home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/ksp/interface/itfunc.c [0]PETSC ERROR: #13 KSPSolve() line 889 in /home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/ksp/interface/itfunc.c [3]PETSC ERROR: #13 KSPSolve() line 889 in /home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/ksp/interface/itfunc.c [1]PETSC ERROR: [0]PETSC ERROR: #14 main() line 51 in /scratch/07065/tg863649/convection/periodic_test/one.c #14 main() line 51 in /scratch/07065/tg863649/convection/periodic_test/one.c [1]PETSC ERROR: PETSc Option Table entries: [1]PETSC ERROR: -ksp_monitor_short [2]PETSC ERROR: #14 main() line 51 in /scratch/07065/tg863649/convection/periodic_test/one.c [2]PETSC ERROR: PETSc Option Table entries: [2]PETSC ERROR: 
-ksp_monitor_short [2]PETSC ERROR: [0]PETSC ERROR: PETSc Option Table entries: [0]PETSC ERROR: -ksp_monitor_short [0]PETSC ERROR: -ksp_type fgmres [0]PETSC ERROR: -mg_levels_ksp_max_it 1 [0]PETSC ERROR: -mg_levels_ksp_type gmres [0]PETSC ERROR: -mg_levels_pc_type bjacobi [1]PETSC ERROR: -ksp_type fgmres [1]PETSC ERROR: -mg_levels_ksp_max_it 1 [1]PETSC ERROR: -mg_levels_ksp_type gmres [1]PETSC ERROR: -mg_levels_pc_type bjacobi [1]PETSC ERROR: -pc_type exotic [1]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-maint at mcs.anl.gov---------- -ksp_type fgmres [2]PETSC ERROR: -mg_levels_ksp_max_it 1 [2]PETSC ERROR: -mg_levels_ksp_type gmres [2]PETSC ERROR: -mg_levels_pc_type bjacobi [2]PETSC ERROR: -pc_type exotic [2]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-maint at mcs.anl.gov---------- [3]PETSC ERROR: #14 main() line 51 in /scratch/07065/tg863649/convection/periodic_test/one.c [3]PETSC ERROR: PETSc Option Table entries: [3]PETSC ERROR: -ksp_monitor_short [3]PETSC ERROR: -ksp_type fgmres [3]PETSC ERROR: -mg_levels_ksp_max_it 1 [3]PETSC ERROR: -mg_levels_ksp_type gmres [3]PETSC ERROR: -mg_levels_pc_type bjacobi [3]PETSC ERROR: -pc_type exotic [3]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-maint at mcs.anl.gov---------- [0]PETSC ERROR: -pc_type exotic [0]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-maint at mcs.anl.gov---------- application called MPI_Abort(MPI_COMM_SELF, 51073) - process 0 application called MPI_Abort(MPI_COMM_SELF, 51073) - process 0 application called MPI_Abort(MPI_COMM_SELF, 51073) - process 0 application called MPI_Abort(MPI_COMM_SELF, 51073) - process 0 TACC: MPI job exited with code: 129 TACC: Shutdown complete. Exiting. 
Sent from Mail for Windows

From: Barry Smith
Sent: Tuesday, October 25, 2022 7:24 PM
To: Sepideh Kavousi
Cc: petsc-users at mcs.anl.gov
Subject: Re: [External] [petsc-users] Periodic boundary condition

Sorry, I was not clear: at this point you need to type c for continue, and then when it crashes in the debugger, type bt.

Barry

On Oct 25, 2022, at 6:37 PM, Sepideh Kavousi > wrote:

Hello Barry,
When I ran with , the error is about the PetscInitialize line (line 333). When I type bt multiple times, it just keeps referring to this line.

#0 0x00002b701cfed9fd in nanosleep () from /lib64/libc.so.6
#1 0x00002b701cfed894 in sleep () from /lib64/libc.so.6
#2 0x00002b70035fb4ae in PetscSleep (s=1) at /home1/apps/intel18/impi18_0/petsc/3.14/src/sys/utils/psleep.c:46
#3 0x00002b700364b8bb in PetscAttachDebugger () at /home1/apps/intel18/impi18_0/petsc/3.14/src/sys/error/adebug.c:405
#4 0x00002b700366cfcd in PetscOptionsCheckInitial_Private (help=0x7ffec24c7940 "\t") at /home1/apps/intel18/impi18_0/petsc/3.14/src/sys/objects/init.c:608
#5 0x00002b7003674cd6 in PetscInitialize (argc=0x7ffec24c7940, args=0x7ffec24c7940, file=0x0, help=0xffffffffffffffff) at /home1/apps/intel18/impi18_0/petsc/3.14/src/sys/objects/pinit.c:1025
#6 0x00000000004021ce in main (argc=24, argv=0x7ffec24d14e8) at /scratch/07065/tg863649/convection/test-a9-3-options_small_MAC_pressure_old/one.c:333

Best,
Sepideh

Sent from Mail for Windows

From: Barry Smith
Sent: Friday, October 21, 2022 10:54 AM
To: Sepideh Kavousi
Cc: petsc-users at mcs.anl.gov
Subject: Re: [External] [petsc-users] Periodic boundary condition

The problem with the output below is that it gives no clear indication of where the crash occurred:

#1 User provided function() line 0 in unknown file

Run with the exact same options, but also -start_in_debugger noxterm. It should then crash in the debugger; type bt to see the backtrace of where it crashed, and send that output.

Barry

Background: MatFDColoringSetUpBlocked_AIJ_Private() allocates the space that is used when evaluating the function multiple times to get the Jacobian entries. If the FormFunction writes into incorrect locations, it will corrupt this memory that was allocated in MatFDColoringSetUpBlocked_AIJ_Private(). It does not necessarily mean that anything is wrong in MatFDColoringSetUpBlocked_AIJ_Private().

On Oct 21, 2022, at 12:32 AM, Sepideh Kavousi > wrote:

Barry,
I ran the code with -malloc_debug and added CHKMEMQ for all the lines inside FormFunction. Following are the details of the error.
Best,
Sepideh

[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range
[0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
[0]PETSC ERROR: or see https://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
[0]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors
[0]PETSC ERROR: configure using --with-debugging=yes, recompile, link, and run
[0]PETSC ERROR: to get more information on the crash.
[0]PETSC ERROR: PetscMallocValidate: error detected at PetscError() line 401 in /home1/apps/intel18/impi18_0/petsc/3.14/src/sys/error/err.c
[0]PETSC ERROR: Memory [id=0(73960000)] at address 0x2b5aed6ab050 is corrupted (probably write past end of array)
[0]PETSC ERROR: Memory originally allocated in MatFDColoringSetUpBlocked_AIJ_Private() line 125 in /home1/apps/intel18/impi18_0/petsc/3.14/src/mat/impls/aij/seq/fdaij.c
[0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[0]PETSC ERROR: Signal received
[0]PETSC ERROR: See https://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
[0]PETSC ERROR: Petsc Release Version 3.14.2, Dec 03, 2020
[0]PETSC ERROR: ./one.out on a skylake named c415-063.stampede2.tacc.utexas.edu by tg863649 Thu Oct 20 23:30:05 2022
[0]PETSC ERROR: Configure options --with-x=0 -with-pic --with-make-np=12 --download-petsc4py=1 --with-python-exec=/opt/apps/intel18/python2/2.7.16/bin/python2 --with-packages-build-dir=/tmp/petsc-3.14/skylake --with-mpi=1 --with-mpi-dir=/opt/intel/compilers_and_libraries_2018.2.199/linux/mpi/intel64 --with-scalar-type=real --with-shared-libraries=1 --with-precision=double --with-chaco=1 --download-chaco --with-hypre=1 --download-hypre --with-parmetis=1 --download-parmetis --with-metis=1 --download-metis --with-plapack=1 --download-plapack --with-spai=1 --download-spai --with-sundials=1 --download-sundials --with-elemental=1 --download-elemental --with-cxx-dialect=C++11 --with-parmetis=1 --download-parmetis --with-metis=1 --download-metis --with-fftw=1 --with-fftw-dir=/opt/apps/intel18/impi18_0/fftw3/3.3.8 --with-hdf5=1 --with-hdf5-dir=/opt/apps/intel18/impi18_0/phdf5/1.10.4/x86_64 --download-hpddm --download-slepc --with-mumps=1 --download-mumps --with-parmetis=1 --download-parmetis --with-metis=1 --download-metis --with-scalapack=1 --download-scalapack --with-blacs=1 --download-blacs --with-spooles=1 --download-spooles --with-suitesparse=1 --download-suitesparse --with-superlu_dist=1 --download-superlu_dist --with-superlu=1 --download-superlu --with-parmetis=1 --download-parmetis --with-metis=1 --download-metis --with-zoltan=1 --download-zoltan=1 --download-ptscotch=1 --with-debugging=no --LIBS= --with-blaslapack-dir=/opt/intel/compilers_and_libraries_2018.2.199/linux/mkl COPTFLAGS="-xCORE-AVX2 -axMIC-AVX512,CORE-AVX512 -O2 -g" FOPTFLAGS="-xCORE-AVX2 -axMIC-AVX512,CORE-AVX512 -O2 -g" CXXOPTFLAGS="-xCORE-AVX2 -axMIC-AVX512,CORE-AVX512 -O2 -g"
[0]PETSC ERROR: #1 User provided function() line 0 in unknown file
[0]PETSC ERROR: Checking the memory for corruption.
[0]PETSC ERROR: PetscMallocValidate: error detected at PetscSignalHandlerDefault() line 170 in /home1/apps/intel18/impi18_0/petsc/3.14/src/sys/error/signal.c
[0]PETSC ERROR: Memory [id=0(73960000)] at address 0x2b5aed6ab050 is corrupted (probably write past end of array)
[0]PETSC ERROR: Memory originally allocated in MatFDColoringSetUpBlocked_AIJ_Private() line 125 in /home1/apps/intel18/impi18_0/petsc/3.14/src/mat/impls/aij/seq/fdaij.c
application called MPI_Abort(MPI_COMM_WORLD, 50176059) - process 0
[unset]: readline failed

Sent from Mail for Windows

From: Barry Smith
Sent: Thursday, October 20, 2022 10:27 PM
To: Sepideh Kavousi
Cc: petsc-users at mcs.anl.gov
Subject: [External] Re: [petsc-users] Periodic boundary condition

Some of the valgrind information does not appear to make sense. PetscMemcpy() does not call SNESSolve(), so I suspect there must be some serious corruption of something to produce this impossible stack trace:

==236074== by 0x6FD160F: SNESSolve (snes.c:4569)
==236074== by 0x711917E: PetscMemcpy (bdf.c:223)

From

==236074== Conditional jump or move depends on uninitialised value(s)
==236074== at 0x5F10977: MatFDColoringSetUpBlocked_AIJ_Private (fdaij.c:146)
==236074== by 0x5F10977: MatFDColoringSetUp_SeqXAIJ.A (fdaij.c:284)
==236074== by 0x585892A: MatFDColoringSetUp (fdmatrix.c:242)
==236074== by 0x6FE5037: SNESComputeJacobianDefaultColor (snesj2.c:79)
==236074== by 0x6FC8E4E: SNESComputeJacobian (snes.c:2717)
==236074== by 0x70236E7: SNESSolve_NEWTONTR (tr.c:324)
==236074== by 0x6FD160F: SNESSolve (snes.c:4569)
==236074== by 0x711917E: PetscMemcpy (bdf.c:223)
==236074== by 0x711917E: PetscCitationsRegister (petscsys.h:2689)
==236074== by 0x711917E: TSStep_BDF.A (bdf.c:265)
==236074== by 0x70C363A: TSStep (ts.c:3757)
==236074== by 0x70C1999: TSSolve (ts.c:4154)
==236074== by 0x402594: main (one.c:391)

I suggest you run with -malloc_debug instead of valgrind and see if any errors are reported. If so, you can add the macro CHKMEMQ; inside your function evaluation, wherever you write to memory, to see if anything is writing to the wrong location. For example, placing it wherever you assign aF, such as

aF[j][i].vx=(x3+x4+x5+x6+x7+x8+x9-x1-x2)*user->hx;

can help you determine the exact line number where you are writing to the wrong location and determine what might be the cause.

On Oct 20, 2022, at 6:45 PM, Sepideh Kavousi > wrote:

Hello,
I want to solve my 5 PDEs with the finite difference method, using a periodic BC in the x-direction and a non-periodic BC in the y-direction, but I run into an error (Segmentation Violation, probably memory access out of range). For this, I discretize my equations in the FormFunction function. My PDE discretization at node (i,j) needs data at the (i+1,j), (i+2,j), (i-1,j), (i-2,j), (i,j+1), (i,j+2), (i,j-1), (i,j-2) points. In my previous codes, where the x-direction had a non-periodic (no-flux) boundary condition, I:
i) implemented the no-flux BC for i=0 and i=Nx-1,
ii) set i+2 = Nx-1 when discretizing (Nx-2,j) and i+2 = 0 when discretizing (1,j),
iii) discretized my equations for i=1..Nx-2.
I am not sure how I should do the periodic BC.
From the following discussions (https://lists.mcs.anl.gov/pipermail/petsc-users/2012-May/013476.html and https://lists.mcs.anl.gov/pipermail/petsc-users/2016-May/029273.html), I guess I should not do step (i) (stated above) for the x-boundaries and should just do step (iii) for i=0..Nx-1. If I just focus on solving 2 of the PDEs, which do need data at the (i+2,j), (i-2,j), (i,j+2), (i,j-2) points when discretizing the equation at node (i,j), I still run into the error. Running with Valgrind (just 1 processor) gave the following output. I did not find any information in it that hints at the source of the error. Can you please help me find the error?
Best,
Sepideh

==236074== Conditional jump or move depends on uninitialised value(s) ==236074== at 0x4C29E39: malloc (vg_replace_malloc.c:309) ==236074== by 0x1B79E59B: MPID_Init (mpid_init.c:1649) ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) ==236074== by 0x40219D: main (one.c:335) ==236074== ==236074== Conditional jump or move depends on uninitialised value(s) ==236074== at 0x2183B805: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218323C1: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218341C7: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x400F9C2: _dl_init (in /usr/lib64/ld-2.17.so) ==236074== by 0x401459D: dl_open_worker (in /usr/lib64/ld-2.17.so) ==236074== by 0x400F7D3: _dl_catch_error (in /usr/lib64/ld-2.17.so) ==236074== by 0x4013B8A: _dl_open (in /usr/lib64/ld-2.17.so) ==236074== by 0x1AEA4FAA: dlopen_doit (in /usr/lib64/libdl-2.17.so) ==236074== by 0x400F7D3: _dl_catch_error (in /usr/lib64/ld-2.17.so) ==236074== by 0x1AEA55AC: _dlerror_run (in /usr/lib64/libdl-2.17.so) ==236074== by 0x1AEA5040: dlopen@@GLIBC_2.2.5 (in /usr/lib64/libdl-2.17.so) ==236074== by 0x1B8198DC: MPID_nem_ofi_init (ofi_init.c:158) ==236074== by 0x1B7B5C55: ???
(mpid_nem_init.c:231) ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) ==236074== by 0x40219D: main (one.c:335) ==236074== ==236074== Conditional jump or move depends on uninitialised value(s) ==236074== at 0x2183B810: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218323C1: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218341C7: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x400F9C2: _dl_init (in /usr/lib64/ld-2.17.so) ==236074== by 0x401459D: dl_open_worker (in /usr/lib64/ld-2.17.so) ==236074== by 0x400F7D3: _dl_catch_error (in /usr/lib64/ld-2.17.so) ==236074== by 0x4013B8A: _dl_open (in /usr/lib64/ld-2.17.so) ==236074== by 0x1AEA4FAA: dlopen_doit (in /usr/lib64/libdl-2.17.so) ==236074== by 0x400F7D3: _dl_catch_error (in /usr/lib64/ld-2.17.so) ==236074== by 0x1AEA55AC: _dlerror_run (in /usr/lib64/libdl-2.17.so) ==236074== by 0x1AEA5040: dlopen@@GLIBC_2.2.5 (in /usr/lib64/libdl-2.17.so) ==236074== by 0x1B8198DC: MPID_nem_ofi_init (ofi_init.c:158) ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) ==236074== by 0x40219D: main (one.c:335) ==236074== ==236074== Conditional jump or move depends on uninitialised value(s) ==236074== at 0x218323C8: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218341C7: ??? 
(in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x400F9C2: _dl_init (in /usr/lib64/ld-2.17.so) ==236074== by 0x401459D: dl_open_worker (in /usr/lib64/ld-2.17.so) ==236074== by 0x400F7D3: _dl_catch_error (in /usr/lib64/ld-2.17.so) ==236074== by 0x4013B8A: _dl_open (in /usr/lib64/ld-2.17.so) ==236074== by 0x1AEA4FAA: dlopen_doit (in /usr/lib64/libdl-2.17.so) ==236074== by 0x400F7D3: _dl_catch_error (in /usr/lib64/ld-2.17.so) ==236074== by 0x1AEA55AC: _dlerror_run (in /usr/lib64/libdl-2.17.so) ==236074== by 0x1AEA5040: dlopen@@GLIBC_2.2.5 (in /usr/lib64/libdl-2.17.so) ==236074== by 0x1B8198DC: MPID_nem_ofi_init (ofi_init.c:158) ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) ==236074== by 0x40219D: main (one.c:335) ==236074== ==236074== Use of uninitialised value of size 8 ==236074== at 0x218323CF: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218341C7: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x400F9C2: _dl_init (in /usr/lib64/ld-2.17.so) ==236074== by 0x401459D: dl_open_worker (in /usr/lib64/ld-2.17.so) ==236074== by 0x400F7D3: _dl_catch_error (in /usr/lib64/ld-2.17.so) ==236074== by 0x4013B8A: _dl_open (in /usr/lib64/ld-2.17.so) ==236074== by 0x1AEA4FAA: dlopen_doit (in /usr/lib64/libdl-2.17.so) ==236074== by 0x400F7D3: _dl_catch_error (in /usr/lib64/ld-2.17.so) ==236074== by 0x1AEA55AC: _dlerror_run (in /usr/lib64/libdl-2.17.so) ==236074== by 0x1AEA5040: dlopen@@GLIBC_2.2.5 (in /usr/lib64/libdl-2.17.so) ==236074== by 0x1B8198DC: MPID_nem_ofi_init (ofi_init.c:158) ==236074== by 0x1B7B5C55: ??? 
(mpid_nem_init.c:231) ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) ==236074== by 0x40219D: main (one.c:335) ==236074== ==236074== Use of uninitialised value of size 8 ==236074== at 0x218323E5: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218341C7: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x400F9C2: _dl_init (in /usr/lib64/ld-2.17.so) ==236074== by 0x401459D: dl_open_worker (in /usr/lib64/ld-2.17.so) ==236074== by 0x400F7D3: _dl_catch_error (in /usr/lib64/ld-2.17.so) ==236074== by 0x4013B8A: _dl_open (in /usr/lib64/ld-2.17.so) ==236074== by 0x1AEA4FAA: dlopen_doit (in /usr/lib64/libdl-2.17.so) ==236074== by 0x400F7D3: _dl_catch_error (in /usr/lib64/ld-2.17.so) ==236074== by 0x1AEA55AC: _dlerror_run (in /usr/lib64/libdl-2.17.so) ==236074== by 0x1AEA5040: dlopen@@GLIBC_2.2.5 (in /usr/lib64/libdl-2.17.so) ==236074== by 0x1B8198DC: MPID_nem_ofi_init (ofi_init.c:158) ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) ==236074== by 0x40219D: main (one.c:335) ==236074== ==236074== Conditional jump or move depends on uninitialised value(s) ==236074== at 0x2183B805: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218323C1: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218343EC: ??? 
(in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x2180E4F3: psm2_init (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x2112D7E6: psmx2_getinfo (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x210AC753: fi_getinfo@@FABRIC_1.2 (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x1B819AB7: MPID_nem_ofi_init (ofi_init.c:245) ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) ==236074== by 0x40219D: main (one.c:335) ==236074== ==236074== Conditional jump or move depends on uninitialised value(s) ==236074== at 0x2183B810: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218323C1: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218343EC: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x2180E4F3: psm2_init (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x2112D7E6: psmx2_getinfo (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x210AC753: fi_getinfo@@FABRIC_1.2 (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x1B819AB7: MPID_nem_ofi_init (ofi_init.c:245) ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) ==236074== by 0x40219D: main (one.c:335) ==236074== ==236074== Conditional jump or move depends on uninitialised value(s) ==236074== at 0x218323C8: ??? 
(in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218343EC: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x2180E4F3: psm2_init (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x2112D7E6: psmx2_getinfo (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x210AC753: fi_getinfo@@FABRIC_1.2 (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x1B819AB7: MPID_nem_ofi_init (ofi_init.c:245) ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) ==236074== by 0x40219D: main (one.c:335) ==236074== ==236074== Use of uninitialised value of size 8 ==236074== at 0x218323CF: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218343EC: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x2180E4F3: psm2_init (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x2112D7E6: psmx2_getinfo (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x210AC753: fi_getinfo@@FABRIC_1.2 (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x1B819AB7: MPID_nem_ofi_init (ofi_init.c:245) ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) ==236074== by 0x40219D: main (one.c:335) ==236074== ==236074== Use of uninitialised value of size 8 ==236074== at 0x218323E5: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218343EC: ??? 
(in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x2180E4F3: psm2_init (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x2112D7E6: psmx2_getinfo (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x210AC753: fi_getinfo@@FABRIC_1.2 (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x1B819AB7: MPID_nem_ofi_init (ofi_init.c:245) ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) ==236074== by 0x40219D: main (one.c:335) ==236074== ==236074== Conditional jump or move depends on uninitialised value(s) ==236074== at 0x2183B69A: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218371EC: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x21837077: hfi_get_port_lid (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x21836F9A: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x21834872: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x217F7F5D: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x217FDC29: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x217FCA91: psm2_ep_open (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x2112FB19: psmx2_trx_ctxt_alloc (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x21138089: psmx2_ep_open (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x1B81A6A1: fi_endpoint (fi_endpoint.h:155) ==236074== by 0x1B81A6A1: MPID_nem_ofi_init (ofi_init.c:377) ==236074== by 0x1B7B5C55: ??? 
(mpid_nem_init.c:231) ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) ==236074== by 0x40219D: main (one.c:335) ==236074== ==236074== Conditional jump or move depends on uninitialised value(s) ==236074== at 0x2183B7B8: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218371EC: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x21837077: hfi_get_port_lid (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x21836F9A: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x21834872: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x217F7F5D: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x217FDC29: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x217FCA91: psm2_ep_open (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x2112FB19: psmx2_trx_ctxt_alloc (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x21138089: psmx2_ep_open (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x1B81A6A1: fi_endpoint (fi_endpoint.h:155) ==236074== by 0x1B81A6A1: MPID_nem_ofi_init (ofi_init.c:377) ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) ==236074== by 0x40219D: main (one.c:335) ==236074== ==236074== Conditional jump or move depends on uninitialised value(s) ==236074== at 0x2183B69A: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218371EC: ??? 
(in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x21837077: hfi_get_port_lid (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x217F88C8: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x217FDC29: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x217FCA91: psm2_ep_open (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x2112FB19: psmx2_trx_ctxt_alloc (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x21138089: psmx2_ep_open (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x1B81A6A1: fi_endpoint (fi_endpoint.h:155) ==236074== by 0x1B81A6A1: MPID_nem_ofi_init (ofi_init.c:377) ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) ==236074== by 0x40219D: main (one.c:335) ==236074== ==236074== Conditional jump or move depends on uninitialised value(s) ==236074== at 0x2183B7B8: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218371EC: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x21837077: hfi_get_port_lid (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x217F88C8: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x217FDC29: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x217FCA91: psm2_ep_open (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x2112FB19: psmx2_trx_ctxt_alloc (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x21138089: psmx2_ep_open (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x1B81A6A1: fi_endpoint (fi_endpoint.h:155) ==236074== by 0x1B81A6A1: MPID_nem_ofi_init (ofi_init.c:377) ==236074== by 0x1B7B5C55: ??? 
(mpid_nem_init.c:231) ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) ==236074== by 0x40219D: main (one.c:335) ==236074== ==236074== Conditional jump or move depends on uninitialised value(s) ==236074== at 0x2183B69A: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218371EC: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x21837077: hfi_get_port_lid (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x217F916B: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x217FDC29: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x217FCA91: psm2_ep_open (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x2112FB19: psmx2_trx_ctxt_alloc (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x21138089: psmx2_ep_open (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x1B81A6A1: fi_endpoint (fi_endpoint.h:155) ==236074== by 0x1B81A6A1: MPID_nem_ofi_init (ofi_init.c:377) ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) ==236074== by 0x40219D: main (one.c:335) ==236074== ==236074== Conditional jump or move depends on uninitialised value(s) ==236074== at 0x2183B7B8: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218371EC: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x21837077: hfi_get_port_lid (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x217F916B: ??? 
(in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x217FDC29: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x217FCA91: psm2_ep_open (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x2112FB19: psmx2_trx_ctxt_alloc (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x21138089: psmx2_ep_open (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x1B81A6A1: fi_endpoint (fi_endpoint.h:155) ==236074== by 0x1B81A6A1: MPID_nem_ofi_init (ofi_init.c:377) ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) ==236074== by 0x40219D: main (one.c:335) ==236074== ==236074== Conditional jump or move depends on uninitialised value(s) ==236074== at 0x1B1DA260: __I_MPI___intel_sse2_strncmp (in /opt/intel/compilers_and_libraries_2018.2.199/linux/mpi/intel64/lib/libmpifort.so.12.0) ==236074== by 0x1B8CFBA1: ??? (simple_pmi.c:2376) ==236074== by 0x1B8CBDAD: PMIi_InitIfSingleton (simple_pmi.c:2883) ==236074== by 0x1B8CBDAD: iPMI_KVS_Get (simple_pmi.c:751) ==236074== by 0x1B7CCC1E: ??? 
(mpidi_pg.c:949) ==236074== by 0x1B817EAA: MPID_nem_ofi_post_init (ofi_init.c:1736) ==236074== by 0x1B7B3575: MPID_nem_init_post (mpid_nem_init.c:1421) ==236074== by 0x1B5806E3: MPIDI_CH3_Init (ch3_init.c:146) ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) ==236074== by 0x40219D: main (one.c:335) ==236074== ==236074== Conditional jump or move depends on uninitialised value(s) ==236074== at 0x1B1DA383: __I_MPI___intel_sse2_strncmp (in /opt/intel/compilers_and_libraries_2018.2.199/linux/mpi/intel64/lib/libmpifort.so.12.0) ==236074== by 0x1B8CFBA1: ??? (simple_pmi.c:2376) ==236074== by 0x1B8CBDAD: PMIi_InitIfSingleton (simple_pmi.c:2883) ==236074== by 0x1B8CBDAD: iPMI_KVS_Get (simple_pmi.c:751) ==236074== by 0x1B7CCC1E: ??? (mpidi_pg.c:949) ==236074== by 0x1B817EAA: MPID_nem_ofi_post_init (ofi_init.c:1736) ==236074== by 0x1B7B3575: MPID_nem_init_post (mpid_nem_init.c:1421) ==236074== by 0x1B5806E3: MPIDI_CH3_Init (ch3_init.c:146) ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) ==236074== by 0x40219D: main (one.c:335) ==236074== ==236074== Conditional jump or move depends on uninitialised value(s) ==236074== at 0x1E48032E: __intel_sse4_strcpy (in /opt/intel/compilers_and_libraries_2018.2.199/linux/compiler/lib/intel64_lin/libirc.so) ==236074== by 0x51FD8BE: PetscStrcpy (str.c:354) ==236074== by 0x51FD7A3: PetscStrallocpy (str.c:188) ==236074== by 0x52A39CE: PetscEventRegLogRegister (eventlog.c:313) ==236074== by 0x527D89A: PetscLogEventRegister (plog.c:693) ==236074== by 0x6A56A20: PCBDDCInitializePackage (bddc.c:3115) ==236074== by 0x6E1A515: PCInitializePackage (dlregisksp.c:59) ==236074== by 
0x6DB1A86: PCCreate (precon.c:382)
==236074==    by 0x6E05167: KSPGetPC (itfunc.c:1837)
==236074==    by 0x6E0FC5C: KSPSetDM (iterativ.c:1150)
==236074==    by 0x6FDD27B: SNESSetDM (snes.c:5402)
==236074==    by 0x70B85F7: TSGetSNES (ts.c:2914)
==236074==    by 0x70BE430: TSSetDM (ts.c:4949)
==236074==    by 0x402496: main (one.c:378)
==236074==
==236074== Conditional jump or move depends on uninitialised value(s)
==236074==    at 0x1E4782BA: __intel_ssse3_strncpy (in /opt/intel/compilers_and_libraries_2018.2.199/linux/compiler/lib/intel64_lin/libirc.so)
==236074==    by 0x51FFD24: PetscStrncpy (str.c:392)
==236074==    by 0x51FEB03: PetscStrreplace (str.c:1142)
==236074==    by 0x52C9958: PetscViewerFileSetName (filev.c:659)
==236074==    by 0x52B743B: PetscViewerVTKOpen (vtkv.c:279)
==236074==    by 0x70C76E6: TSMonitorSolutionVTK (ts.c:5580)
==236074==    by 0x40313C: FormFunction (one.c:120)
==236074==    by 0x7066531: TSComputeIFunction_DMDA (dmdats.c:82)
==236074==    by 0x70BA5EF: TSComputeIFunction (ts.c:857)
==236074==    by 0x711E2DC: SNESTSFormFunction_BDF (bdf.c:368)
==236074==    by 0x70C6E46: SNESTSFormFunction (ts.c:5014)
==236074==    by 0x6FDC8A6: SNESComputeFunction (snes.c:2383)
==236074==    by 0x7023556: SNESSolve_NEWTONTR (tr.c:297)
==236074==    by 0x6FD160F: SNESSolve (snes.c:4569)
==236074==    by 0x711917E: PetscMemcpy (bdf.c:223)
==236074==    by 0x711917E: PetscCitationsRegister (petscsys.h:2689)
==236074==    by 0x711917E: TSStep_BDF.A (bdf.c:265)
==236074==    by 0x70C363A: TSStep (ts.c:3757)
==236074==    by 0x70C1999: TSSolve (ts.c:4154)
==236074==    by 0x402594: main (one.c:391)
==236074==
==236074== Conditional jump or move depends on uninitialised value(s)
==236074==    at 0x1E4782BA: __intel_ssse3_strncpy (in /opt/intel/compilers_and_libraries_2018.2.199/linux/compiler/lib/intel64_lin/libirc.so)
==236074==    by 0x51FFD24: PetscStrncpy (str.c:392)
==236074==    by 0x51FEB03: PetscStrreplace (str.c:1142)
==236074==    by 0x5224E4B: PetscFOpen (mpiuopen.c:52)
==236074==    by 0x63A074B: DMDAVTKWriteAll_VTS.A (grvtk.c:72)
==236074==    by 0x639A589: DMDAVTKWriteAll (grvtk.c:545)
==236074==    by 0x52B66F3: PetscViewerFlush_VTK (vtkv.c:100)
==236074==    by 0x52CFAAE: PetscViewerFlush (flush.c:26)
==236074==    by 0x52CEA95: PetscViewerDestroy (view.c:113)
==236074==    by 0x70C7717: TSMonitorSolutionVTK (ts.c:5582)
==236074==    by 0x40313C: FormFunction (one.c:120)
==236074==    by 0x7066531: TSComputeIFunction_DMDA (dmdats.c:82)
==236074==    by 0x70BA5EF: TSComputeIFunction (ts.c:857)
==236074==    by 0x711E2DC: SNESTSFormFunction_BDF (bdf.c:368)
==236074==    by 0x70C6E46: SNESTSFormFunction (ts.c:5014)
==236074==    by 0x6FDC8A6: SNESComputeFunction (snes.c:2383)
==236074==    by 0x7023556: SNESSolve_NEWTONTR (tr.c:297)
==236074==    by 0x6FD160F: SNESSolve (snes.c:4569)
==236074==    by 0x711917E: PetscMemcpy (bdf.c:223)
==236074==    by 0x711917E: PetscCitationsRegister (petscsys.h:2689)
==236074==    by 0x711917E: TSStep_BDF.A (bdf.c:265)
==236074==    by 0x70C363A: TSStep (ts.c:3757)
==236074==
==236074== Conditional jump or move depends on uninitialised value(s)
==236074==    at 0x5F10977: MatFDColoringSetUpBlocked_AIJ_Private (fdaij.c:146)
==236074==    by 0x5F10977: MatFDColoringSetUp_SeqXAIJ.A (fdaij.c:284)
==236074==    by 0x585892A: MatFDColoringSetUp (fdmatrix.c:242)
==236074==    by 0x6FE5037: SNESComputeJacobianDefaultColor (snesj2.c:79)
==236074==    by 0x6FC8E4E: SNESComputeJacobian (snes.c:2717)
==236074==    by 0x70236E7: SNESSolve_NEWTONTR (tr.c:324)
==236074==    by 0x6FD160F: SNESSolve (snes.c:4569)
==236074==    by 0x711917E: PetscMemcpy (bdf.c:223)
==236074==    by 0x711917E: PetscCitationsRegister (petscsys.h:2689)
==236074==    by 0x711917E: TSStep_BDF.A (bdf.c:265)
==236074==    by 0x70C363A: TSStep (ts.c:3757)
==236074==    by 0x70C1999: TSSolve (ts.c:4154)
==236074==    by 0x402594: main (one.c:391)
==236074==
==236074== Invalid write of size 4
==236074==    at 0x5F10983: MatFDColoringSetUpBlocked_AIJ_Private (fdaij.c:150)
==236074==    by 0x5F10983: MatFDColoringSetUp_SeqXAIJ.A (fdaij.c:284)
==236074==    by 0x585892A: MatFDColoringSetUp (fdmatrix.c:242)
==236074==    by 0x6FE5037: SNESComputeJacobianDefaultColor (snesj2.c:79)
==236074==    by 0x6FC8E4E: SNESComputeJacobian (snes.c:2717)
==236074==    by 0x70236E7: SNESSolve_NEWTONTR (tr.c:324)
==236074==    by 0x6FD160F: SNESSolve (snes.c:4569)
==236074==    by 0x711917E: PetscMemcpy (bdf.c:223)
==236074==    by 0x711917E: PetscCitationsRegister (petscsys.h:2689)
==236074==    by 0x711917E: TSStep_BDF.A (bdf.c:265)
==236074==    by 0x70C363A: TSStep (ts.c:3757)
==236074==    by 0x70C1999: TSSolve (ts.c:4154)
==236074==    by 0x402594: main (one.c:391)
==236074== Address 0x3a94fa80 is 0 bytes after a block of size 73,960,000 alloc'd
==236074==    at 0x4C2C480: memalign (vg_replace_malloc.c:909)
==236074==    by 0x522FFE2: PetscMallocAlign (mal.c:52)
==236074==    by 0x52305F9: PetscMallocA (mal.c:418)
==236074==    by 0x5F10778: MatFDColoringSetUpBlocked_AIJ_Private (fdaij.c:125)
==236074==    by 0x5F10778: MatFDColoringSetUp_SeqXAIJ.A (fdaij.c:284)
==236074==    by 0x585892A: MatFDColoringSetUp (fdmatrix.c:242)
==236074==    by 0x6FE5037: SNESComputeJacobianDefaultColor (snesj2.c:79)
==236074==    by 0x6FC8E4E: SNESComputeJacobian (snes.c:2717)
==236074==    by 0x70236E7: SNESSolve_NEWTONTR (tr.c:324)
==236074==    by 0x6FD160F: SNESSolve (snes.c:4569)
==236074==    by 0x711917E: PetscMemcpy (bdf.c:223)
==236074==    by 0x711917E: PetscCitationsRegister (petscsys.h:2689)
==236074==    by 0x711917E: TSStep_BDF.A (bdf.c:265)
==236074==    by 0x70C363A: TSStep (ts.c:3757)
==236074==    by 0x70C1999: TSSolve (ts.c:4154)
==236074==    by 0x402594: main (one.c:391)
==236074==
==236074== Invalid write of size 8
==236074==    at 0x5F10991: MatFDColoringSetUpBlocked_AIJ_Private (fdaij.c:151)
==236074==    by 0x5F10991: MatFDColoringSetUp_SeqXAIJ.A (fdaij.c:284)
==236074==    by 0x585892A: MatFDColoringSetUp (fdmatrix.c:242)
==236074==    by 0x6FE5037: SNESComputeJacobianDefaultColor (snesj2.c:79)
==236074==    by 0x6FC8E4E: SNESComputeJacobian (snes.c:2717)
==236074==    by 0x70236E7: SNESSolve_NEWTONTR (tr.c:324)
==236074==    by 0x6FD160F: SNESSolve (snes.c:4569)
==236074==    by 0x711917E: PetscMemcpy (bdf.c:223)
==236074==    by 0x711917E: PetscCitationsRegister (petscsys.h:2689)
==236074==    by 0x711917E: TSStep_BDF.A (bdf.c:265)
==236074==    by 0x70C363A: TSStep (ts.c:3757)
==236074==    by 0x70C1999: TSSolve (ts.c:4154)
==236074==    by 0x402594: main (one.c:391)
==236074== Address 0x3a94fa88 is 8 bytes after a block of size 73,960,000 alloc'd
==236074==    at 0x4C2C480: memalign (vg_replace_malloc.c:909)
==236074==    by 0x522FFE2: PetscMallocAlign (mal.c:52)
==236074==    by 0x52305F9: PetscMallocA (mal.c:418)
==236074==    by 0x5F10778: MatFDColoringSetUpBlocked_AIJ_Private (fdaij.c:125)
==236074==    by 0x5F10778: MatFDColoringSetUp_SeqXAIJ.A (fdaij.c:284)
==236074==    by 0x585892A: MatFDColoringSetUp (fdmatrix.c:242)
==236074==    by 0x6FE5037: SNESComputeJacobianDefaultColor (snesj2.c:79)
==236074==    by 0x6FC8E4E: SNESComputeJacobian (snes.c:2717)
==236074==    by 0x70236E7: SNESSolve_NEWTONTR (tr.c:324)
==236074==    by 0x6FD160F: SNESSolve (snes.c:4569)
==236074==    by 0x711917E: PetscMemcpy (bdf.c:223)
==236074==    by 0x711917E: PetscCitationsRegister (petscsys.h:2689)
==236074==    by 0x711917E: TSStep_BDF.A (bdf.c:265)
==236074==    by 0x70C363A: TSStep (ts.c:3757)
==236074==    by 0x70C1999: TSSolve (ts.c:4154)
==236074==    by 0x402594: main (one.c:391)
==236074==

Sent from Mail for Windows
-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From balay at mcs.anl.gov  Thu Nov 3 10:32:14 2022
From: balay at mcs.anl.gov (Satish Balay)
Date: Thu, 3 Nov 2022 10:32:14 -0500 (CDT)
Subject: [petsc-users] Redefining MPI functions as macros can break C++ code
In-Reply-To: 
References: 
Message-ID: <02a5f4c5-c8d7-7657-f34d-3832f6dafba1@mcs.anl.gov>

And I see the issue got fixed differently [with braces]:
https://github.com/AMReX-Codes/amrex/pull/3011/files

Satish

On Wed, 2 Nov 2022, Satish Balay via petsc-users wrote:

> You can define 'PETSC_HAVE_BROKEN_RECURSIVE_MACRO' and then include
> petsc.h in your sources to avoid these macros in amrex/application
> codes.
>
> PETSc logging is one of the important features - it's best to not
> disable it (globally, for all users) due to this issue.
>
> Satish
>
> On Wed, 2 Nov 2022, Erik Schnetter wrote:
>
> > PETSc redefines MPI functions as macros when logging is enabled. This
> > breaks some C++ code; see e.g.
> > <https://github.com/AMReX-Codes/amrex/pull/3005> for an example. The reason
> > is that macros get confused about commas in template arguments.
> >
> > It would be convenient if PETSc used a different way to log MPI function
> > calls, but I can't think of a good way. Alternatively, logging could be
> > disabled by default, or MPI logging could be disabled by default, or there
> > could be a simple way to opt out (e.g. use `#define PETSC_LOG_MPI` after
> > `#include ` to enable it for a source file).
> >
> > -erik

From ahmedlp9 at gmail.com  Thu Nov 3 10:49:00 2022
From: ahmedlp9 at gmail.com (Ahmed Mansur)
Date: Thu, 3 Nov 2022 11:49:00 -0400
Subject: [petsc-users] Doubt about PCILU
In-Reply-To: 
References: 
Message-ID: 

Thanks a lot.

On Thu, Nov 3, 2022, 10:22 AM, Zhang, Hong wrote:

> PETSc does not support 'ilutp'. Sequential SuperLU supports it.
> You can install petsc with superlu, then use runtime options to activate 'ilutp', e.g.,
> petsc/src/ksp/ksp/tutorials
> ./ex2 -ksp_view -pc_type ilu -pc_factor_mat_solver_type superlu -help |grep superlu
> ...
> -mat_superlu_replacetinypivot: ReplaceTinyPivot (None)
> ...
> -mat_superlu_ilu_droptol <0.0001 : 0.0001>: ILU_DropTol (None)
> -mat_superlu_ilu_filltol <0.01 : 0.01>: ILU_FillTol (None)
> -mat_superlu_ilu_fillfactor <10. : 10.>: ILU_FillFactor (None)
> -mat_superlu_ilu_droprull : ILU_DropRule (None)
>
> Hong
> ------------------------------
> *From:* petsc-users on behalf of Ahmed Mansur
> *Sent:* Thursday, November 3, 2022 8:49 AM
> *To:* petsc-users
> *Subject:* [petsc-users] Doubt about PCILU
>
> Hi, I'm trying to use ILU as a GMRES preconditioner (using PCILU). My
> question is how to use ILU of the 'ilutp' type, as in MATLAB (ILU
> factorization with threshold and pivoting).
> Thanks. Regards

-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From edoardo.alinovi at gmail.com  Thu Nov 3 11:16:16 2022
From: edoardo.alinovi at gmail.com (Edoardo alinovi)
Date: Thu, 3 Nov 2022 17:16:16 +0100
Subject: [petsc-users] On the usage of MatSetValuesBlocked
Message-ID: 

Hello Jed/Barry/Petsc friends,

I am trying to assemble a block matrix with 3x3 blocks in 2D and 4x4 blocks
in 3D coming from the fully coupled NS equations. I am not sure I am
understanding the example provided here:
https://petsc.org/main/docs/manualpages/Mat/MatSetValuesBlocked/

The description says that "v - a logically two-dimensional array of
values", while in the example it is passed as a 1D array. Should I pass a
2D array like v(1:nComp,1:nComp) or a 1D array v(1:nComp*nComp) to
MatSetValuesBlocked (I am using Fortran, which is why I start from 1 in my
arrays)? Am I right that idxm and idxn are the global indices of each
block? I guess PETSc will correctly assign each block row and column as it
knows the matrix has a given structure.

Thank you as always!
-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From bsmith at petsc.dev  Thu Nov 3 11:18:02 2022
From: bsmith at petsc.dev (Barry Smith)
Date: Thu, 3 Nov 2022 12:18:02 -0400
Subject: [petsc-users] [External] Periodic boundary condition
In-Reply-To: 
References: 
Message-ID: <8213DE9B-FD24-42F5-A293-5F124764937D@petsc.dev>

Can you send the code that just does what you indicate below in the FormFunction() and crashes? Then I can run it directly and track down the issue.

  Barry

> On Nov 3, 2022, at 10:56 AM, Sepideh Kavousi wrote:
>
> Barry,
> Even for the case where I am not solving any PDE equations in the FormFunction (by setting:
>   aF[j][i].vx=aY[j][i].vx;
>   aF[j][i].vy=aY[j][i].vy;
>   aF[j][i].pp=aY[j][i].pp;
>   aF[j][i].U=aY[j][i].U;
>   aF[j][i].p=aY[j][i].p; )
> I still run into a segmentation error.
>
> Let me just follow what you suggested in the following link: https://lists.mcs.anl.gov/pipermail/petsc-users/2012-May/013476.html
>
> src/ksp/ksp/tutorials/ex45.c runs perfectly, but when I change the bc along the x direction from DM_BOUNDARY_NONE to DM_BOUNDARY_PERIODIC and delete (i==0 || i==mx-1) from if (i==0 || j==0 || k==0 || i==mx-1 || j==my-1 || k==mz-1), I run into the following error.
> I am not sure how else I should implement a periodic bc in a problem.
>
> [0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
> [0]PETSC ERROR: Object is in wrong state
> [0]PETSC ERROR: Matrix is missing diagonal entry 5
> [0]PETSC ERROR: See https://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
> [1]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
> [1]PETSC ERROR: Object is in wrong state
> [1]PETSC ERROR: Matrix is missing diagonal entry 5
> [1]PETSC ERROR: See https://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
> [1]PETSC ERROR: [2]PETSC ERROR: --------------------- Error Message -------------------------------------------------------------- > [2]PETSC ERROR: Object is in wrong state > [2]PETSC ERROR: Matrix is missing diagonal entry 5 > [2]PETSC ERROR: See https://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting. > [2]PETSC ERROR: Petsc Release Version 3.14.2, Dec 03, 2020 > [2]PETSC ERROR: /scratch/07065/tg863649/convection/periodic_test/one.out on a skylake named c402-092.stampede2.tacc.utexas.edu by tg863649 Thu Nov 3 09:53:00 2022 > [2]PETSC ERROR: [3]PETSC ERROR: --------------------- Error Message -------------------------------------------------------------- > [3]PETSC ERROR: Object is in wrong state > [3]PETSC ERROR: Matrix is missing diagonal entry 5 > [3]PETSC ERROR: See https://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting. > [3]PETSC ERROR: Petsc Release Version 3.14.2, Dec 03, 2020 > [3]PETSC ERROR: /scratch/07065/tg863649/convection/periodic_test/one.out on a skylake named c402-092.stampede2.tacc.utexas.edu by tg863649 Thu Nov 3 09:53:00 2022 > [3]PETSC ERROR: [0]PETSC ERROR: Petsc Release Version 3.14.2, Dec 03, 2020 > [0]PETSC ERROR: /scratch/07065/tg863649/convection/periodic_test/one.out on a skylake named c402-092.stampede2.tacc.utexas.edu by tg863649 Thu Nov 3 09:53:00 2022 > [0]PETSC ERROR: Configure options --with-x=0 -with-pic --with-make-np=12 --download-petsc4py=1 --with-python-exec=/opt/apps/intel18/python2/2.7.16/bin/python2 --with-packages-build-dir=/tmp/petsc-3.14/skylake --with-mpi=1 --with-mpi-dir=/opt/intel/compilers_and_libraries_2018.2.199/linux/mpi/intel64 --with-scalar-type=real --with-shared-libraries=1 --with-precision=double --with-chaco=1 --download-chaco --with-hypre=1 --download-hypre --with-parmetis=1 --download-parmetis --with-metis=1 --download-metis --with-plapack=1 --download-plapack --with-spai=1 --download-spai --with-sundials=1 --download-sundials --with-elemental=1 
--download-elemental --with-cxx-dialect=C++11 --with-parmetis=1 --download-parmetis --with-metis=1 --download-metis --with-fftw=1 --with-fftw-dir=/opt/apps/intel18/impi18_0/fftw3/3.3.8 --with-hdf5=1 --with-hdf5-dir=/opt/apps/intel18/impi18_0/phdf5/1.10.4/x86_64 --download-hpddm --download-slepc --with-mumps=1 --download-mumps --with-parmetis=1 --download-parmetis --with-metis=1 --download-metis --with-scalapack=1 --download-scalapack --with-blacs=1 --download-blacs --with-spooles=1 --download-spooles --with-suitesparse=1 --download-suitesparse --with-superlu_dist=1 --download-superlu_dist --with-superlu=1 --download-superlu --with-parmetis=1 --download-parmetis --with-metis=1 --download-metis --with-zoltan=1 --download-zoltan=1 --download-ptscotch=1 --with-debugging=no --LIBS= --with-blaslapack-dir=/opt/intel/compilers_and_libraries_2018.2.199/linux/mkl COPTFLAGS="-xCORE-AVX2 -axMIC-AVX512,CORE-AVX512 -O2 -g" FOPTFLAGS="-xCORE-AVX2 -axMIC-AVX512,CORE-AVX512 -O2 -g" CXXOPTFLAGS="-xCORE-AVX2 -axMIC-AVX512,CORE-AVX512 -O2 -g" > Petsc Release Version 3.14.2, Dec 03, 2020 > [1]PETSC ERROR: /scratch/07065/tg863649/convection/periodic_test/one.out on a skylake named c402-092.stampede2.tacc.utexas.edu by tg863649 Thu Nov 3 09:53:00 2022 > [1]PETSC ERROR: [0]PETSC ERROR: #1 MatILUFactorSymbolic_SeqAIJ() line 1686 in /home1/apps/intel18/impi18_0/petsc/3.14/src/mat/impls/aij/seq/aijfact.c > Configure options --with-x=0 -with-pic --with-make-np=12 --download-petsc4py=1 --with-python-exec=/opt/apps/intel18/python2/2.7.16/bin/python2 --with-packages-build-dir=/tmp/petsc-3.14/skylake --with-mpi=1 --with-mpi-dir=/opt/intel/compilers_and_libraries_2018.2.199/linux/mpi/intel64 --with-scalar-type=real --with-shared-libraries=1 --with-precision=double --with-chaco=1 --download-chaco --with-hypre=1 --download-hypre --with-parmetis=1 --download-parmetis --with-metis=1 --download-metis --with-plapack=1 --download-plapack --with-spai=1 --download-spai --with-sundials=1 --download-sundials 
--with-elemental=1 --download-elemental --with-cxx-dialect=C++11 --with-parmetis=1 --download-parmetis --with-metis=1 --download-metis --with-fftw=1 --with-fftw-dir=/opt/apps/intel18/impi18_0/fftw3/3.3.8 --with-hdf5=1 --with-hdf5-dir=/opt/apps/intel18/impi18_0/phdf5/1.10.4/x86_64 --download-hpddm --download-slepc --with-mumps=1 --download-mumps --with-parmetis=1 --download-parmetis --with-metis=1 --download-metis --with-scalapack=1 --download-scalapack --with-blacs=1 --download-blacs --with-spooles=1 --download-spooles --with-suitesparse=1 --download-suitesparse --with-superlu_dist=1 --download-superlu_dist --with-superlu=1 --download-superlu --with-parmetis=1 --download-parmetis --with-metis=1 --download-metis --with-zoltan=1 --download-zoltan=1 --download-ptscotch=1 --with-debugging=no --LIBS= --with-blaslapack-dir=/opt/intel/compilers_and_libraries_2018.2.199/linux/mkl COPTFLAGS="-xCORE-AVX2 -axMIC-AVX512,CORE-AVX512 -O2 -g" FOPTFLAGS="-xCORE-AVX2 -axMIC-AVX512,CORE-AVX512 -O2 -g" CXXOPTFLAGS="-xCORE-AVX2 -axMIC-AVX512,CORE-AVX512 -O2 -g" > [2]PETSC ERROR: #1 MatILUFactorSymbolic_SeqAIJ() line 1686 in /home1/apps/intel18/impi18_0/petsc/3.14/src/mat/impls/aij/seq/aijfact.c > Configure options --with-x=0 -with-pic --with-make-np=12 --download-petsc4py=1 --with-python-exec=/opt/apps/intel18/python2/2.7.16/bin/python2 --with-packages-build-dir=/tmp/petsc-3.14/skylake --with-mpi=1 --with-mpi-dir=/opt/intel/compilers_and_libraries_2018.2.199/linux/mpi/intel64 --with-scalar-type=real --with-shared-libraries=1 --with-precision=double --with-chaco=1 --download-chaco --with-hypre=1 --download-hypre --with-parmetis=1 --download-parmetis --with-metis=1 --download-metis --with-plapack=1 --download-plapack --with-spai=1 --download-spai --with-sundials=1 --download-sundials --with-elemental=1 --download-elemental --with-cxx-dialect=C++11 --with-parmetis=1 --download-parmetis --with-metis=1 --download-metis --with-fftw=1 --with-fftw-dir=/opt/apps/intel18/impi18_0/fftw3/3.3.8 
--with-hdf5=1 --with-hdf5-dir=/opt/apps/intel18/impi18_0/phdf5/1.10.4/x86_64 --download-hpddm --download-slepc --with-mumps=1 --download-mumps --with-parmetis=1 --download-parmetis --with-metis=1 --download-metis --with-scalapack=1 --download-scalapack --with-blacs=1 --download-blacs --with-spooles=1 --download-spooles --with-suitesparse=1 --download-suitesparse --with-superlu_dist=1 --download-superlu_dist --with-superlu=1 --download-superlu --with-parmetis=1 --download-parmetis --with-metis=1 --download-metis --with-zoltan=1 --download-zoltan=1 --download-ptscotch=1 --with-debugging=no --LIBS= --with-blaslapack-dir=/opt/intel/compilers_and_libraries_2018.2.199/linux/mkl COPTFLAGS="-xCORE-AVX2 -axMIC-AVX512,CORE-AVX512 -O2 -g" FOPTFLAGS="-xCORE-AVX2 -axMIC-AVX512,CORE-AVX512 -O2 -g" CXXOPTFLAGS="-xCORE-AVX2 -axMIC-AVX512,CORE-AVX512 -O2 -g" > [3]PETSC ERROR: #1 MatILUFactorSymbolic_SeqAIJ() line 1686 in /home1/apps/intel18/impi18_0/petsc/3.14/src/mat/impls/aij/seq/aijfact.c > Configure options --with-x=0 -with-pic --with-make-np=12 --download-petsc4py=1 --with-python-exec=/opt/apps/intel18/python2/2.7.16/bin/python2 --with-packages-build-dir=/tmp/petsc-3.14/skylake --with-mpi=1 --with-mpi-dir=/opt/intel/compilers_and_libraries_2018.2.199/linux/mpi/intel64 --with-scalar-type=real --with-shared-libraries=1 --with-precision=double --with-chaco=1 --download-chaco --with-hypre=1 --download-hypre --with-parmetis=1 --download-parmetis --with-metis=1 --download-metis --with-plapack=1 --download-plapack --with-spai=1 --download-spai --with-sundials=1 --download-sundials --with-elemental=1 --download-elemental --with-cxx-dialect=C++11 --with-parmetis=1 --download-parmetis --with-metis=1 --download-metis --with-fftw=1 --with-fftw-dir=/opt/apps/intel18/impi18_0/fftw3/3.3.8 --with-hdf5=1 --with-hdf5-dir=/opt/apps/intel18/impi18_0/phdf5/1.10.4/x86_64 --download-hpddm --download-slepc --with-mumps=1 --download-mumps --with-parmetis=1 --download-parmetis --with-metis=1 
--download-metis --with-scalapack=1 --download-scalapack --with-blacs=1 --download-blacs --with-spooles=1 --download-spooles --with-suitesparse=1 --download-suitesparse --with-superlu_dist=1 --download-superlu_dist --with-superlu=1 --download-superlu --with-parmetis=1 --download-parmetis --with-metis=1 --download-metis --with-zoltan=1 --download-zoltan=1 --download-ptscotch=1 --with-debugging=no --LIBS= --with-blaslapack-dir=/opt/intel/compilers_and_libraries_2018.2.199/linux/mkl COPTFLAGS="-xCORE-AVX2 -axMIC-AVX512,CORE-AVX512 -O2 -g" FOPTFLAGS="-xCORE-AVX2 -axMIC-AVX512,CORE-AVX512 -O2 -g" CXXOPTFLAGS="-xCORE-AVX2 -axMIC-AVX512,CORE-AVX512 -O2 -g" > [1]PETSC ERROR: #1 MatILUFactorSymbolic_SeqAIJ() line 1686 in /home1/apps/intel18/impi18_0/petsc/3.14/src/mat/impls/aij/seq/aijfact.c > [0]PETSC ERROR: #2 MatILUFactorSymbolic() line 6710 in /home1/apps/intel18/impi18_0/petsc/3.14/src/mat/interface/matrix.c > [1]PETSC ERROR: #2 MatILUFactorSymbolic() line 6710 in /home1/apps/intel18/impi18_0/petsc/3.14/src/mat/interface/matrix.c > [1]PETSC ERROR: #3 PCSetUp_ILU() line 141 in /home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/pc/impls/factor/ilu/ilu.c > [2]PETSC ERROR: #2 MatILUFactorSymbolic() line 6710 in /home1/apps/intel18/impi18_0/petsc/3.14/src/mat/interface/matrix.c > [2]PETSC ERROR: #3 PCSetUp_ILU() line 141 in /home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/pc/impls/factor/ilu/ilu.c > [2]PETSC ERROR: [3]PETSC ERROR: #2 MatILUFactorSymbolic() line 6710 in /home1/apps/intel18/impi18_0/petsc/3.14/src/mat/interface/matrix.c > [3]PETSC ERROR: #3 PCSetUp_ILU() line 141 in /home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/pc/impls/factor/ilu/ilu.c > [3]PETSC ERROR: #4 PCSetUp() line 1009 in /home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/pc/interface/precon.c > [3]PETSC ERROR: [0]PETSC ERROR: #3 PCSetUp_ILU() line 141 in /home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/pc/impls/factor/ilu/ilu.c > [0]PETSC ERROR: #4 PCSetUp() line 1009 in 
/home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/pc/interface/precon.c > [0]PETSC ERROR: #5 KSPSetUp() line 406 in /home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/ksp/interface/itfunc.c > [1]PETSC ERROR: #4 PCSetUp() line 1009 in /home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/pc/interface/precon.c > [1]PETSC ERROR: #5 KSPSetUp() line 406 in /home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/ksp/interface/itfunc.c > #4 PCSetUp() line 1009 in /home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/pc/interface/precon.c > [2]PETSC ERROR: #5 KSPSetUp() line 406 in /home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/ksp/interface/itfunc.c > #5 KSPSetUp() line 406 in /home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/ksp/interface/itfunc.c > [0]PETSC ERROR: [2]PETSC ERROR: #6 KSPSolve_Private() line 658 in /home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/ksp/interface/itfunc.c > [2]PETSC ERROR: [3]PETSC ERROR: #6 KSPSolve_Private() line 658 in /home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/ksp/interface/itfunc.c > [3]PETSC ERROR: #7 KSPSolve() line 889 in /home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/ksp/interface/itfunc.c > #6 KSPSolve_Private() line 658 in /home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/ksp/interface/itfunc.c > [0]PETSC ERROR: #7 KSPSolve() line 889 in /home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/ksp/interface/itfunc.c > [1]PETSC ERROR: #6 KSPSolve_Private() line 658 in /home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/ksp/interface/itfunc.c > [1]PETSC ERROR: #7 KSPSolve() line 889 in /home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/ksp/interface/itfunc.c > #7 KSPSolve() line 889 in /home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/ksp/interface/itfunc.c > [1]PETSC ERROR: [2]PETSC ERROR: #8 DMDAGetFaceInterpolation() line 493 in /home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/pc/impls/wb/wb.c > [2]PETSC ERROR: [0]PETSC ERROR: #8 DMDAGetFaceInterpolation() line 493 in /home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/pc/impls/wb/wb.c > #8 DMDAGetFaceInterpolation() line 
493 in /home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/pc/impls/wb/wb.c > [1]PETSC ERROR: #9 PCSetUp_Exotic() line 667 in /home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/pc/impls/wb/wb.c > [1]PETSC ERROR: #10 PCSetUp() line 1009 in /home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/pc/interface/precon.c > #9 PCSetUp_Exotic() line 667 in /home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/pc/impls/wb/wb.c > [2]PETSC ERROR: #10 PCSetUp() line 1009 in /home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/pc/interface/precon.c > [2]PETSC ERROR: #11 KSPSetUp() line 406 in /home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/ksp/interface/itfunc.c > [0]PETSC ERROR: #9 PCSetUp_Exotic() line 667 in /home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/pc/impls/wb/wb.c > [0]PETSC ERROR: #10 PCSetUp() line 1009 in /home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/pc/interface/precon.c > [0]PETSC ERROR: [1]PETSC ERROR: #11 KSPSetUp() line 406 in /home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/ksp/interface/itfunc.c > [1]PETSC ERROR: #12 KSPSolve_Private() line 658 in /home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/ksp/interface/itfunc.c > [1]PETSC ERROR: #13 KSPSolve() line 889 in /home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/ksp/interface/itfunc.c > [2]PETSC ERROR: #12 KSPSolve_Private() line 658 in /home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/ksp/interface/itfunc.c > [2]PETSC ERROR: #13 KSPSolve() line 889 in /home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/ksp/interface/itfunc.c > [3]PETSC ERROR: #8 DMDAGetFaceInterpolation() line 493 in /home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/pc/impls/wb/wb.c > [3]PETSC ERROR: #9 PCSetUp_Exotic() line 667 in /home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/pc/impls/wb/wb.c > [3]PETSC ERROR: #10 PCSetUp() line 1009 in /home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/pc/interface/precon.c > [3]PETSC ERROR: #11 KSPSetUp() line 406 in /home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/ksp/interface/itfunc.c > [3]PETSC ERROR: #12 KSPSolve_Private() line 658 in 
/home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/ksp/interface/itfunc.c > #11 KSPSetUp() line 406 in /home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/ksp/interface/itfunc.c > [0]PETSC ERROR: #12 KSPSolve_Private() line 658 in /home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/ksp/interface/itfunc.c > [0]PETSC ERROR: #13 KSPSolve() line 889 in /home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/ksp/interface/itfunc.c > [3]PETSC ERROR: #13 KSPSolve() line 889 in /home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/ksp/interface/itfunc.c > [1]PETSC ERROR: [0]PETSC ERROR: #14 main() line 51 in /scratch/07065/tg863649/convection/periodic_test/one.c > #14 main() line 51 in /scratch/07065/tg863649/convection/periodic_test/one.c > [1]PETSC ERROR: PETSc Option Table entries: > [1]PETSC ERROR: -ksp_monitor_short > [2]PETSC ERROR: #14 main() line 51 in /scratch/07065/tg863649/convection/periodic_test/one.c > [2]PETSC ERROR: PETSc Option Table entries: > [2]PETSC ERROR: -ksp_monitor_short > [2]PETSC ERROR: [0]PETSC ERROR: PETSc Option Table entries: > [0]PETSC ERROR: -ksp_monitor_short > [0]PETSC ERROR: -ksp_type fgmres > [0]PETSC ERROR: -mg_levels_ksp_max_it 1 > [0]PETSC ERROR: -mg_levels_ksp_type gmres > [0]PETSC ERROR: -mg_levels_pc_type bjacobi > [1]PETSC ERROR: -ksp_type fgmres > [1]PETSC ERROR: -mg_levels_ksp_max_it 1 > [1]PETSC ERROR: -mg_levels_ksp_type gmres > [1]PETSC ERROR: -mg_levels_pc_type bjacobi > [1]PETSC ERROR: -pc_type exotic > [1]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-maint at mcs.anl.gov ---------- > -ksp_type fgmres > [2]PETSC ERROR: -mg_levels_ksp_max_it 1 > [2]PETSC ERROR: -mg_levels_ksp_type gmres > [2]PETSC ERROR: -mg_levels_pc_type bjacobi > [2]PETSC ERROR: -pc_type exotic > [2]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-maint at mcs.anl.gov ---------- > [3]PETSC ERROR: #14 main() line 51 in /scratch/07065/tg863649/convection/periodic_test/one.c > [3]PETSC ERROR: 
PETSc Option Table entries:
> [3]PETSC ERROR: -ksp_monitor_short
> [3]PETSC ERROR: -ksp_type fgmres
> [3]PETSC ERROR: -mg_levels_ksp_max_it 1
> [3]PETSC ERROR: -mg_levels_ksp_type gmres
> [3]PETSC ERROR: -mg_levels_pc_type bjacobi
> [3]PETSC ERROR: -pc_type exotic
> [3]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-maint at mcs.anl.gov ----------
> [0]PETSC ERROR: -pc_type exotic
> [0]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-maint at mcs.anl.gov ----------
> application called MPI_Abort(MPI_COMM_SELF, 51073) - process 0
> application called MPI_Abort(MPI_COMM_SELF, 51073) - process 0
> application called MPI_Abort(MPI_COMM_SELF, 51073) - process 0
> application called MPI_Abort(MPI_COMM_SELF, 51073) - process 0
> TACC: MPI job exited with code: 129
> TACC: Shutdown complete. Exiting.
>
> Sent from Mail for Windows
>
> From: Barry Smith
> Sent: Tuesday, October 25, 2022 7:24 PM
> To: Sepideh Kavousi
> Cc: petsc-users at mcs.anl.gov
> Subject: Re: [External] [petsc-users] Periodic boundary condition
>
> Sorry, I was not clear: at this point you need to type c for continue, and then when it crashes in the debugger, type bt.
>
> Barry
>
> On Oct 25, 2022, at 6:37 PM, Sepideh Kavousi wrote:
>
> Hello Barry,
> When I ran with , the error is about the PetscInitialize line (Line 333). When I write bt multiple times, it just keeps referring to this line.
> #0  0x00002b701cfed9fd in nanosleep () from /lib64/libc.so.6
> #1  0x00002b701cfed894 in sleep () from /lib64/libc.so.6
> #2  0x00002b70035fb4ae in PetscSleep (s=1) at /home1/apps/intel18/impi18_0/petsc/3.14/src/sys/utils/psleep.c:46
> #3  0x00002b700364b8bb in PetscAttachDebugger () at /home1/apps/intel18/impi18_0/petsc/3.14/src/sys/error/adebug.c:405
> #4  0x00002b700366cfcd in PetscOptionsCheckInitial_Private (help=0x7ffec24c7940 "\t") at /home1/apps/intel18/impi18_0/petsc/3.14/src/sys/objects/init.c:608
> #5  0x00002b7003674cd6 in PetscInitialize (argc=0x7ffec24c7940, args=0x7ffec24c7940, file=0x0, help=0xffffffffffffffff)
>     at /home1/apps/intel18/impi18_0/petsc/3.14/src/sys/objects/pinit.c:1025
> #6  0x00000000004021ce in main (argc=24, argv=0x7ffec24d14e8) at /scratch/07065/tg863649/convection/test-a9-3-options_small_MAC_pressure_old/one.c:333
>
> Best,
> Sepideh
>
> Sent from Mail for Windows
>
> From: Barry Smith
> Sent: Friday, October 21, 2022 10:54 AM
> To: Sepideh Kavousi
> Cc: petsc-users at mcs.anl.gov
> Subject: Re: [External] [petsc-users] Periodic boundary condition
>
> The problem with the output below is that it gives no clear indication of where the crash occurred:
>
>   #1 User provided function() line 0 in unknown file
>
> Run with the exact same options but also -start_in_debugger noxterm. It should then crash in the debugger, and you can type bt to see the backtrace of where it crashed; send that output.
>
> Barry
>
> Background: MatFDColoringSetUpBlocked_AIJ_Private() allocates the space that is used when evaluating the function multiple times to get the Jacobian entries. If the FormFunction writes into incorrect locations, then it will corrupt this memory that was allocated in MatFDColoringSetUpBlocked_AIJ_Private(). It does not necessarily mean that there is anything wrong in MatFDColoringSetUpBlocked_AIJ_Private().
>
> On Oct 21, 2022, at 12:32 AM, Sepideh Kavousi wrote:
>
> Barry,
> I ran the code with -malloc_debug and added CHKMEMQ for all the lines inside FormFunction. Following is the detail of the error.
> Best,
> Sepideh
>
> [0]PETSC ERROR: ------------------------------------------------------------------------
> [0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range
> [0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
> [0]PETSC ERROR: or see https://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
> [0]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors
> [0]PETSC ERROR: configure using --with-debugging=yes, recompile, link, and run
> [0]PETSC ERROR: to get more information on the crash.
> [0]PETSC ERROR: PetscMallocValidate: error detected at PetscError() line 401 in /home1/apps/intel18/impi18_0/petsc/3.14/src/sys/error/err.c
> [0]PETSC ERROR: Memory [id=0(73960000)] at address 0x2b5aed6ab050 is corrupted (probably write past end of array)
> [0]PETSC ERROR: Memory originally allocated in MatFDColoringSetUpBlocked_AIJ_Private() line 125 in /home1/apps/intel18/impi18_0/petsc/3.14/src/mat/impls/aij/seq/fdaij.c
> [0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
> [0]PETSC ERROR: Signal received
> [0]PETSC ERROR: See https://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
> [0]PETSC ERROR: Petsc Release Version 3.14.2, Dec 03, 2020 > [0]PETSC ERROR: ./one.out on a skylake named c415-063.stampede2.tacc.utexas.edu by tg863649 Thu Oct 20 23:30:05 2022 > [0]PETSC ERROR: Configure options --with-x=0 -with-pic --with-make-np=12 --download-petsc4py=1 --with-python-exec=/opt/apps/intel18/python2/2.7.16/bin/python2 --with-packages-build-dir=/tmp/petsc-3.14/skylake --with-mpi=1 --with-mpi-dir=/opt/intel/compilers_and_libraries_2018.2.199/linux/mpi/intel64 --with-scalar-type=real --with-shared-libraries=1 --with-precision=double --with-chaco=1 --download-chaco --with-hypre=1 --download-hypre --with-parmetis=1 --download-parmetis --with-metis=1 --download-metis --with-plapack=1 --download-plapack --with-spai=1 --download-spai --with-sundials=1 --download-sundials --with-elemental=1 --download-elemental --with-cxx-dialect=C++11 --with-parmetis=1 --download-parmetis --with-metis=1 --download-metis --with-fftw=1 --with-fftw-dir=/opt/apps/intel18/impi18_0/fftw3/3.3.8 --with-hdf5=1 --with-hdf5-dir=/opt/apps/intel18/impi18_0/phdf5/1.10.4/x86_64 --download-hpddm --download-slepc --with-mumps=1 --download-mumps --with-parmetis=1 --download-parmetis --with-metis=1 --download-metis --with-scalapack=1 --download-scalapack --with-blacs=1 --download-blacs --with-spooles=1 --download-spooles --with-suitesparse=1 --download-suitesparse --with-superlu_dist=1 --download-superlu_dist --with-superlu=1 --download-superlu --with-parmetis=1 --download-parmetis --with-metis=1 --download-metis --with-zoltan=1 --download-zoltan=1 --download-ptscotch=1 --with-debugging=no --LIBS= --with-blaslapack-dir=/opt/intel/compilers_and_libraries_2018.2.199/linux/mkl COPTFLAGS="-xCORE-AVX2 -axMIC-AVX512,CORE-AVX512 -O2 -g" FOPTFLAGS="-xCORE-AVX2 -axMIC-AVX512,CORE-AVX512 -O2 -g" CXXOPTFLAGS="-xCORE-AVX2 -axMIC-AVX512,CORE-AVX512 -O2 -g" > [0]PETSC ERROR: #1 User provided function() line 0 in unknown file > [0]PETSC ERROR: Checking the memory for corruption. 
> [0]PETSC ERROR: PetscMallocValidate: error detected at PetscSignalHandlerDefault() line 170 in /home1/apps/intel18/impi18_0/petsc/3.14/src/sys/error/signal.c > [0]PETSC ERROR: Memory [id=0(73960000)] at address 0x2b5aed6ab050 is corrupted (probably write past end of array) > [0]PETSC ERROR: Memory originally allocated in MatFDColoringSetUpBlocked_AIJ_Private() line 125 in /home1/apps/intel18/impi18_0/petsc/3.14/src/mat/impls/aij/seq/fdaij.c > application called MPI_Abort(MPI_COMM_WORLD, 50176059) - process 0 > [unset]: readline failed > > > > > Sent from Mail for Windows > > From: Barry Smith > Sent: Thursday, October 20, 2022 10:27 PM > To: Sepideh Kavousi > Cc: petsc-users at mcs.anl.gov > Subject: [External] Re: [petsc-users] Periodic boundary condition > > > Some of the valgrind information does not appear to make sense. > > PetscMemcpy() does not call SNESSolve(), so I suspect there must be serious corruption somewhere to produce this impossible stack trace: > > ==236074== by 0x6FD160F: SNESSolve (snes.c:4569) > ==236074== by 0x711917E: PetscMemcpy (bdf.c:223) > > From > > ==236074== Conditional jump or move depends on uninitialised value(s) > ==236074== at 0x5F10977: MatFDColoringSetUpBlocked_AIJ_Private (fdaij.c:146) > ==236074== by 0x5F10977: MatFDColoringSetUp_SeqXAIJ.A (fdaij.c:284) > ==236074== by 0x585892A: MatFDColoringSetUp (fdmatrix.c:242) > ==236074== by 0x6FE5037: SNESComputeJacobianDefaultColor (snesj2.c:79) > ==236074== by 0x6FC8E4E: SNESComputeJacobian (snes.c:2717) > ==236074== by 0x70236E7: SNESSolve_NEWTONTR (tr.c:324) > ==236074== by 0x6FD160F: SNESSolve (snes.c:4569) > ==236074== by 0x711917E: PetscMemcpy (bdf.c:223) > ==236074== by 0x711917E: PetscCitationsRegister (petscsys.h:2689) > ==236074== by 0x711917E: TSStep_BDF.A (bdf.c:265) > ==236074== by 0x70C363A: TSStep (ts.c:3757) > ==236074== by 0x70C1999: TSSolve (ts.c:4154) > ==236074== by 0x402594: main (one.c:391) > > I suggest you run with -malloc_debug instead of valgrind and see if 
any errors are reported. If so, you can add the macro CHKMEMQ; inside your function evaluation, at the places where you write to memory, to see if anything is writing to the wrong location. For example, wherever you assign aF, such as > > aF[j][i].vx=(x3+x4+x5+x6+x7+x8+x9-x1-x2)*user->hx; > > this can help you determine the exact line number where you are writing to the wrong location and what might be the cause. > > > > > > On Oct 20, 2022, at 6:45 PM, Sepideh Kavousi > wrote: > > Hello, > I want to solve my 5 PDEs with the finite difference method, using a periodic BC in the x-direction and a non-periodic BC in the y-direction, but I run into an error (Segmentation Violation, probably memory access out of range). > I discretize my equations in the FormFunction routine. My PDE discretization at node (i,j) needs data at the (i+1,j), (i+2,j), (i-1,j), (i-2,j), (i,j+1), (i,j+2), (i,j-1), (i,j-2) points. > In my previous codes, where the x-direction had a non-periodic (no-flux) boundary condition, I: > i) implemented the no-flux BC for i=0 and i=Nx-1, > ii) set i+2 = Nx-1 when discretizing (Nx-2,j) and i-2 = 0 when discretizing (1,j), > iii) discretized my equation for i=1..Nx-2. > I am not sure how to implement the periodic BC. From the following discussions (https://lists.mcs.anl.gov/pipermail/petsc-users/2012-May/013476.html and https://lists.mcs.anl.gov/pipermail/petsc-users/2016-May/029273.html ), I gather I should not do step (i) for the x-boundaries and should just do step (iii) for i=0..Nx-1. Even if I focus on solving only 2 of the PDEs, which do need data at the (i+2,j), (i-2,j), (i,j+2), (i,j-2) points when discretizing node (i,j), I still run into the error: > Running with Valgrind (just 1 processor) gave the following output. I did not find any information in it that hints at the error source. > Can you please help me find the error? 
> Best, > Sepideh > > ==236074== Conditional jump or move depends on uninitialised value(s) > ==236074== at 0x4C29E39: malloc (vg_replace_malloc.c:309) > ==236074== by 0x1B79E59B: MPID_Init (mpid_init.c:1649) > ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) > ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) > ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) > ==236074== by 0x40219D: main (one.c:335) > ==236074== > ==236074== Conditional jump or move depends on uninitialised value(s) > ==236074== at 0x2183B805: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x218323C1: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x218341C7: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x400F9C2: _dl_init (in /usr/lib64/ld-2.17.so) > ==236074== by 0x401459D: dl_open_worker (in /usr/lib64/ld-2.17.so) > ==236074== by 0x400F7D3: _dl_catch_error (in /usr/lib64/ld-2.17.so) > ==236074== by 0x4013B8A: _dl_open (in /usr/lib64/ld-2.17.so) > ==236074== by 0x1AEA4FAA: dlopen_doit (in /usr/lib64/libdl-2.17.so) > ==236074== by 0x400F7D3: _dl_catch_error (in /usr/lib64/ld-2.17.so) > ==236074== by 0x1AEA55AC: _dlerror_run (in /usr/lib64/libdl-2.17.so) > ==236074== by 0x1AEA5040: dlopen@@GLIBC_2.2.5 (in /usr/lib64/libdl-2.17.so) > ==236074== by 0x1B8198DC: MPID_nem_ofi_init (ofi_init.c:158) > ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) > ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) > ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) > ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) > ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) > ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) > ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) > ==236074== by 0x40219D: main (one.c:335) > ==236074== > ==236074== Conditional jump or move depends on uninitialised value(s) > ==236074== at 0x2183B810: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x218323C1: ??? 
(in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x218341C7: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x400F9C2: _dl_init (in /usr/lib64/ld-2.17.so) > ==236074== by 0x401459D: dl_open_worker (in /usr/lib64/ld-2.17.so) > ==236074== by 0x400F7D3: _dl_catch_error (in /usr/lib64/ld-2.17.so) > ==236074== by 0x4013B8A: _dl_open (in /usr/lib64/ld-2.17.so) > ==236074== by 0x1AEA4FAA: dlopen_doit (in /usr/lib64/libdl-2.17.so) > ==236074== by 0x400F7D3: _dl_catch_error (in /usr/lib64/ld-2.17.so) > ==236074== by 0x1AEA55AC: _dlerror_run (in /usr/lib64/libdl-2.17.so) > ==236074== by 0x1AEA5040: dlopen@@GLIBC_2.2.5 (in /usr/lib64/libdl-2.17.so) > ==236074== by 0x1B8198DC: MPID_nem_ofi_init (ofi_init.c:158) > ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) > ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) > ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) > ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) > ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) > ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) > ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) > ==236074== by 0x40219D: main (one.c:335) > ==236074== > ==236074== Conditional jump or move depends on uninitialised value(s) > ==236074== at 0x218323C8: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x218341C7: ??? 
(in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x400F9C2: _dl_init (in /usr/lib64/ld-2.17.so) > ==236074== by 0x401459D: dl_open_worker (in /usr/lib64/ld-2.17.so) > ==236074== by 0x400F7D3: _dl_catch_error (in /usr/lib64/ld-2.17.so) > ==236074== by 0x4013B8A: _dl_open (in /usr/lib64/ld-2.17.so) > ==236074== by 0x1AEA4FAA: dlopen_doit (in /usr/lib64/libdl-2.17.so) > ==236074== by 0x400F7D3: _dl_catch_error (in /usr/lib64/ld-2.17.so) > ==236074== by 0x1AEA55AC: _dlerror_run (in /usr/lib64/libdl-2.17.so) > ==236074== by 0x1AEA5040: dlopen@@GLIBC_2.2.5 (in /usr/lib64/libdl-2.17.so) > ==236074== by 0x1B8198DC: MPID_nem_ofi_init (ofi_init.c:158) > ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) > ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) > ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) > ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) > ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) > ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) > ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) > ==236074== by 0x40219D: main (one.c:335) > ==236074== > ==236074== Use of uninitialised value of size 8 > ==236074== at 0x218323CF: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x218341C7: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x400F9C2: _dl_init (in /usr/lib64/ld-2.17.so) > ==236074== by 0x401459D: dl_open_worker (in /usr/lib64/ld-2.17.so) > ==236074== by 0x400F7D3: _dl_catch_error (in /usr/lib64/ld-2.17.so) > ==236074== by 0x4013B8A: _dl_open (in /usr/lib64/ld-2.17.so) > ==236074== by 0x1AEA4FAA: dlopen_doit (in /usr/lib64/libdl-2.17.so) > ==236074== by 0x400F7D3: _dl_catch_error (in /usr/lib64/ld-2.17.so) > ==236074== by 0x1AEA55AC: _dlerror_run (in /usr/lib64/libdl-2.17.so) > ==236074== by 0x1AEA5040: dlopen@@GLIBC_2.2.5 (in /usr/lib64/libdl-2.17.so) > ==236074== by 0x1B8198DC: MPID_nem_ofi_init (ofi_init.c:158) > ==236074== by 0x1B7B5C55: ??? 
(mpid_nem_init.c:231) > ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) > ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) > ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) > ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) > ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) > ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) > ==236074== by 0x40219D: main (one.c:335) > ==236074== > ==236074== Use of uninitialised value of size 8 > ==236074== at 0x218323E5: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x218341C7: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x400F9C2: _dl_init (in /usr/lib64/ld-2.17.so) > ==236074== by 0x401459D: dl_open_worker (in /usr/lib64/ld-2.17.so) > ==236074== by 0x400F7D3: _dl_catch_error (in /usr/lib64/ld-2.17.so) > ==236074== by 0x4013B8A: _dl_open (in /usr/lib64/ld-2.17.so) > ==236074== by 0x1AEA4FAA: dlopen_doit (in /usr/lib64/libdl-2.17.so) > ==236074== by 0x400F7D3: _dl_catch_error (in /usr/lib64/ld-2.17.so) > ==236074== by 0x1AEA55AC: _dlerror_run (in /usr/lib64/libdl-2.17.so) > ==236074== by 0x1AEA5040: dlopen@@GLIBC_2.2.5 (in /usr/lib64/libdl-2.17.so) > ==236074== by 0x1B8198DC: MPID_nem_ofi_init (ofi_init.c:158) > ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) > ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) > ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) > ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) > ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) > ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) > ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) > ==236074== by 0x40219D: main (one.c:335) > ==236074== > ==236074== Conditional jump or move depends on uninitialised value(s) > ==236074== at 0x2183B805: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x218323C1: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x218343EC: ??? 
(in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x2180E4F3: psm2_init (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x2112D7E6: psmx2_getinfo (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) > ==236074== by 0x210AC753: fi_getinfo@@FABRIC_1.2 (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) > ==236074== by 0x1B819AB7: MPID_nem_ofi_init (ofi_init.c:245) > ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) > ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) > ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) > ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) > ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) > ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) > ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) > ==236074== by 0x40219D: main (one.c:335) > ==236074== > ==236074== Conditional jump or move depends on uninitialised value(s) > ==236074== at 0x2183B810: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x218323C1: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x218343EC: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x2180E4F3: psm2_init (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x2112D7E6: psmx2_getinfo (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) > ==236074== by 0x210AC753: fi_getinfo@@FABRIC_1.2 (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) > ==236074== by 0x1B819AB7: MPID_nem_ofi_init (ofi_init.c:245) > ==236074== by 0x1B7B5C55: ??? 
(mpid_nem_init.c:231) > ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) > ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) > ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) > ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) > ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) > ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) > ==236074== by 0x40219D: main (one.c:335) > ==236074== > ==236074== Conditional jump or move depends on uninitialised value(s) > ==236074== at 0x218323C8: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x218343EC: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x2180E4F3: psm2_init (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x2112D7E6: psmx2_getinfo (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) > ==236074== by 0x210AC753: fi_getinfo@@FABRIC_1.2 (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) > ==236074== by 0x1B819AB7: MPID_nem_ofi_init (ofi_init.c:245) > ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) > ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) > ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) > ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) > ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) > ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) > ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) > ==236074== by 0x40219D: main (one.c:335) > ==236074== > ==236074== Use of uninitialised value of size 8 > ==236074== at 0x218323CF: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x218343EC: ??? 
(in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x2180E4F3: psm2_init (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x2112D7E6: psmx2_getinfo (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) > ==236074== by 0x210AC753: fi_getinfo@@FABRIC_1.2 (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) > ==236074== by 0x1B819AB7: MPID_nem_ofi_init (ofi_init.c:245) > ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) > ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) > ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) > ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) > ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) > ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) > ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) > ==236074== by 0x40219D: main (one.c:335) > ==236074== > ==236074== Use of uninitialised value of size 8 > ==236074== at 0x218323E5: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x218343EC: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x2180E4F3: psm2_init (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x2112D7E6: psmx2_getinfo (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) > ==236074== by 0x210AC753: fi_getinfo@@FABRIC_1.2 (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) > ==236074== by 0x1B819AB7: MPID_nem_ofi_init (ofi_init.c:245) > ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) > ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) > ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) > ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) > ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) > ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) > ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) > ==236074== by 0x40219D: main (one.c:335) > ==236074== > ==236074== Conditional jump or move depends on uninitialised value(s) > ==236074== at 0x2183B69A: ??? 
(in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x218371EC: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x21837077: hfi_get_port_lid (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x21836F9A: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x21834872: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x217F7F5D: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x217FDC29: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x217FCA91: psm2_ep_open (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x2112FB19: psmx2_trx_ctxt_alloc (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) > ==236074== by 0x21138089: psmx2_ep_open (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) > ==236074== by 0x1B81A6A1: fi_endpoint (fi_endpoint.h:155) > ==236074== by 0x1B81A6A1: MPID_nem_ofi_init (ofi_init.c:377) > ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) > ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) > ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) > ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) > ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) > ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) > ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) > ==236074== by 0x40219D: main (one.c:335) > ==236074== > ==236074== Conditional jump or move depends on uninitialised value(s) > ==236074== at 0x2183B7B8: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x218371EC: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x21837077: hfi_get_port_lid (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x21836F9A: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x21834872: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x217F7F5D: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x217FDC29: ??? 
(in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x217FCA91: psm2_ep_open (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x2112FB19: psmx2_trx_ctxt_alloc (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) > ==236074== by 0x21138089: psmx2_ep_open (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) > ==236074== by 0x1B81A6A1: fi_endpoint (fi_endpoint.h:155) > ==236074== by 0x1B81A6A1: MPID_nem_ofi_init (ofi_init.c:377) > ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) > ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) > ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) > ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) > ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) > ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) > ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) > ==236074== by 0x40219D: main (one.c:335) > ==236074== > ==236074== Conditional jump or move depends on uninitialised value(s) > ==236074== at 0x2183B69A: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x218371EC: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x21837077: hfi_get_port_lid (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x217F88C8: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x217FDC29: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x217FCA91: psm2_ep_open (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x2112FB19: psmx2_trx_ctxt_alloc (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) > ==236074== by 0x21138089: psmx2_ep_open (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) > ==236074== by 0x1B81A6A1: fi_endpoint (fi_endpoint.h:155) > ==236074== by 0x1B81A6A1: MPID_nem_ofi_init (ofi_init.c:377) > ==236074== by 0x1B7B5C55: ??? 
(mpid_nem_init.c:231) > ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) > ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) > ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) > ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) > ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) > ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) > ==236074== by 0x40219D: main (one.c:335) > ==236074== > ==236074== Conditional jump or move depends on uninitialised value(s) > ==236074== at 0x2183B7B8: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x218371EC: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x21837077: hfi_get_port_lid (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x217F88C8: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x217FDC29: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x217FCA91: psm2_ep_open (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x2112FB19: psmx2_trx_ctxt_alloc (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) > ==236074== by 0x21138089: psmx2_ep_open (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) > ==236074== by 0x1B81A6A1: fi_endpoint (fi_endpoint.h:155) > ==236074== by 0x1B81A6A1: MPID_nem_ofi_init (ofi_init.c:377) > ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) > ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) > ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) > ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) > ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) > ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) > ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) > ==236074== by 0x40219D: main (one.c:335) > ==236074== > ==236074== Conditional jump or move depends on uninitialised value(s) > ==236074== at 0x2183B69A: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x218371EC: ??? 
(in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x21837077: hfi_get_port_lid (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x217F916B: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x217FDC29: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x217FCA91: psm2_ep_open (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x2112FB19: psmx2_trx_ctxt_alloc (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) > ==236074== by 0x21138089: psmx2_ep_open (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) > ==236074== by 0x1B81A6A1: fi_endpoint (fi_endpoint.h:155) > ==236074== by 0x1B81A6A1: MPID_nem_ofi_init (ofi_init.c:377) > ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) > ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) > ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) > ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) > ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) > ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) > ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) > ==236074== by 0x40219D: main (one.c:335) > ==236074== > ==236074== Conditional jump or move depends on uninitialised value(s) > ==236074== at 0x2183B7B8: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x218371EC: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x21837077: hfi_get_port_lid (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x217F916B: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x217FDC29: ??? (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x217FCA91: psm2_ep_open (in /usr/lib64/libpsm2.so.2.2) > ==236074== by 0x2112FB19: psmx2_trx_ctxt_alloc (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) > ==236074== by 0x21138089: psmx2_ep_open (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) > ==236074== by 0x1B81A6A1: fi_endpoint (fi_endpoint.h:155) > ==236074== by 0x1B81A6A1: MPID_nem_ofi_init (ofi_init.c:377) > ==236074== by 0x1B7B5C55: ??? 
(mpid_nem_init.c:231) > ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) > ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) > ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) > ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) > ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) > ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) > ==236074== by 0x40219D: main (one.c:335) > ==236074== > ==236074== Conditional jump or move depends on uninitialised value(s) > ==236074== at 0x1B1DA260: __I_MPI___intel_sse2_strncmp (in /opt/intel/compilers_and_libraries_2018.2.199/linux/mpi/intel64/lib/libmpifort.so.12.0) > ==236074== by 0x1B8CFBA1: ??? (simple_pmi.c:2376) > ==236074== by 0x1B8CBDAD: PMIi_InitIfSingleton (simple_pmi.c:2883) > ==236074== by 0x1B8CBDAD: iPMI_KVS_Get (simple_pmi.c:751) > ==236074== by 0x1B7CCC1E: ??? (mpidi_pg.c:949) > ==236074== by 0x1B817EAA: MPID_nem_ofi_post_init (ofi_init.c:1736) > ==236074== by 0x1B7B3575: MPID_nem_init_post (mpid_nem_init.c:1421) > ==236074== by 0x1B5806E3: MPIDI_CH3_Init (ch3_init.c:146) > ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) > ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) > ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) > ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) > ==236074== by 0x40219D: main (one.c:335) > ==236074== > ==236074== Conditional jump or move depends on uninitialised value(s) > ==236074== at 0x1B1DA383: __I_MPI___intel_sse2_strncmp (in /opt/intel/compilers_and_libraries_2018.2.199/linux/mpi/intel64/lib/libmpifort.so.12.0) > ==236074== by 0x1B8CFBA1: ??? (simple_pmi.c:2376) > ==236074== by 0x1B8CBDAD: PMIi_InitIfSingleton (simple_pmi.c:2883) > ==236074== by 0x1B8CBDAD: iPMI_KVS_Get (simple_pmi.c:751) > ==236074== by 0x1B7CCC1E: ??? 
(mpidi_pg.c:949) > ==236074== by 0x1B817EAA: MPID_nem_ofi_post_init (ofi_init.c:1736) > ==236074== by 0x1B7B3575: MPID_nem_init_post (mpid_nem_init.c:1421) > ==236074== by 0x1B5806E3: MPIDI_CH3_Init (ch3_init.c:146) > ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) > ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) > ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) > ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) > ==236074== by 0x40219D: main (one.c:335) > ==236074== > ==236074== Conditional jump or move depends on uninitialised value(s) > ==236074== at 0x1E48032E: __intel_sse4_strcpy (in /opt/intel/compilers_and_libraries_2018.2.199/linux/compiler/lib/intel64_lin/libirc.so) > ==236074== by 0x51FD8BE: PetscStrcpy (str.c:354) > ==236074== by 0x51FD7A3: PetscStrallocpy (str.c:188) > ==236074== by 0x52A39CE: PetscEventRegLogRegister (eventlog.c:313) > ==236074== by 0x527D89A: PetscLogEventRegister (plog.c:693) > ==236074== by 0x6A56A20: PCBDDCInitializePackage (bddc.c:3115) > ==236074== by 0x6E1A515: PCInitializePackage (dlregisksp.c:59) > ==236074== by 0x6DB1A86: PCCreate (precon.c:382) > ==236074== by 0x6E05167: KSPGetPC (itfunc.c:1837) > ==236074== by 0x6E0FC5C: KSPSetDM (iterativ.c:1150) > ==236074== by 0x6FDD27B: SNESSetDM (snes.c:5402) > ==236074== by 0x70B85F7: TSGetSNES (ts.c:2914) > ==236074== by 0x70BE430: TSSetDM (ts.c:4949) > ==236074== by 0x402496: main (one.c:378) > ==236074== > ==236074== Conditional jump or move depends on uninitialised value(s) > ==236074== at 0x1E4782BA: __intel_ssse3_strncpy (in /opt/intel/compilers_and_libraries_2018.2.199/linux/compiler/lib/intel64_lin/libirc.so) > ==236074== by 0x51FFD24: PetscStrncpy (str.c:392) > ==236074== by 0x51FEB03: PetscStrreplace (str.c:1142) > ==236074== by 0x52C9958: PetscViewerFileSetName (filev.c:659) > ==236074== by 0x52B743B: PetscViewerVTKOpen (vtkv.c:279) > ==236074== by 0x70C76E6: TSMonitorSolutionVTK (ts.c:5580) > ==236074== by 0x40313C: FormFunction 
(one.c:120) > ==236074== by 0x7066531: TSComputeIFunction_DMDA (dmdats.c:82) > ==236074== by 0x70BA5EF: TSComputeIFunction (ts.c:857) > ==236074== by 0x711E2DC: SNESTSFormFunction_BDF (bdf.c:368) > ==236074== by 0x70C6E46: SNESTSFormFunction (ts.c:5014) > ==236074== by 0x6FDC8A6: SNESComputeFunction (snes.c:2383) > ==236074== by 0x7023556: SNESSolve_NEWTONTR (tr.c:297) > ==236074== by 0x6FD160F: SNESSolve (snes.c:4569) > ==236074== by 0x711917E: PetscMemcpy (bdf.c:223) > ==236074== by 0x711917E: PetscCitationsRegister (petscsys.h:2689) > ==236074== by 0x711917E: TSStep_BDF.A (bdf.c:265) > ==236074== by 0x70C363A: TSStep (ts.c:3757) > ==236074== by 0x70C1999: TSSolve (ts.c:4154) > ==236074== by 0x402594: main (one.c:391) > ==236074== > ==236074== Conditional jump or move depends on uninitialised value(s) > ==236074== at 0x1E4782BA: __intel_ssse3_strncpy (in /opt/intel/compilers_and_libraries_2018.2.199/linux/compiler/lib/intel64_lin/libirc.so) > ==236074== by 0x51FFD24: PetscStrncpy (str.c:392) > ==236074== by 0x51FEB03: PetscStrreplace (str.c:1142) > ==236074== by 0x5224E4B: PetscFOpen (mpiuopen.c:52) > ==236074== by 0x63A074B: DMDAVTKWriteAll_VTS.A (grvtk.c:72) > ==236074== by 0x639A589: DMDAVTKWriteAll (grvtk.c:545) > ==236074== by 0x52B66F3: PetscViewerFlush_VTK (vtkv.c:100) > ==236074== by 0x52CFAAE: PetscViewerFlush (flush.c:26) > ==236074== by 0x52CEA95: PetscViewerDestroy (view.c:113) > ==236074== by 0x70C7717: TSMonitorSolutionVTK (ts.c:5582) > ==236074== by 0x40313C: FormFunction (one.c:120) > ==236074== by 0x7066531: TSComputeIFunction_DMDA (dmdats.c:82) > ==236074== by 0x70BA5EF: TSComputeIFunction (ts.c:857) > ==236074== by 0x711E2DC: SNESTSFormFunction_BDF (bdf.c:368) > ==236074== by 0x70C6E46: SNESTSFormFunction (ts.c:5014) > ==236074== by 0x6FDC8A6: SNESComputeFunction (snes.c:2383) > ==236074== by 0x7023556: SNESSolve_NEWTONTR (tr.c:297) > ==236074== by 0x6FD160F: SNESSolve (snes.c:4569) > ==236074== by 0x711917E: PetscMemcpy (bdf.c:223) > 
==236074== by 0x711917E: PetscCitationsRegister (petscsys.h:2689) > ==236074== by 0x711917E: TSStep_BDF.A (bdf.c:265) > ==236074== by 0x70C363A: TSStep (ts.c:3757) > ==236074== > ==236074== Conditional jump or move depends on uninitialised value(s) > ==236074== at 0x5F10977: MatFDColoringSetUpBlocked_AIJ_Private (fdaij.c:146) > ==236074== by 0x5F10977: MatFDColoringSetUp_SeqXAIJ.A (fdaij.c:284) > ==236074== by 0x585892A: MatFDColoringSetUp (fdmatrix.c:242) > ==236074== by 0x6FE5037: SNESComputeJacobianDefaultColor (snesj2.c:79) > ==236074== by 0x6FC8E4E: SNESComputeJacobian (snes.c:2717) > ==236074== by 0x70236E7: SNESSolve_NEWTONTR (tr.c:324) > ==236074== by 0x6FD160F: SNESSolve (snes.c:4569) > ==236074== by 0x711917E: PetscMemcpy (bdf.c:223) > ==236074== by 0x711917E: PetscCitationsRegister (petscsys.h:2689) > ==236074== by 0x711917E: TSStep_BDF.A (bdf.c:265) > ==236074== by 0x70C363A: TSStep (ts.c:3757) > ==236074== by 0x70C1999: TSSolve (ts.c:4154) > ==236074== by 0x402594: main (one.c:391) > ==236074== > ==236074== Invalid write of size 4 > ==236074== at 0x5F10983: MatFDColoringSetUpBlocked_AIJ_Private (fdaij.c:150) > ==236074== by 0x5F10983: MatFDColoringSetUp_SeqXAIJ.A (fdaij.c:284) > ==236074== by 0x585892A: MatFDColoringSetUp (fdmatrix.c:242) > ==236074== by 0x6FE5037: SNESComputeJacobianDefaultColor (snesj2.c:79) > ==236074== by 0x6FC8E4E: SNESComputeJacobian (snes.c:2717) > ==236074== by 0x70236E7: SNESSolve_NEWTONTR (tr.c:324) > ==236074== by 0x6FD160F: SNESSolve (snes.c:4569) > ==236074== by 0x711917E: PetscMemcpy (bdf.c:223) > ==236074== by 0x711917E: PetscCitationsRegister (petscsys.h:2689) > ==236074== by 0x711917E: TSStep_BDF.A (bdf.c:265) > ==236074== by 0x70C363A: TSStep (ts.c:3757) > ==236074== by 0x70C1999: TSSolve (ts.c:4154) > ==236074== by 0x402594: main (one.c:391) > ==236074== Address 0x3a94fa80 is 0 bytes after a block of size 73,960,000 alloc'd > ==236074== at 0x4C2C480: memalign (vg_replace_malloc.c:909) > ==236074== by 0x522FFE2: 
PetscMallocAlign (mal.c:52) > ==236074== by 0x52305F9: PetscMallocA (mal.c:418) > ==236074== by 0x5F10778: MatFDColoringSetUpBlocked_AIJ_Private (fdaij.c:125) > ==236074== by 0x5F10778: MatFDColoringSetUp_SeqXAIJ.A (fdaij.c:284) > ==236074== by 0x585892A: MatFDColoringSetUp (fdmatrix.c:242) > ==236074== by 0x6FE5037: SNESComputeJacobianDefaultColor (snesj2.c:79) > ==236074== by 0x6FC8E4E: SNESComputeJacobian (snes.c:2717) > ==236074== by 0x70236E7: SNESSolve_NEWTONTR (tr.c:324) > ==236074== by 0x6FD160F: SNESSolve (snes.c:4569) > ==236074== by 0x711917E: PetscMemcpy (bdf.c:223) > ==236074== by 0x711917E: PetscCitationsRegister (petscsys.h:2689) > ==236074== by 0x711917E: TSStep_BDF.A (bdf.c:265) > ==236074== by 0x70C363A: TSStep (ts.c:3757) > ==236074== by 0x70C1999: TSSolve (ts.c:4154) > ==236074== by 0x402594: main (one.c:391) > ==236074== > ==236074== Invalid write of size 8 > ==236074== at 0x5F10991: MatFDColoringSetUpBlocked_AIJ_Private (fdaij.c:151) > ==236074== by 0x5F10991: MatFDColoringSetUp_SeqXAIJ.A (fdaij.c:284) > ==236074== by 0x585892A: MatFDColoringSetUp (fdmatrix.c:242) > ==236074== by 0x6FE5037: SNESComputeJacobianDefaultColor (snesj2.c:79) > ==236074== by 0x6FC8E4E: SNESComputeJacobian (snes.c:2717) > ==236074== by 0x70236E7: SNESSolve_NEWTONTR (tr.c:324) > ==236074== by 0x6FD160F: SNESSolve (snes.c:4569) > ==236074== by 0x711917E: PetscMemcpy (bdf.c:223) > ==236074== by 0x711917E: PetscCitationsRegister (petscsys.h:2689) > ==236074== by 0x711917E: TSStep_BDF.A (bdf.c:265) > ==236074== by 0x70C363A: TSStep (ts.c:3757) > ==236074== by 0x70C1999: TSSolve (ts.c:4154) > ==236074== by 0x402594: main (one.c:391) > ==236074== Address 0x3a94fa88 is 8 bytes after a block of size 73,960,000 alloc'd > ==236074== at 0x4C2C480: memalign (vg_replace_malloc.c:909) > ==236074== by 0x522FFE2: PetscMallocAlign (mal.c:52) > ==236074== by 0x52305F9: PetscMallocA (mal.c:418) > ==236074== by 0x5F10778: MatFDColoringSetUpBlocked_AIJ_Private (fdaij.c:125) > ==236074== by 
0x5F10778: MatFDColoringSetUp_SeqXAIJ.A (fdaij.c:284) > ==236074== by 0x585892A: MatFDColoringSetUp (fdmatrix.c:242) > ==236074== by 0x6FE5037: SNESComputeJacobianDefaultColor (snesj2.c:79) > ==236074== by 0x6FC8E4E: SNESComputeJacobian (snes.c:2717) > ==236074== by 0x70236E7: SNESSolve_NEWTONTR (tr.c:324) > ==236074== by 0x6FD160F: SNESSolve (snes.c:4569) > ==236074== by 0x711917E: PetscMemcpy (bdf.c:223) > ==236074== by 0x711917E: PetscCitationsRegister (petscsys.h:2689) > ==236074== by 0x711917E: TSStep_BDF.A (bdf.c:265) > ==236074== by 0x70C363A: TSStep (ts.c:3757) > ==236074== by 0x70C1999: TSSolve (ts.c:4154) > ==236074== by 0x402594: main (one.c:391) > ==236074== > > > Sent from Mail for Windows -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Thu Nov 3 11:26:15 2022 From: knepley at gmail.com (Matthew Knepley) Date: Thu, 3 Nov 2022 12:26:15 -0400 Subject: [petsc-users] On the usage of MatSetValuesBlocked In-Reply-To: References: Message-ID: On Thu, Nov 3, 2022 at 12:16 PM Edoardo alinovi wrote: > Hello Jed/Barry/Petsc friends > > I am trying to assemble a block matrix with 3x3 in 2D and 4x4 blocks in 3D > coming from the fully coupled NS equation. > > I am not sure I am understanding the example provided here: > https://petsc.org/main/docs/manualpages/Mat/MatSetValuesBlocked/ > > The description says that " *v -* a logically two-dimensional array of > values", while in the example is passed as a 1D array. > > Should I pass a 2D array like v(1:nComp,1:nComp) or an array > v(1:nComp*nComp) to MatSetValuesBlocked (I am using fortran that's why I > start form 1 in my arrays) ? > No, you pass a contiguous chunk of memory, but it is _logically_ 2D in that the size is (idxm * bs) x (idxn * bs) Are *idxm *and *idxn *the global index of each block right? I guess PETSc > will correctly assign each block row and column as it knows the matrix has > a given structure. > Yes. Thanks, Matt > Thank you as always! 
> > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From bsmith at petsc.dev Thu Nov 3 11:30:42 2022 From: bsmith at petsc.dev (Barry Smith) Date: Thu, 3 Nov 2022 12:30:42 -0400 Subject: [petsc-users] On the usage of MatSetValuesBlocked In-Reply-To: References: Message-ID: <166EF709-8936-422D-A9BE-58D0072E718B@petsc.dev> You can find the current F90 interface definitions we support for MatSetValuesBlocked() in /src/mat/f90-mod/petscmat.h90 > On Nov 3, 2022, at 12:16 PM, Edoardo alinovi wrote: > > Hello Jed/Barry/Petsc friends > > I am trying to assemble a block matrix with 3x3 blocks in 2D and 4x4 blocks in 3D coming from the fully coupled NS equation. > > I am not sure I am understanding the example provided here: > https://petsc.org/main/docs/manualpages/Mat/MatSetValuesBlocked/ > > The description says that " v - a logically two-dimensional array of values", while in the example it is passed as a 1D array. > > Should I pass a 2D array like v(1:nComp,1:nComp) or an array v(1:nComp*nComp) to MatSetValuesBlocked (I am using Fortran, that's why I start from 1 in my arrays)? We intend to support both approaches, whatever is most convenient for the rest of your code. (Perhaps more interface definitions are needed?) When we say "logically" two-dimensional, this is intended to mean that you can pass a one-dimensional array that contains all the nonzeros in the block in the order of the first column, followed by the second column, etc. (how Fortran stores two-dimensional arrays normally). But in most circumstances I would guess providing the two-dimensional Fortran array directly is more natural. > > idxm and idxn are the global indices of each block, right?
I guess PETSc will correctly assign each block row and column as it knows the matrix has a given structure. Yes > > Thank you as always! > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From edoardo.alinovi at gmail.com Thu Nov 3 11:30:52 2022 From: edoardo.alinovi at gmail.com (Edoardo alinovi) Date: Thu, 3 Nov 2022 17:30:52 +0100 Subject: [petsc-users] On the usage of MatSetValuesBlocked In-Reply-To: References: Message-ID: Hello Matt, I see, so from an operational point of view, I should pass a 1D array... How should I unroll my nComp x nComp matrix? By row or by column? e.g. say that my block is: A=[ 1 2 3 4 5 6 7 8 9 ] then v = [1 2 3 4 5 6 7 8 9] or [1 4 7 2 5 8 3 6 9]? -------------- next part -------------- An HTML attachment was scrubbed... URL: From edoardo.alinovi at gmail.com Thu Nov 3 11:33:50 2022 From: edoardo.alinovi at gmail.com (Edoardo alinovi) Date: Thu, 3 Nov 2022 17:33:50 +0100 Subject: [petsc-users] On the usage of MatSetValuesBlocked In-Reply-To: <166EF709-8936-422D-A9BE-58D0072E718B@petsc.dev> References: <166EF709-8936-422D-A9BE-58D0072E718B@petsc.dev> Message-ID: Hi Barry, Thanks for popping in. This is my code: blockValues = this%getDiagonalBlockValues(iElement=iElement) call MatSetValuesBlocked(this%A, 4-bdim, mesh%cellGlobalAddr(iElement)-1, 4-bdim, mesh%cellGlobalAddr(iElement)-1, blockValues, INSERT_VALUES, ierr) blockValues is a 3x3 or 4x4 matrix and I am passing it straight away. -------------- next part -------------- An HTML attachment was scrubbed...
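Barry's description above — first column, then second column, as Fortran stores 2D arrays — answers Edoardo's ordering question: for the 3x3 example, the column-wise unrolling is [1 4 7 2 5 8 3 6 9]. The sketch below (plain Python, purely to illustrate the index order; it is not PETSc code) shows the two candidate unrollings side by side. Note that for actual MatSetValues/MatSetValuesBlocked calls the orientation of the values array can also be switched with MatSetOption(A, MAT_ROW_ORIENTED, ...).

```python
# Illustrative sketch: the two possible 1D unrollings of a 3x3 block.
# Per Barry's reply, the Fortran-natural layout is column by column.
A = [[1, 2, 3],
     [4, 5, 6],
     [7, 8, 9]]

bs = 3  # block size
row_major = [A[i][j] for i in range(bs) for j in range(bs)]  # row by row
col_major = [A[i][j] for j in range(bs) for i in range(bs)]  # column by column

print(row_major)  # [1, 2, 3, 4, 5, 6, 7, 8, 9]
print(col_major)  # [1, 4, 7, 2, 5, 8, 3, 6, 9]  <- first column, then second, ...
```

The second list is what a Fortran 2D array already looks like in memory, which is why passing the 2D array directly also works.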
URL: From stephan.koehler at math.tu-freiberg.de Thu Nov 3 11:46:02 2022 From: stephan.koehler at math.tu-freiberg.de (=?UTF-8?Q?Stephan_K=c3=b6hler?=) Date: Thu, 3 Nov 2022 17:46:02 +0100 Subject: [petsc-users] Report Bug TaoALMM class In-Reply-To: References: <4eec06f9-d534-7a02-9abe-6d1415f663f0@math.tu-freiberg.de> Message-ID: <14f2cdd6-9cbe-20a6-0c7d-3006b2ee4dc1@math.tu-freiberg.de> Barry, so far, I have not experimented with trust-region methods, but I can imagine that this "design feature" causes no problem for trust-region methods, if the old point is saved and, after the trust-region check fails, the old point is copied to the actual point. But the implementation of the Armijo line search method does not work that way. Here, the actual point will always be overwritten. Only if the line search fails is the old point restored, but then the TaoSolve method ends with a line search failure. If you have an example of your own, you can switch to the Armijo line search with the option -tao_ls_type armijo. The thing is that it will cause no problems if the line search accepts the steps with step length one. It is also possible that, by luck, it will cause no problems if the "excessive" step brings a reduction of the objective. Otherwise, I attach my example, which is not minimal, but here you can see that it causes problems. You need to set the paths to the PETSc library in the makefile. You find the options for this problem in the run_test_tao_neohooke.sh script. The important part begins at line 292 in test_tao_neohooke.cpp Stephan On 02.11.22 19:04, Barry Smith wrote: > Stephan, > > I have located the troublesome line in TaoSetUp_ALMM() it has the line > > auglag->Px = tao->solution; > > and in alma.h it has > > Vec Px, LgradX, Ce, Ci, G; /* aliased vectors (do not destroy!) */ > > Now auglag->Px in some situations aliases auglag->P and in some cases auglag->Px serves to hold a portion of auglag->P.
So then in TaoALMMSubsolverObjective_Private() > the lines > > PetscCall(VecCopy(P, auglag->P)); > PetscCall((*auglag->sub_obj)(auglag->parent)); > > causes, just as you said, tao->solution to be overwritten by the P at which the objective function is being computed. In other words, the solution of the outer Tao is aliased with the solution of the inner Tao, by design. > > You are definitely correct, the use of TaoALMMSubsolverObjective_Private and TaoALMMSubsolverObjectiveAndGradient_Private in a line search would be problematic. > > I am not an expert at these methods or their implementations. Could you point to an actual use case within Tao that triggers the problem. Is there a set of command line options or code calls to Tao that fail due to this "design feature". Within the standard use of ALMM I do not see how the objective function would be used within a line search. The TaoSolve_ALMM() code is self-correcting in that if a trust region check fails it automatically rolls back the solution. > > Barry > > > > >> On Oct 28, 2022, at 4:27 AM, Stephan K?hler wrote: >> >> Dear PETSc/Tao team, >> >> it seems to be that there is a bug in the TaoALMM class: >> >> In the methods TaoALMMSubsolverObjective_Private and TaoALMMSubsolverObjectiveAndGradient_Private the vector where the function value for the augmented Lagrangian is evaluate >> is copied into the current solution, see, e.g.,https://petsc.org/release/src/tao/constrained/impls/almm/almm.c.html line 672 or 682. This causes subsolver routine to not converge if the line search for the subsolver rejects the step length 1. for some >> update. In detail: >> >> Suppose the current iterate is xk and the current update is dxk. The line search evaluates the augmented Lagrangian now at (xk + dxk). This causes that the value (xk + dxk) is copied in the current solution. If the point (xk + dxk) is rejected, the line search should >> try the point (xk + alpha * dxk), where alpha < 1. 
But due to the copying, what happens is that the point ((xk + dxk) + alpha * dxk) is evaluated, see, e.g., https://petsc.org/release/src/tao/linesearch/impls/armijo/armijo.c.html line 191. >> >> Best regards >> Stephan Köhler >> >> -- >> Stephan Köhler >> TU Bergakademie Freiberg >> Institut für numerische Mathematik und Optimierung >> >> Akademiestraße 6 >> 09599 Freiberg >> Gebäudeteil Mittelbau, Zimmer 2.07 >> >> Telefon: +49 (0)3731 39-3173 (Büro) >> >> -- Stephan Köhler TU Bergakademie Freiberg Institut für numerische Mathematik und Optimierung Akademiestraße 6 09599 Freiberg Gebäudeteil Mittelbau, Zimmer 2.07 Telefon: +49 (0)3731 39-3173 (Büro) -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: Minimal_example_without_vtk_2.tar.gz Type: application/gzip Size: 53999 bytes Desc: not available URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: OpenPGP_0xC9BF2C20DFE9F713.asc Type: application/pgp-keys Size: 758 bytes Desc: OpenPGP public key URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: OpenPGP_signature Type: application/pgp-signature Size: 236 bytes Desc: OpenPGP digital signature URL: From edoardo.alinovi at gmail.com Thu Nov 3 12:02:13 2022 From: edoardo.alinovi at gmail.com (Edoardo alinovi) Date: Thu, 3 Nov 2022 18:02:13 +0100 Subject: [petsc-users] On the usage of MatSetValuesBlocked In-Reply-To: References: <166EF709-8936-422D-A9BE-58D0072E718B@petsc.dev> Message-ID: Also, just to be 100% sure, are m and n equal to 3 in 2D and 4 in 3D if my blocks are 3x3 and 4x4 respectively? -------------- next part -------------- An HTML attachment was scrubbed...
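The line-search failure Stephan describes above can be reproduced without any PETSc code. The toy sketch below (plain Python, not the actual Tao implementation) contrasts a correct backtracking line search with an "aliased" one that copies each trial point into the current iterate: after a rejected full step, the aliased version evaluates ((xk + dxk) + alpha * dxk) instead of (xk + alpha * dxk), exactly as described.

```python
# Toy illustration (not Tao code): why overwriting the current iterate
# with each trial point breaks a backtracking line search.
def backtrack(xk, dxk, aliased, alpha=1.0, shrink=0.5, steps=3):
    """Return the sequence of points the line search actually evaluates."""
    evaluated = []
    base = xk
    for _ in range(steps):
        trial = base + alpha * dxk
        evaluated.append(trial)
        if aliased:
            base = trial  # the bug: the trial point is copied into the solution
        alpha *= shrink  # step rejected, shrink and try again
    return evaluated

xk, dxk = 1.0, -3.0  # an "excessive" step that the line search must shrink
correct = backtrack(xk, dxk, aliased=False)  # [-2.0, -0.5, 0.25]
buggy = backtrack(xk, dxk, aliased=True)     # [-2.0, -3.5, -4.25]
print(correct, buggy)
```

The correct search walks back toward xk = 1.0, while the aliased one drifts ever further away: its second evaluation is (xk + dxk) + 0.5*dxk = -3.5 rather than xk + 0.5*dxk = -0.5.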
URL: From edoardo.alinovi at gmail.com Thu Nov 3 12:16:25 2022 From: edoardo.alinovi at gmail.com (Edoardo alinovi) Date: Thu, 3 Nov 2022 18:16:25 +0100 Subject: [petsc-users] On the usage of MatSetValuesBlocked In-Reply-To: References: <166EF709-8936-422D-A9BE-58D0072E718B@petsc.dev> Message-ID: Ah, I was forgetting the most important thing... Are the size of idxm and idxn equal to one if I insert 1 block or should I specify all the rows and columns in the block? I am getting some memory issues with unallocated non zero values so I must have made some mistake here... :( Sorry for the ton of questions! -------------- next part -------------- An HTML attachment was scrubbed... URL: From bsmith at petsc.dev Thu Nov 3 13:13:11 2022 From: bsmith at petsc.dev (Barry Smith) Date: Thu, 3 Nov 2022 14:13:11 -0400 Subject: [petsc-users] On the usage of MatSetValuesBlocked In-Reply-To: References: <166EF709-8936-422D-A9BE-58D0072E718B@petsc.dev> Message-ID: <0D3C529A-3B69-4F2D-8CE9-ED87BB482F2F@petsc.dev> > On Nov 3, 2022, at 1:16 PM, Edoardo alinovi wrote: > > Ah, I was forgetting the most important thing... Are the size of idxm and idxn equal to one if I insert 1 block or should I specify all the rows and columns in the block? Yes, for a single block they are one. The block size is set with MatSetBlockSize() or in the preallocation routines or automatically if you use DMCreateMatrix(), the block size is not passed in during MatSetValuesBlocked() the matrix uses its internal value. Barry > > I am getting some memory issues with unallocated non zero values so I must have made some mistake here... :( > > Sorry for the ton of questions! From mike at mikewelland.com Thu Nov 3 13:33:00 2022 From: mike at mikewelland.com (Mike Welland) Date: Thu, 3 Nov 2022 14:33:00 -0400 Subject: [petsc-users] Advice on coupling linear physics with Allen-Cahn Message-ID: I am coupling a linear diffusion equation with Allen-Cahn in a time dependent problem. 
I'd like to take advantage of the linear block to speed things up. I'm trying two approaches: 1. Allen-Cahn with double well potential: phi^2*(1-phi^2), which makes it nonlinear. The best performance I have is with geometric multigrid on the full system. I tried using a schur complement with the linear diffusion block on A00 (both inside mg levels, and just mg on S) but didn't get good performance. 2. Allen-Cahn with the 'obstacle' potential: phi*(1-phi) which is linear but needs the vi solver to keep 0<=phi<=1. My whole system becomes linear (great!) but needs the nonlinear steps for the vi solver, and I'm not sure if it is reusing the factorization since the DOFs are being changed with the active step. Any suggestion / guidance would be appreciated! Thanks! -------------- next part -------------- An HTML attachment was scrubbed... URL: From edoardo.alinovi at gmail.com Thu Nov 3 13:45:56 2022 From: edoardo.alinovi at gmail.com (Edoardo alinovi) Date: Thu, 3 Nov 2022 19:45:56 +0100 Subject: [petsc-users] On the usage of MatSetValuesBlocked In-Reply-To: <0D3C529A-3B69-4F2D-8CE9-ED87BB482F2F@petsc.dev> References: <166EF709-8936-422D-A9BE-58D0072E718B@petsc.dev> <0D3C529A-3B69-4F2D-8CE9-ED87BB482F2F@petsc.dev> Message-ID: Yes I am doing: call MatMPIBAIJSetPreallocation(this%A, 4-bdim, flubioSolvers%d_nz, mesh%d_nnz, flubioSolvers%o_nz, mesh%o_nnz, ierr) with d_nnz the number of diagonal blocks and o_nnz the number of off-diagonal blocks. However I am getting this: [0]PETSC ERROR: --------------------- Error Message -------------------------------------------------------------- [0]PETSC ERROR: Argument out of range [0]PETSC ERROR: New nonzero at (3,3) caused a malloc Use MatSetOption(A, MAT_NEW_NONZERO_ALLOCATION_ERR, PETSC_FALSE) to turn off this check [0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting. 
[0]PETSC ERROR: Petsc Release Version 3.18.0, Sep 30, 2022 [0]PETSC ERROR: flubio_coupled on a gnu named alienware by edo Thu Nov 3 18:19:30 2022 [0]PETSC ERROR: Configure options PETSC_ARCH=gnu FOPTFLAGS=-O3 COPTFLAGS=-O3 CXXOPTFLAGS=-O3 -with-debugging=no -download-fblaslapack=1 -download-superlu_dist -download-mumps -download-hypre -download-metis -download-parmetis -download-scalapack -download-ml -download-slepc -download-hpddm -download-cmake -with-mpi-dir=/home/edo/software/openmpi-4.1.1/build/ [0]PETSC ERROR: #1 MatSetValuesBlocked_SeqBAIJ_Inlined() at /home/edo/software/petsc-3.18.0/src/mat/impls/baij/mpi/mpibaij.c:318 [0]PETSC ERROR: #2 MatSetValuesBlocked_MPIBAIJ() at /home/edo/software/petsc-3.18.0/src/mat/impls/baij/mpi/mpibaij.c:389 [1]PETSC ERROR: --------------------- Error Message -------------------------------------------------------------- [1]PETSC ERROR: Argument out of range [1]PETSC ERROR: New nonzero at (0,0) caused a malloc Use MatSetOption(A, MAT_NEW_NONZERO_ALLOCATION_ERR, PETSC_FALSE) to turn off this check [1]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting. 
[1]PETSC ERROR: Petsc Release Version 3.18.0, Sep 30, 2022 [1]PETSC ERROR: flubio_coupled on a gnu named alienware by edo Thu Nov 3 18:19:30 2022 [1]PETSC ERROR: Configure options PETSC_ARCH=gnu FOPTFLAGS=-O3 COPTFLAGS=-O3 CXXOPTFLAGS=-O3 -with-debugging=no -download-fblaslapack=1 -download-superlu_dist -download-mumps -download-hypre -download-metis -download-parmetis -download-scalapack -download-ml -download-slepc -download-hpddm -download-cmake -with-mpi-dir=/home/edo/software/openmpi-4.1.1/build/ [1]PETSC ERROR: #1 MatSetValuesBlocked_SeqBAIJ_Inlined() at /home/edo/software/petsc-3.18.0/src/mat/impls/baij/mpi/mpibaij.c:318 [1]PETSC ERROR: #2 MatSetValuesBlocked_MPIBAIJ() at /home/edo/software/petsc-3.18.0/src/mat/impls/baij/mpi/mpibaij.c:419 [1]PETSC ERROR: #3 MatSetValuesBlocked() at /home/edo/software/petsc-3.18.0/src/mat/interface/matrix.c:1978 [1]PETSC ERROR: #4 MatSetValuesBlocked_SeqBAIJ_Inlined() at /home/edo/software/petsc-3.18.0/src/mat/impls/baij/mpi/mpibaij.c:318 [1]PETSC ERROR: #5 MatSetValuesBlocked_MPIBAIJ() at /home/edo/software/petsc-3.18.0/src/mat/impls/baij/mpi/mpibaij.c:419 [1]PETSC ERROR: #6 MatSetValuesBlocked() at /home/edo/software/petsc-3.18.0/src/mat/interface/matrix.c:1978 [0]PETSC ERROR: #3 MatSetValuesBlocked() at /home/edo/software/petsc-3.18.0/src/mat/interface/matrix.c:1978 [0]PETSC ERROR: #4 MatSetValuesBlocked_SeqBAIJ_Inlined() at /home/edo/software/petsc-3.18.0/src/mat/impls/baij/mpi/mpibaij.c:318 [0]PETSC ERROR: #5 MatSetValuesBlocked_MPIBAIJ() at /home/edo/software/petsc-3.18.0/src/mat/impls/baij/mpi/mpibaij.c:419 [0]PETSC ERROR: #6 MatAssemblyEnd_MPIBAIJ() at /home/edo/software/petsc-3.18.0/src/mat/impls/baij/mpi/mpibaij.c:906 [0]PETSC ERROR: #7 MatAssemblyEnd() at /home/edo/software/petsc-3.18.0/src/mat/interface/matrix.c:5696 -------------- next part -------------- An HTML attachment was scrubbed... 
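The "new nonzero caused a malloc" messages above mean the preallocation undercounted: for MATMPIBAIJ, d_nnz and o_nnz count nonzero blocks per block-row (not scalar entries), split by whether the block column falls in the rank's diagonal portion. A hypothetical sketch of that counting for a toy 1D partition (the function name and data are invented for illustration, this is not a PETSc API):

```python
# Sketch: per-block-row counts for MatMPIBAIJSetPreallocation-style input.
# diag_lo .. diag_hi-1 are the block columns of this rank's diagonal portion.
def baij_prealloc_counts(block_cols_per_row, diag_lo, diag_hi):
    d_nnz, o_nnz = [], []
    for cols in block_cols_per_row:
        d = sum(1 for c in cols if diag_lo <= c < diag_hi)  # diagonal-portion blocks
        d_nnz.append(d)
        o_nnz.append(len(cols) - d)  # everything else is off-diagonal (off-rank)
    return d_nnz, o_nnz

# A rank owning block rows/cols 0..2 of a 1D mesh with nearest-neighbour coupling;
# block column 3 lives on the next rank.
stencil = [[0, 1], [0, 1, 2], [1, 2, 3]]
d_nnz, o_nnz = baij_prealloc_counts(stencil, 0, 3)
print(d_nnz, o_nnz)  # [2, 3, 2] [0, 0, 1]
```

If any inserted block falls outside these per-row counts, the assembly triggers exactly the malloc error shown above.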
URL: From edoardo.alinovi at gmail.com Thu Nov 3 14:11:01 2022 From: edoardo.alinovi at gmail.com (Edoardo alinovi) Date: Thu, 3 Nov 2022 20:11:01 +0100 Subject: [petsc-users] On the usage of MatSetValuesBlocked In-Reply-To: References: <166EF709-8936-422D-A9BE-58D0072E718B@petsc.dev> <0D3C529A-3B69-4F2D-8CE9-ED87BB482F2F@petsc.dev> Message-ID: Just to give a bit more context, I am doing it like this: call MatCreate(PETSC_COMM_WORLD, this%A, ierr) call MatSetSizes(this%A, lm, lm, M, M, ierr) ! lm is the local size of the matrix, M the global one call MatSetType(this%A, myType, ierr) ! myType is MATMPIBAIJ call MatSetBlockSize(this%A, 4-bdim, ierr) ! 4-bdim is equal to 3 in this case call MatMPIBAIJSetPreallocation(this%A, 4-bdim, flubioSolvers%d_nz, mesh%d_nnz, flubioSolvers%o_nz, mesh%o_nnz, ierr) ! d_nnz and o_nnz are the numbers of diagonal and off-diagonal non-zero blocks call MatSetUp(this%A, ierr) ... some non relevant code ..... call MatSetValuesBlocked(this%A, 4-bdim, mesh%cellGlobalAddr(iElement)-1, 4-bdim, mesh%cellGlobalAddr(iElement)-1, blockValues, INSERT_VALUES, ierr) mesh%cellGlobalAddr(iElement)-1 is an integer (not an array) equal to the block element number that the block matrix blockValues (3x3) belongs to. Any evident errors? -------------- next part -------------- An HTML attachment was scrubbed... URL: From edoardo.alinovi at gmail.com Thu Nov 3 15:06:55 2022 From: edoardo.alinovi at gmail.com (Edoardo alinovi) Date: Thu, 3 Nov 2022 21:06:55 +0100 Subject: [petsc-users] On the usage of MatSetValuesBlocked In-Reply-To: References: <166EF709-8936-422D-A9BE-58D0072E718B@petsc.dev> <0D3C529A-3B69-4F2D-8CE9-ED87BB482F2F@petsc.dev> Message-ID: Well, I am definitely using MatSetValuesBlocked in a bad way. Instead of seeing 11 in the first 3 rows and 3 columns, I see all zeros and random numbers in rows 9, 10 and 11... [image: image.png] -------------- next part -------------- An HTML attachment was scrubbed...
URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: image.png Type: image/png Size: 36718 bytes Desc: not available URL: From bsmith at petsc.dev Thu Nov 3 15:08:49 2022 From: bsmith at petsc.dev (Barry Smith) Date: Thu, 3 Nov 2022 16:08:49 -0400 Subject: [petsc-users] On the usage of MatSetValuesBlocked In-Reply-To: References: <166EF709-8936-422D-A9BE-58D0072E718B@petsc.dev> <0D3C529A-3B69-4F2D-8CE9-ED87BB482F2F@petsc.dev> Message-ID: The error indicates not enough nonzero blocks are preallocated for. Try something really simple, preallocate for one block and put in one block then call MatAssemblyBegin/End(), MatView(), if that works then work up to your full problem. Barry > On Nov 3, 2022, at 3:11 PM, Edoardo alinovi wrote: > > Just to give a bit more of context I am doing like this: > > call MatCreate(PETSC_COMM_WORLD, this%A, ierr) > > call MatSetSizes(this%A, lm, lm, M, M, ierr) ! lm is the local size of the matrix , M the global one > > call MatSetType(this%A, myType, ierr) ! myType is MATMPIBAIJ > > call MatSetBlockSize(this%A, 4-bdim, ierr) ! 4-bdim is equal to 3 in this case > > call MatMPIBAIJSetPreallocation(this%A, 4-bdim, flubioSolvers%d_nz, mesh%d_nnz, flubioSolvers%o_nz, mesh%o_nnz, ierr) ! d_nnz and o_nnz is the number of diagonal and off diagonal non zero blocks > > call MatSetUp(this%A, ierr) > > ... some non relevant code ..... > > call MatSetValuesBlocked(this%A, 4-bdim, mesh%cellGlobalAddr(iElement)-1, 4-bdim, mesh%cellGlobalAddr(iElement)-1, blockValues, INSERT_VALUES, ierr) mesh%cellGlobalAddr(iElement)-1 is an integer equal (not an array) to the block element number the block matrix blockValues (3x3) belongs to. > > Any evident errors? > > > > -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From bsmith at petsc.dev Thu Nov 3 15:21:03 2022 From: bsmith at petsc.dev (Barry Smith) Date: Thu, 3 Nov 2022 16:21:03 -0400 Subject: [petsc-users] Advice on coupling linear physics with Allen-Cahn In-Reply-To: References: Message-ID: <3CFEF5FC-7B55-4899-B688-6C1389CE112D@petsc.dev> > On Nov 3, 2022, at 2:33 PM, Mike Welland wrote: > > I am coupling a linear diffusion equation with Allen-Cahn in a time dependent problem. I'd like to take advantage of the linear block to speed things up. I'm trying two approaches: > > 1. Allen-Cahn with double well potential: phi^2*(1-phi^2), which makes it nonlinear. The best performance I have is with geometric multigrid on the full system. I tried using a schur complement with the linear diffusion block on A00 (both inside mg levels, and just mg on S) but didn't get good performance. With geometric multigrid there is not much setup cost (so reusing it is not important). > > 2. Allen-Cahn with the 'obstacle' potential: phi*(1-phi) which is linear but needs the vi solver to keep 0<=phi<=1. My whole system becomes linear (great!) but needs the nonlinear steps for the vi solver, and I'm not sure if it is reusing the factorization since the DOFs are being changed with the active step. You are correct. Since the problem (size) changes for each solve not much of anything can be directly reused in the solver. But with geometric multigrid there is not much setup cost (so reusing it is not important). > > Any suggestion / guidance would be appreciated! > Thanks! From bsmith at petsc.dev Thu Nov 3 16:15:40 2022 From: bsmith at petsc.dev (Barry Smith) Date: Thu, 3 Nov 2022 17:15:40 -0400 Subject: [petsc-users] Report Bug TaoALMM class In-Reply-To: <14f2cdd6-9cbe-20a6-0c7d-3006b2ee4dc1@math.tu-freiberg.de> References: <4eec06f9-d534-7a02-9abe-6d1415f663f0@math.tu-freiberg.de> <14f2cdd6-9cbe-20a6-0c7d-3006b2ee4dc1@math.tu-freiberg.de> Message-ID: <5E53FE56-5C68-4F06-8A48-54ACBDC800C7@petsc.dev> Thanks for your response and the code. 
I understand the potential problem and how your code demonstrates a bug if the TaoALMMSubsolverObjective() is used in the manner you use in the example, where you directly call TaoComputeObjective() multiple times like a line search code might. What I don't have or understand is how to reproduce the problem in a real code that uses Tao. That is, where the Tao Armijo line search code has a problem when it is used (somehow) in a Tao solver with ALMM. You suggest "If you have an example for your own, you can switch the Armijo line search by the option -tao_ls_type armijo. The thing is that it will cause no problems if the line search accepts the steps with step length one." I don't see how to do this: if I use -tao_type almm I cannot use -tao_ls_type armijo; that is, the option -tao_ls_type doesn't seem to me to be usable in the context of almm (since almm internally does its own trust-region approach for globalization directly). If we remove the if (1) code from your example, are there some Tao options I can use to get the bug to appear inside the Tao solve? I'll try to explain again: I agree that the fact that the Tao solution is aliased (within the ALMM solver) is a problem with repeated calls to TaoComputeObjective(), but I cannot see how these repeated calls could ever happen in the use of TaoSolve() with the ALMM solver. That is, when is this "design problem" a true problem as opposed to just a potential problem that can be demonstrated in artificial code? The reason I need to understand the non-artificial situation in which it breaks things is to come up with an appropriate correction for the current code. Barry > On Nov 3, 2022, at 12:46 PM, Stephan Köhler wrote: > > Barry, > > so far, I have not experimented with trust-region methods, but I can imagine that this "design feature" causes no problem for trust-region methods, if the old point is saved and after the trust-region check fails the old point is copied to the actual point.
But the implementation of the Armijo line search method does not work that way. Here, the actual point will always be overwritten. Only if the line search fails, then the old point is restored, but then the TaoSolve method ends with a line search failure. > > If you have an example for your own, you can switch the Armijo line search by the option -tao_ls_type armijo. The thing is that it will cause no problems if the line search accepts the steps with step length one. > It is also possible that, by luck, it will cause no problems, if the "excessive" step brings a reduction of the objective > > Otherwise, I attach my example, which is not minimal, but here you can see that it causes problems. You need to set the paths to the PETSc library in the makefile. You find the options for this problem in the run_test_tao_neohooke.sh script. > The import part begins at line 292 in test_tao_neohooke.cpp > > Stephan > > On 02.11.22 19:04, Barry Smith wrote: >> Stephan, >> >> I have located the troublesome line in TaoSetUp_ALMM() it has the line >> >> auglag->Px = tao->solution; >> >> and in alma.h it has >> >> Vec Px, LgradX, Ce, Ci, G; /* aliased vectors (do not destroy!) */ >> >> Now auglag->P in some situations alias auglag->P and in some cases auglag->Px serves to hold a portion of auglag->P. So then in TaoALMMSubsolverObjective_Private() >> the lines >> >> PetscCall(VecCopy(P, auglag->P)); >> PetscCall((*auglag->sub_obj)(auglag->parent)); >> >> causes, just as you said, tao->solution to be overwritten by the P at which the objective function is being computed. In other words, the solution of the outer Tao is aliased with the solution of the inner Tao, by design. >> >> You are definitely correct, the use of TaoALMMSubsolverObjective_Private and TaoALMMSubsolverObjectiveAndGradient_Private in a line search would be problematic. >> >> I am not an expert at these methods or their implementations. Could you point to an actual use case within Tao that triggers the problem. 
Is there a set of command line options or code calls to Tao that fail due to this "design feature". Within the standard use of ALMM I do not see how the objective function would be used within a line search. The TaoSolve_ALMM() code is self-correcting in that if a trust region check fails it automatically rolls back the solution. >> >> Barry >> >> >> >> >>> On Oct 28, 2022, at 4:27 AM, Stephan K?hler wrote: >>> >>> Dear PETSc/Tao team, >>> >>> it seems to be that there is a bug in the TaoALMM class: >>> >>> In the methods TaoALMMSubsolverObjective_Private and TaoALMMSubsolverObjectiveAndGradient_Private the vector where the function value for the augmented Lagrangian is evaluate >>> is copied into the current solution, see, e.g., https://petsc.org/release/src/tao/constrained/impls/almm/almm.c.html line 672 or 682. This causes subsolver routine to not converge if the line search for the subsolver rejects the step length 1. for some >>> update. In detail: >>> >>> Suppose the current iterate is xk and the current update is dxk. The line search evaluates the augmented Lagrangian now at (xk + dxk). This causes that the value (xk + dxk) is copied in the current solution. If the point (xk + dxk) is rejected, the line search should >>> try the point (xk + alpha * dxk), where alpha < 1. But due to the copying, what happens is that the point ((xk + dxk) + alpha * dxk) is evaluated, see, e.g., https://petsc.org/release/src/tao/linesearch/impls/armijo/armijo.c.html line 191. 
>>> >>> Best regards >>> Stephan Köhler >>> >>> -- >>> Stephan Köhler >>> TU Bergakademie Freiberg >>> Institut für numerische Mathematik und Optimierung >>> >>> Akademiestraße 6 >>> 09599 Freiberg >>> Gebäudeteil Mittelbau, Zimmer 2.07 >>> >>> Telefon: +49 (0)3731 39-3173 (Büro) >>> >>> > > -- > Stephan Köhler > TU Bergakademie Freiberg > Institut für numerische Mathematik und Optimierung > > Akademiestraße 6 > 09599 Freiberg > Gebäudeteil Mittelbau, Zimmer 2.07 > > Telefon: +49 (0)3731 39-3173 (Büro) > -------------- next part -------------- An HTML attachment was scrubbed... URL: From edoardo.alinovi at gmail.com Thu Nov 3 16:56:36 2022 From: edoardo.alinovi at gmail.com (Edoardo alinovi) Date: Thu, 3 Nov 2022 22:56:36 +0100 Subject: [petsc-users] On the usage of MatSetValuesBlocked In-Reply-To: References: <166EF709-8936-422D-A9BE-58D0072E718B@petsc.dev> <0D3C529A-3B69-4F2D-8CE9-ED87BB482F2F@petsc.dev> Message-ID: Barry, Can you please provide me with an example of how to use MatSetValuesBlocked? To play it easy, let's say that I want to insert a 3x3 block matrix b into the matrix A, rows 0-2, columns 0-2. Up to what I've understood (very little apparently XD), I would do like this: b(3,3) = 11.0 call MatSetValuesBlocked(A, 3, 0, 3, 0, b, INSERT_VALUES, ierr). This does not work at all, I get this result that does not make any sense: [image: image.png] It places 6 values instead of 9 and they are in odd locations (0 1 2 9 10 11). Also I noted that I am getting different results if in place of the zero in red I use a Fortran integer. Super thanks for the help -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed...
Name: image.png Type: image/png Size: 14274 bytes Desc: not available URL: From bsmith at petsc.dev Thu Nov 3 18:34:56 2022 From: bsmith at petsc.dev (Barry Smith) Date: Thu, 3 Nov 2022 19:34:56 -0400 Subject: [petsc-users] On the usage of MatSetValuesBlocked In-Reply-To: References: <166EF709-8936-422D-A9BE-58D0072E718B@petsc.dev> <0D3C529A-3B69-4F2D-8CE9-ED87BB482F2F@petsc.dev> Message-ID: You should pass 1 and 1, not 3, because you are setting one block. Regarding all the integer values passed in to PETSc routines: to be completely portable you need to declare them as PetscInt and pass the variables. But if you do not use --with-64-bit-indices in ./configure and you do not use some Fortran compiler option to promote all integers to 64-bit integers, just passing in directly the 0 etc. is fine. The reason you get different results with a 0 or a Fortran integer is that, because of the 3, PETSc tries to read three indices starting from the 0 value you pass in, so it is reading memory it is not supposed to be reading. Once you change the 3 to 1 it will likely be fine. Barry > On Nov 3, 2022, at 5:56 PM, Edoardo alinovi wrote: > > Barry, > > Can you please provide me with an example on how to use MatSetValuesBlocked? > > To play it easy, let's say that I want to insert a 3x3 block matrix b into the matrix A, rows 0-2, columns 0-2. Up to what I've understood (very few apparently XD ), I would do like this: > > b(3,3) = 11.0 > call MatSetValuesBlocked(A, 3, 0, 3, 0, b, INSERT_VALUES, ierr). > > This does not work at all, I get this result that does not make any sense ? > > > > It places 6 values instead of 9 and they are in odd locations (0 1 2 9 10 11). > > Also I noted that I am getting different results if in place of the zero in red I use a fortran integer ? > > Super thanks for the help -------------- next part -------------- An HTML attachment was scrubbed...
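Barry's correction above — m = n = 1 for a single block — also explains the earlier screenshot: with block size 3, passing 3 makes PETSc read three block indices where only one was supplied, and block index 3 maps to global rows 9-11, which is exactly where the stray values appeared. A small sketch of the block-to-global index mapping (illustration only, not PETSc code):

```python
# With MatSetValuesBlocked, idxm/idxn are BLOCK indices: block b of size bs
# covers global (scalar) rows b*bs .. b*bs + bs - 1.
def block_to_global_rows(b, bs):
    return list(range(b * bs, (b + 1) * bs))

bs = 3
print(block_to_global_rows(0, bs))  # [0, 1, 2]
print(block_to_global_rows(3, bs))  # [9, 10, 11]  <- where the stray values showed up
```

So a garbage second block index of 3, read past the single 0 that was passed, would scatter values into rows 9-11 — consistent with Edoardo's observation of entries at (0 1 2 9 10 11).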
URL: From kavousi at mines.edu Thu Nov 3 18:39:24 2022 From: kavousi at mines.edu (Sepideh Kavousi) Date: Thu, 3 Nov 2022 23:39:24 +0000 Subject: [petsc-users] [External] Periodic boundary condition In-Reply-To: <8213DE9B-FD24-42F5-A293-5F124764937D@petsc.dev> References: <8213DE9B-FD24-42F5-A293-5F124764937D@petsc.dev> Message-ID: Please find the attached. Best, Sepideh Sent from Mail for Windows From: Barry Smith Sent: Thursday, November 3, 2022 12:18 PM To: Sepideh Kavousi Cc: petsc-users at mcs.anl.gov Subject: Re: [External] [petsc-users] Periodic boundary condition Can you send the code that just does what you indicate below in the FormFunction() and crashes? Then I can run it directly and track down the issue. Barry On Nov 3, 2022, at 10:56 AM, Sepideh Kavousi wrote: Barry, Even in the case where I am not solving any PDE equations in the FormFunction (by setting: aF[j][i].vx=aY[j][i].vx; aF[j][i].vy=aY[j][i].vy; aF[j][i].pp=aY[j][i].pp; aF[j][i].U=aY[j][i].U; aF[j][i].p=aY[j][i].p;) I run into a segmentation error. Let me just follow what you suggested in the following link: https://lists.mcs.anl.gov/pipermail/petsc-users/2012-May/013476.html src/ksp/ksp/tutorials/ex45.c runs perfectly, but when I change the BC along the x-direction from DM_BOUNDARY_NONE to DM_BOUNDARY_PERIODIC and delete (i==0 || i==mx-1) from if (i==0 || j==0 || k==0 || i==mx-1 || j==my-1 || k==mz-1), I run into the following error. I am not sure how else I should implement periodic BC in a problem. [0]PETSC ERROR: --------------------- Error Message -------------------------------------------------------------- [0]PETSC ERROR: Object is in wrong state [0]PETSC ERROR: Matrix is missing diagonal entry 5 [0]PETSC ERROR: See https://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting. 
[1]PETSC ERROR: --------------------- Error Message -------------------------------------------------------------- [1]PETSC ERROR: Object is in wrong state [1]PETSC ERROR: Matrix is missing diagonal entry 5 [1]PETSC ERROR: See https://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting. [1]PETSC ERROR: [2]PETSC ERROR: --------------------- Error Message -------------------------------------------------------------- [2]PETSC ERROR: Object is in wrong state [2]PETSC ERROR: Matrix is missing diagonal entry 5 [2]PETSC ERROR: See https://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting. [2]PETSC ERROR: Petsc Release Version 3.14.2, Dec 03, 2020 [2]PETSC ERROR: /scratch/07065/tg863649/convection/periodic_test/one.out on a skylake named c402-092.stampede2.tacc.utexas.edu by tg863649 Thu Nov 3 09:53:00 2022 [2]PETSC ERROR: [3]PETSC ERROR: --------------------- Error Message -------------------------------------------------------------- [3]PETSC ERROR: Object is in wrong state [3]PETSC ERROR: Matrix is missing diagonal entry 5 [3]PETSC ERROR: See https://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting. 
[3]PETSC ERROR: Petsc Release Version 3.14.2, Dec 03, 2020 [3]PETSC ERROR: /scratch/07065/tg863649/convection/periodic_test/one.out on a skylake named c402-092.stampede2.tacc.utexas.edu by tg863649 Thu Nov 3 09:53:00 2022 [3]PETSC ERROR: [0]PETSC ERROR: Petsc Release Version 3.14.2, Dec 03, 2020 [0]PETSC ERROR: /scratch/07065/tg863649/convection/periodic_test/one.out on a skylake named c402-092.stampede2.tacc.utexas.edu by tg863649 Thu Nov 3 09:53:00 2022 [0]PETSC ERROR: Configure options --with-x=0 -with-pic --with-make-np=12 --download-petsc4py=1 --with-python-exec=/opt/apps/intel18/python2/2.7.16/bin/python2 --with-packages-build-dir=/tmp/petsc-3.14/skylake --with-mpi=1 --with-mpi-dir=/opt/intel/compilers_and_libraries_2018.2.199/linux/mpi/intel64 --with-scalar-type=real --with-shared-libraries=1 --with-precision=double --with-chaco=1 --download-chaco --with-hypre=1 --download-hypre --with-parmetis=1 --download-parmetis --with-metis=1 --download-metis --with-plapack=1 --download-plapack --with-spai=1 --download-spai --with-sundials=1 --download-sundials --with-elemental=1 --download-elemental --with-cxx-dialect=C++11 --with-parmetis=1 --download-parmetis --with-metis=1 --download-metis --with-fftw=1 --with-fftw-dir=/opt/apps/intel18/impi18_0/fftw3/3.3.8 --with-hdf5=1 --with-hdf5-dir=/opt/apps/intel18/impi18_0/phdf5/1.10.4/x86_64 --download-hpddm --download-slepc --with-mumps=1 --download-mumps --with-parmetis=1 --download-parmetis --with-metis=1 --download-metis --with-scalapack=1 --download-scalapack --with-blacs=1 --download-blacs --with-spooles=1 --download-spooles --with-suitesparse=1 --download-suitesparse --with-superlu_dist=1 --download-superlu_dist --with-superlu=1 --download-superlu --with-parmetis=1 --download-parmetis --with-metis=1 --download-metis --with-zoltan=1 --download-zoltan=1 --download-ptscotch=1 --with-debugging=no --LIBS= --with-blaslapack-dir=/opt/intel/compilers_and_libraries_2018.2.199/linux/mkl COPTFLAGS="-xCORE-AVX2 
-axMIC-AVX512,CORE-AVX512 -O2 -g" FOPTFLAGS="-xCORE-AVX2 -axMIC-AVX512,CORE-AVX512 -O2 -g" CXXOPTFLAGS="-xCORE-AVX2 -axMIC-AVX512,CORE-AVX512 -O2 -g" Petsc Release Version 3.14.2, Dec 03, 2020 [1]PETSC ERROR: /scratch/07065/tg863649/convection/periodic_test/one.out on a skylake named c402-092.stampede2.tacc.utexas.edu by tg863649 Thu Nov 3 09:53:00 2022 [1]PETSC ERROR: [0]PETSC ERROR: #1 MatILUFactorSymbolic_SeqAIJ() line 1686 in /home1/apps/intel18/impi18_0/petsc/3.14/src/mat/impls/aij/seq/aijfact.c Configure options --with-x=0 -with-pic --with-make-np=12 --download-petsc4py=1 --with-python-exec=/opt/apps/intel18/python2/2.7.16/bin/python2 --with-packages-build-dir=/tmp/petsc-3.14/skylake --with-mpi=1 --with-mpi-dir=/opt/intel/compilers_and_libraries_2018.2.199/linux/mpi/intel64 --with-scalar-type=real --with-shared-libraries=1 --with-precision=double --with-chaco=1 --download-chaco --with-hypre=1 --download-hypre --with-parmetis=1 --download-parmetis --with-metis=1 --download-metis --with-plapack=1 --download-plapack --with-spai=1 --download-spai --with-sundials=1 --download-sundials --with-elemental=1 --download-elemental --with-cxx-dialect=C++11 --with-parmetis=1 --download-parmetis --with-metis=1 --download-metis --with-fftw=1 --with-fftw-dir=/opt/apps/intel18/impi18_0/fftw3/3.3.8 --with-hdf5=1 --with-hdf5-dir=/opt/apps/intel18/impi18_0/phdf5/1.10.4/x86_64 --download-hpddm --download-slepc --with-mumps=1 --download-mumps --with-parmetis=1 --download-parmetis --with-metis=1 --download-metis --with-scalapack=1 --download-scalapack --with-blacs=1 --download-blacs --with-spooles=1 --download-spooles --with-suitesparse=1 --download-suitesparse --with-superlu_dist=1 --download-superlu_dist --with-superlu=1 --download-superlu --with-parmetis=1 --download-parmetis --with-metis=1 --download-metis --with-zoltan=1 --download-zoltan=1 --download-ptscotch=1 --with-debugging=no --LIBS= --with-blaslapack-dir=/opt/intel/compilers_and_libraries_2018.2.199/linux/mkl 
COPTFLAGS="-xCORE-AVX2 -axMIC-AVX512,CORE-AVX512 -O2 -g" FOPTFLAGS="-xCORE-AVX2 -axMIC-AVX512,CORE-AVX512 -O2 -g" CXXOPTFLAGS="-xCORE-AVX2 -axMIC-AVX512,CORE-AVX512 -O2 -g" [2]PETSC ERROR: #1 MatILUFactorSymbolic_SeqAIJ() line 1686 in /home1/apps/intel18/impi18_0/petsc/3.14/src/mat/impls/aij/seq/aijfact.c Configure options --with-x=0 -with-pic --with-make-np=12 --download-petsc4py=1 --with-python-exec=/opt/apps/intel18/python2/2.7.16/bin/python2 --with-packages-build-dir=/tmp/petsc-3.14/skylake --with-mpi=1 --with-mpi-dir=/opt/intel/compilers_and_libraries_2018.2.199/linux/mpi/intel64 --with-scalar-type=real --with-shared-libraries=1 --with-precision=double --with-chaco=1 --download-chaco --with-hypre=1 --download-hypre --with-parmetis=1 --download-parmetis --with-metis=1 --download-metis --with-plapack=1 --download-plapack --with-spai=1 --download-spai --with-sundials=1 --download-sundials --with-elemental=1 --download-elemental --with-cxx-dialect=C++11 --with-parmetis=1 --download-parmetis --with-metis=1 --download-metis --with-fftw=1 --with-fftw-dir=/opt/apps/intel18/impi18_0/fftw3/3.3.8 --with-hdf5=1 --with-hdf5-dir=/opt/apps/intel18/impi18_0/phdf5/1.10.4/x86_64 --download-hpddm --download-slepc --with-mumps=1 --download-mumps --with-parmetis=1 --download-parmetis --with-metis=1 --download-metis --with-scalapack=1 --download-scalapack --with-blacs=1 --download-blacs --with-spooles=1 --download-spooles --with-suitesparse=1 --download-suitesparse --with-superlu_dist=1 --download-superlu_dist --with-superlu=1 --download-superlu --with-parmetis=1 --download-parmetis --with-metis=1 --download-metis --with-zoltan=1 --download-zoltan=1 --download-ptscotch=1 --with-debugging=no --LIBS= --with-blaslapack-dir=/opt/intel/compilers_and_libraries_2018.2.199/linux/mkl COPTFLAGS="-xCORE-AVX2 -axMIC-AVX512,CORE-AVX512 -O2 -g" FOPTFLAGS="-xCORE-AVX2 -axMIC-AVX512,CORE-AVX512 -O2 -g" CXXOPTFLAGS="-xCORE-AVX2 -axMIC-AVX512,CORE-AVX512 -O2 -g" [3]PETSC ERROR: #1 
MatILUFactorSymbolic_SeqAIJ() line 1686 in /home1/apps/intel18/impi18_0/petsc/3.14/src/mat/impls/aij/seq/aijfact.c Configure options --with-x=0 -with-pic --with-make-np=12 --download-petsc4py=1 --with-python-exec=/opt/apps/intel18/python2/2.7.16/bin/python2 --with-packages-build-dir=/tmp/petsc-3.14/skylake --with-mpi=1 --with-mpi-dir=/opt/intel/compilers_and_libraries_2018.2.199/linux/mpi/intel64 --with-scalar-type=real --with-shared-libraries=1 --with-precision=double --with-chaco=1 --download-chaco --with-hypre=1 --download-hypre --with-parmetis=1 --download-parmetis --with-metis=1 --download-metis --with-plapack=1 --download-plapack --with-spai=1 --download-spai --with-sundials=1 --download-sundials --with-elemental=1 --download-elemental --with-cxx-dialect=C++11 --with-parmetis=1 --download-parmetis --with-metis=1 --download-metis --with-fftw=1 --with-fftw-dir=/opt/apps/intel18/impi18_0/fftw3/3.3.8 --with-hdf5=1 --with-hdf5-dir=/opt/apps/intel18/impi18_0/phdf5/1.10.4/x86_64 --download-hpddm --download-slepc --with-mumps=1 --download-mumps --with-parmetis=1 --download-parmetis --with-metis=1 --download-metis --with-scalapack=1 --download-scalapack --with-blacs=1 --download-blacs --with-spooles=1 --download-spooles --with-suitesparse=1 --download-suitesparse --with-superlu_dist=1 --download-superlu_dist --with-superlu=1 --download-superlu --with-parmetis=1 --download-parmetis --with-metis=1 --download-metis --with-zoltan=1 --download-zoltan=1 --download-ptscotch=1 --with-debugging=no --LIBS= --with-blaslapack-dir=/opt/intel/compilers_and_libraries_2018.2.199/linux/mkl COPTFLAGS="-xCORE-AVX2 -axMIC-AVX512,CORE-AVX512 -O2 -g" FOPTFLAGS="-xCORE-AVX2 -axMIC-AVX512,CORE-AVX512 -O2 -g" CXXOPTFLAGS="-xCORE-AVX2 -axMIC-AVX512,CORE-AVX512 -O2 -g" [1]PETSC ERROR: #1 MatILUFactorSymbolic_SeqAIJ() line 1686 in /home1/apps/intel18/impi18_0/petsc/3.14/src/mat/impls/aij/seq/aijfact.c [0]PETSC ERROR: #2 MatILUFactorSymbolic() line 6710 in 
/home1/apps/intel18/impi18_0/petsc/3.14/src/mat/interface/matrix.c [1]PETSC ERROR: #2 MatILUFactorSymbolic() line 6710 in /home1/apps/intel18/impi18_0/petsc/3.14/src/mat/interface/matrix.c [1]PETSC ERROR: #3 PCSetUp_ILU() line 141 in /home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/pc/impls/factor/ilu/ilu.c [2]PETSC ERROR: #2 MatILUFactorSymbolic() line 6710 in /home1/apps/intel18/impi18_0/petsc/3.14/src/mat/interface/matrix.c [2]PETSC ERROR: #3 PCSetUp_ILU() line 141 in /home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/pc/impls/factor/ilu/ilu.c [2]PETSC ERROR: [3]PETSC ERROR: #2 MatILUFactorSymbolic() line 6710 in /home1/apps/intel18/impi18_0/petsc/3.14/src/mat/interface/matrix.c [3]PETSC ERROR: #3 PCSetUp_ILU() line 141 in /home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/pc/impls/factor/ilu/ilu.c [3]PETSC ERROR: #4 PCSetUp() line 1009 in /home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/pc/interface/precon.c [3]PETSC ERROR: [0]PETSC ERROR: #3 PCSetUp_ILU() line 141 in /home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/pc/impls/factor/ilu/ilu.c [0]PETSC ERROR: #4 PCSetUp() line 1009 in /home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/pc/interface/precon.c [0]PETSC ERROR: #5 KSPSetUp() line 406 in /home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/ksp/interface/itfunc.c [1]PETSC ERROR: #4 PCSetUp() line 1009 in /home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/pc/interface/precon.c [1]PETSC ERROR: #5 KSPSetUp() line 406 in /home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/ksp/interface/itfunc.c #4 PCSetUp() line 1009 in /home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/pc/interface/precon.c [2]PETSC ERROR: #5 KSPSetUp() line 406 in /home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/ksp/interface/itfunc.c #5 KSPSetUp() line 406 in /home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/ksp/interface/itfunc.c [0]PETSC ERROR: [2]PETSC ERROR: #6 KSPSolve_Private() line 658 in /home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/ksp/interface/itfunc.c [2]PETSC ERROR: [3]PETSC ERROR: #6 KSPSolve_Private() 
line 658 in /home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/ksp/interface/itfunc.c [3]PETSC ERROR: #7 KSPSolve() line 889 in /home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/ksp/interface/itfunc.c #6 KSPSolve_Private() line 658 in /home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/ksp/interface/itfunc.c [0]PETSC ERROR: #7 KSPSolve() line 889 in /home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/ksp/interface/itfunc.c [1]PETSC ERROR: #6 KSPSolve_Private() line 658 in /home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/ksp/interface/itfunc.c [1]PETSC ERROR: #7 KSPSolve() line 889 in /home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/ksp/interface/itfunc.c #7 KSPSolve() line 889 in /home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/ksp/interface/itfunc.c [1]PETSC ERROR: [2]PETSC ERROR: #8 DMDAGetFaceInterpolation() line 493 in /home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/pc/impls/wb/wb.c [2]PETSC ERROR: [0]PETSC ERROR: #8 DMDAGetFaceInterpolation() line 493 in /home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/pc/impls/wb/wb.c #8 DMDAGetFaceInterpolation() line 493 in /home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/pc/impls/wb/wb.c [1]PETSC ERROR: #9 PCSetUp_Exotic() line 667 in /home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/pc/impls/wb/wb.c [1]PETSC ERROR: #10 PCSetUp() line 1009 in /home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/pc/interface/precon.c #9 PCSetUp_Exotic() line 667 in /home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/pc/impls/wb/wb.c [2]PETSC ERROR: #10 PCSetUp() line 1009 in /home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/pc/interface/precon.c [2]PETSC ERROR: #11 KSPSetUp() line 406 in /home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/ksp/interface/itfunc.c [0]PETSC ERROR: #9 PCSetUp_Exotic() line 667 in /home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/pc/impls/wb/wb.c [0]PETSC ERROR: #10 PCSetUp() line 1009 in /home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/pc/interface/precon.c [0]PETSC ERROR: [1]PETSC ERROR: #11 KSPSetUp() line 406 in 
/home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/ksp/interface/itfunc.c [1]PETSC ERROR: #12 KSPSolve_Private() line 658 in /home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/ksp/interface/itfunc.c [1]PETSC ERROR: #13 KSPSolve() line 889 in /home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/ksp/interface/itfunc.c [2]PETSC ERROR: #12 KSPSolve_Private() line 658 in /home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/ksp/interface/itfunc.c [2]PETSC ERROR: #13 KSPSolve() line 889 in /home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/ksp/interface/itfunc.c [3]PETSC ERROR: #8 DMDAGetFaceInterpolation() line 493 in /home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/pc/impls/wb/wb.c [3]PETSC ERROR: #9 PCSetUp_Exotic() line 667 in /home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/pc/impls/wb/wb.c [3]PETSC ERROR: #10 PCSetUp() line 1009 in /home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/pc/interface/precon.c [3]PETSC ERROR: #11 KSPSetUp() line 406 in /home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/ksp/interface/itfunc.c [3]PETSC ERROR: #12 KSPSolve_Private() line 658 in /home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/ksp/interface/itfunc.c #11 KSPSetUp() line 406 in /home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/ksp/interface/itfunc.c [0]PETSC ERROR: #12 KSPSolve_Private() line 658 in /home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/ksp/interface/itfunc.c [0]PETSC ERROR: #13 KSPSolve() line 889 in /home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/ksp/interface/itfunc.c [3]PETSC ERROR: #13 KSPSolve() line 889 in /home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/ksp/interface/itfunc.c [1]PETSC ERROR: [0]PETSC ERROR: #14 main() line 51 in /scratch/07065/tg863649/convection/periodic_test/one.c #14 main() line 51 in /scratch/07065/tg863649/convection/periodic_test/one.c [1]PETSC ERROR: PETSc Option Table entries: [1]PETSC ERROR: -ksp_monitor_short [2]PETSC ERROR: #14 main() line 51 in /scratch/07065/tg863649/convection/periodic_test/one.c [2]PETSC ERROR: PETSc Option Table entries: [2]PETSC ERROR: 
-ksp_monitor_short [2]PETSC ERROR: [0]PETSC ERROR: PETSc Option Table entries: [0]PETSC ERROR: -ksp_monitor_short [0]PETSC ERROR: -ksp_type fgmres [0]PETSC ERROR: -mg_levels_ksp_max_it 1 [0]PETSC ERROR: -mg_levels_ksp_type gmres [0]PETSC ERROR: -mg_levels_pc_type bjacobi [1]PETSC ERROR: -ksp_type fgmres [1]PETSC ERROR: -mg_levels_ksp_max_it 1 [1]PETSC ERROR: -mg_levels_ksp_type gmres [1]PETSC ERROR: -mg_levels_pc_type bjacobi [1]PETSC ERROR: -pc_type exotic [1]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-maint at mcs.anl.gov---------- -ksp_type fgmres [2]PETSC ERROR: -mg_levels_ksp_max_it 1 [2]PETSC ERROR: -mg_levels_ksp_type gmres [2]PETSC ERROR: -mg_levels_pc_type bjacobi [2]PETSC ERROR: -pc_type exotic [2]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-maint at mcs.anl.gov---------- [3]PETSC ERROR: #14 main() line 51 in /scratch/07065/tg863649/convection/periodic_test/one.c [3]PETSC ERROR: PETSc Option Table entries: [3]PETSC ERROR: -ksp_monitor_short [3]PETSC ERROR: -ksp_type fgmres [3]PETSC ERROR: -mg_levels_ksp_max_it 1 [3]PETSC ERROR: -mg_levels_ksp_type gmres [3]PETSC ERROR: -mg_levels_pc_type bjacobi [3]PETSC ERROR: -pc_type exotic [3]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-maint at mcs.anl.gov---------- [0]PETSC ERROR: -pc_type exotic [0]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-maint at mcs.anl.gov---------- application called MPI_Abort(MPI_COMM_SELF, 51073) - process 0 application called MPI_Abort(MPI_COMM_SELF, 51073) - process 0 application called MPI_Abort(MPI_COMM_SELF, 51073) - process 0 application called MPI_Abort(MPI_COMM_SELF, 51073) - process 0 TACC: MPI job exited with code: 129 TACC: Shutdown complete. Exiting. 
Sent from Mail for Windows From: Barry Smith Sent: Tuesday, October 25, 2022 7:24 PM To: Sepideh Kavousi Cc: petsc-users at mcs.anl.gov Subject: Re: [External] [petsc-users] Periodic boundary condition Sorry I was not clear: at this point you need to type c for continue, and then when it crashes in the debugger type bt. Barry On Oct 25, 2022, at 6:37 PM, Sepideh Kavousi > wrote: Hello Barry, When I ran with , the error is about the PetscInitialize line (Line 333). When I type bt multiple times, it just continues referring to this line. #0 0x00002b701cfed9fd in nanosleep () from /lib64/libc.so.6 #1 0x00002b701cfed894 in sleep () from /lib64/libc.so.6 #2 0x00002b70035fb4ae in PetscSleep (s=1) at /home1/apps/intel18/impi18_0/petsc/3.14/src/sys/utils/psleep.c:46 #3 0x00002b700364b8bb in PetscAttachDebugger () at /home1/apps/intel18/impi18_0/petsc/3.14/src/sys/error/adebug.c:405 #4 0x00002b700366cfcd in PetscOptionsCheckInitial_Private (help=0x7ffec24c7940 "\t") at /home1/apps/intel18/impi18_0/petsc/3.14/src/sys/objects/init.c:608 #5 0x00002b7003674cd6 in PetscInitialize (argc=0x7ffec24c7940, args=0x7ffec24c7940, file=0x0, help=0xffffffffffffffff
) at /home1/apps/intel18/impi18_0/petsc/3.14/src/sys/objects/pinit.c:1025 #6 0x00000000004021ce in main (argc=24, argv=0x7ffec24d14e8) at /scratch/07065/tg863649/convection/test-a9-3-options_small_MAC_pressure_old/one.c:333 Best, Sepideh Sent from Mail for Windows From: Barry Smith Sent: Friday, October 21, 2022 10:54 AM To: Sepideh Kavousi Cc: petsc-users at mcs.anl.gov Subject: Re: [External] [petsc-users] Periodic boundary condition The problem with the output below is that it does not give a clear indication of where the crash occurred: #1 User provided function() line 0 in unknown file Run with the exact same options but also -start_in_debugger noxterm It should then crash in the debugger and you can type bt to see the backtrace of where it crashed; send that output. Barry Background: MatFDColoringSetUpBlocked_AIJ_Private() allocates the space that is used when evaluating the function multiple times to get the Jacobian entries. If the FormFunction writes into incorrect locations, then it will corrupt this memory that was allocated in MatFDColoringSetUpBlocked_AIJ_Private(). It does not necessarily mean that there is anything wrong in MatFDColoringSetUpBlocked_AIJ_Private(). On Oct 21, 2022, at 12:32 AM, Sepideh Kavousi > wrote: Barry, I ran the code with -malloc_debug and added CHKMEMQ for all the lines inside FormFunction. Following are the details of the error. Best, Sepideh [0]PETSC ERROR: ------------------------------------------------------------------------ [0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range [0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger [0]PETSC ERROR: or see https://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind [0]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors [0]PETSC ERROR: configure using --with-debugging=yes, recompile, link, and run [0]PETSC ERROR: to get more information on the crash. 
[0]PETSC ERROR: PetscMallocValidate: error detected at PetscError() line 401 in /home1/apps/intel18/impi18_0/petsc/3.14/src/sys/error/err.c [0]PETSC ERROR: Memory [id=0(73960000)] at address 0x2b5aed6ab050 is corrupted (probably write past end of array) [0]PETSC ERROR: Memory originally allocated in MatFDColoringSetUpBlocked_AIJ_Private() line 125 in /home1/apps/intel18/impi18_0/petsc/3.14/src/mat/impls/aij/seq/fdaij.c [0]PETSC ERROR: --------------------- Error Message -------------------------------------------------------------- [0]PETSC ERROR: Signal received [0]PETSC ERROR: See https://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting. [0]PETSC ERROR: Petsc Release Version 3.14.2, Dec 03, 2020 [0]PETSC ERROR: ./one.out on a skylake named c415-063.stampede2.tacc.utexas.edu by tg863649 Thu Oct 20 23:30:05 2022 [0]PETSC ERROR: Configure options --with-x=0 -with-pic --with-make-np=12 --download-petsc4py=1 --with-python-exec=/opt/apps/intel18/python2/2.7.16/bin/python2 --with-packages-build-dir=/tmp/petsc-3.14/skylake --with-mpi=1 --with-mpi-dir=/opt/intel/compilers_and_libraries_2018.2.199/linux/mpi/intel64 --with-scalar-type=real --with-shared-libraries=1 --with-precision=double --with-chaco=1 --download-chaco --with-hypre=1 --download-hypre --with-parmetis=1 --download-parmetis --with-metis=1 --download-metis --with-plapack=1 --download-plapack --with-spai=1 --download-spai --with-sundials=1 --download-sundials --with-elemental=1 --download-elemental --with-cxx-dialect=C++11 --with-parmetis=1 --download-parmetis --with-metis=1 --download-metis --with-fftw=1 --with-fftw-dir=/opt/apps/intel18/impi18_0/fftw3/3.3.8 --with-hdf5=1 --with-hdf5-dir=/opt/apps/intel18/impi18_0/phdf5/1.10.4/x86_64 --download-hpddm --download-slepc --with-mumps=1 --download-mumps --with-parmetis=1 --download-parmetis --with-metis=1 --download-metis --with-scalapack=1 --download-scalapack --with-blacs=1 --download-blacs --with-spooles=1 --download-spooles --with-suitesparse=1 
--download-suitesparse --with-superlu_dist=1 --download-superlu_dist --with-superlu=1 --download-superlu --with-parmetis=1 --download-parmetis --with-metis=1 --download-metis --with-zoltan=1 --download-zoltan=1 --download-ptscotch=1 --with-debugging=no --LIBS= --with-blaslapack-dir=/opt/intel/compilers_and_libraries_2018.2.199/linux/mkl COPTFLAGS="-xCORE-AVX2 -axMIC-AVX512,CORE-AVX512 -O2 -g" FOPTFLAGS="-xCORE-AVX2 -axMIC-AVX512,CORE-AVX512 -O2 -g" CXXOPTFLAGS="-xCORE-AVX2 -axMIC-AVX512,CORE-AVX512 -O2 -g" [0]PETSC ERROR: #1 User provided function() line 0 in unknown file [0]PETSC ERROR: Checking the memory for corruption. [0]PETSC ERROR: PetscMallocValidate: error detected at PetscSignalHandlerDefault() line 170 in /home1/apps/intel18/impi18_0/petsc/3.14/src/sys/error/signal.c [0]PETSC ERROR: Memory [id=0(73960000)] at address 0x2b5aed6ab050 is corrupted (probably write past end of array) [0]PETSC ERROR: Memory originally allocated in MatFDColoringSetUpBlocked_AIJ_Private() line 125 in /home1/apps/intel18/impi18_0/petsc/3.14/src/mat/impls/aij/seq/fdaij.c application called MPI_Abort(MPI_COMM_WORLD, 50176059) - process 0 [unset]: readline failed Sent from Mail for Windows From: Barry Smith Sent: Thursday, October 20, 2022 10:27 PM To: Sepideh Kavousi Cc: petsc-users at mcs.anl.gov Subject: [External] Re: [petsc-users] Periodic boundary condition Some of the valgrind information does not appear to make sense. PetscMemcpy() is not calling SNESSolve(), so I suspect there must be some serious corruption of something to produce this impossible stack trace ==236074== by 0x6FD160F: SNESSolve (snes.c:4569) ==236074== by 0x711917E: PetscMemcpy (bdf.c:223) From ==236074== Conditional jump or move depends on uninitialised value(s) ==236074== at 0x5F10977: MatFDColoringSetUpBlocked_AIJ_Private (fdaij.c:146) ==236074== by 0x5F10977: MatFDColoringSetUp_SeqXAIJ.A (fdaij.c:284) ==236074== by 0x585892A: MatFDColoringSetUp (fdmatrix.c:242) ==236074== by 0x6FE5037: 
SNESComputeJacobianDefaultColor (snesj2.c:79) ==236074== by 0x6FC8E4E: SNESComputeJacobian (snes.c:2717) ==236074== by 0x70236E7: SNESSolve_NEWTONTR (tr.c:324) ==236074== by 0x6FD160F: SNESSolve (snes.c:4569) ==236074== by 0x711917E: PetscMemcpy (bdf.c:223) ==236074== by 0x711917E: PetscCitationsRegister (petscsys.h:2689) ==236074== by 0x711917E: TSStep_BDF.A (bdf.c:265) ==236074== by 0x70C363A: TSStep (ts.c:3757) ==236074== by 0x70C1999: TSSolve (ts.c:4154) ==236074== by 0x402594: main (one.c:391) I suggest you run with -malloc_debug instead of valgrind and see if any errors are reported. If so, you can add the macro CHKMEMQ; inside your function evaluation where you write to memory, to see if anything is writing to the wrong location. For example, wherever you assign aF, such as aF[j][i].vx=(x3+x4+x5+x6+x7+x8+x9-x1-x2)*user->hx; this can help you determine the exact line number where you are writing to the wrong location and determine what might be the cause. On Oct 20, 2022, at 6:45 PM, Sepideh Kavousi > wrote: Hello, I want to solve my 5 PDEs based on the finite difference method using periodic BC in the x-direction and non-periodic in the y-direction, but I run into an error (Segmentation Violation, probably memory access out of range). For this, I discretize my equations in the FormFunction function. My PDE discretization at node (i,j) needs data at the (i+1,j), (i+2,j), (i-1,j), (i-2,j), (i,j+1), (i,j+2), (i,j-1), (i,j-2) points. In my previous codes, where the x-direction had a non-periodic (no-flux) boundary condition, I: i) implemented the no-flux BC for i=0 and i=Nx-1, ii) set i+2 = Nx-1 when discretizing (Nx-2,j) and i-2 = 0 when discretizing (1,j), and iii) discretized my equation for i=1..Nx-2. I am not sure how I should do the periodic BC. 
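Conceptually, the periodic direction removes the boundary special cases: every node i = 0 .. Nx-1 uses the same stencil, with out-of-range neighbor indices wrapped modulo Nx (in a DMDA with DM_BOUNDARY_PERIODIC, the ghost regions supply these wrapped values automatically, so the local array can be indexed at i-2 .. i+2 directly). A minimal sketch of the wrapping, with a hypothetical `wrap` helper and Nx = 8 chosen purely for illustration:

```python
# Periodic index wrapping in the x-direction for a stencil that
# reaches i-2 .. i+2 (illustrative helper, not the thread's code).
# There is no boundary special case: every i = 0 .. Nx-1 uses the
# same formula with wrapped neighbor indices.

Nx = 8  # number of grid points in the periodic direction

def wrap(i, Nx):
    """Map any stencil offset back into 0 .. Nx-1."""
    return i % Nx  # Python's % already yields a non-negative result

# Neighbors needed to discretize node i, shown for the left edge:
i = 0
neighbors = [wrap(i + di, Nx) for di in (-2, -1, 0, 1, 2)]
print(neighbors)  # [6, 7, 0, 1, 2]
```

So with a periodic x-direction there is no need to clamp i+2 to Nx-1 or implement a separate no-flux formula at i=0 and i=Nx-1; the indices simply wrap around.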
From the following discussions (https://lists.mcs.anl.gov/pipermail/petsc-users/2012-May/013476.html andhttps://lists.mcs.anl.gov/pipermail/petsc-users/2016-May/029273.html), I guess I should not do step (i) (stated above) for the x-boundaries and just do step (iii) for i=0..Nx-1. If I just focus on solving 2 of the PDEs which does need data on (i+2,j), (i-2,j), (i,j+2), (i,j-2) points for discretizing equation in (i,j) node, I still run into error: Running with Valgrind (just 1 processor) gave the following file. I did not find any information which gives me hint on the error source. Can you please help me to find the error? Best, Sepideh ==236074== Conditional jump or move depends on uninitialised value(s) ==236074== at 0x4C29E39: malloc (vg_replace_malloc.c:309) ==236074== by 0x1B79E59B: MPID_Init (mpid_init.c:1649) ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) ==236074== by 0x40219D: main (one.c:335) ==236074== ==236074== Conditional jump or move depends on uninitialised value(s) ==236074== at 0x2183B805: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218323C1: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218341C7: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x400F9C2: _dl_init (in /usr/lib64/ld-2.17.so) ==236074== by 0x401459D: dl_open_worker (in /usr/lib64/ld-2.17.so) ==236074== by 0x400F7D3: _dl_catch_error (in /usr/lib64/ld-2.17.so) ==236074== by 0x4013B8A: _dl_open (in /usr/lib64/ld-2.17.so) ==236074== by 0x1AEA4FAA: dlopen_doit (in /usr/lib64/libdl-2.17.so) ==236074== by 0x400F7D3: _dl_catch_error (in /usr/lib64/ld-2.17.so) ==236074== by 0x1AEA55AC: _dlerror_run (in /usr/lib64/libdl-2.17.so) ==236074== by 0x1AEA5040: dlopen@@GLIBC_2.2.5 (in /usr/lib64/libdl-2.17.so) ==236074== by 0x1B8198DC: MPID_nem_ofi_init (ofi_init.c:158) ==236074== by 0x1B7B5C55: ??? 
(mpid_nem_init.c:231) ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) ==236074== by 0x40219D: main (one.c:335) ==236074== ==236074== Conditional jump or move depends on uninitialised value(s) ==236074== at 0x2183B810: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218323C1: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218341C7: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x400F9C2: _dl_init (in /usr/lib64/ld-2.17.so) ==236074== by 0x401459D: dl_open_worker (in /usr/lib64/ld-2.17.so) ==236074== by 0x400F7D3: _dl_catch_error (in /usr/lib64/ld-2.17.so) ==236074== by 0x4013B8A: _dl_open (in /usr/lib64/ld-2.17.so) ==236074== by 0x1AEA4FAA: dlopen_doit (in /usr/lib64/libdl-2.17.so) ==236074== by 0x400F7D3: _dl_catch_error (in /usr/lib64/ld-2.17.so) ==236074== by 0x1AEA55AC: _dlerror_run (in /usr/lib64/libdl-2.17.so) ==236074== by 0x1AEA5040: dlopen@@GLIBC_2.2.5 (in /usr/lib64/libdl-2.17.so) ==236074== by 0x1B8198DC: MPID_nem_ofi_init (ofi_init.c:158) ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) ==236074== by 0x40219D: main (one.c:335) ==236074== ==236074== Conditional jump or move depends on uninitialised value(s) ==236074== at 0x218323C8: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218341C7: ??? 
(in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x400F9C2: _dl_init (in /usr/lib64/ld-2.17.so) ==236074== by 0x401459D: dl_open_worker (in /usr/lib64/ld-2.17.so) ==236074== by 0x400F7D3: _dl_catch_error (in /usr/lib64/ld-2.17.so) ==236074== by 0x4013B8A: _dl_open (in /usr/lib64/ld-2.17.so) ==236074== by 0x1AEA4FAA: dlopen_doit (in /usr/lib64/libdl-2.17.so) ==236074== by 0x400F7D3: _dl_catch_error (in /usr/lib64/ld-2.17.so) ==236074== by 0x1AEA55AC: _dlerror_run (in /usr/lib64/libdl-2.17.so) ==236074== by 0x1AEA5040: dlopen@@GLIBC_2.2.5 (in /usr/lib64/libdl-2.17.so) ==236074== by 0x1B8198DC: MPID_nem_ofi_init (ofi_init.c:158) ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) ==236074== by 0x40219D: main (one.c:335) ==236074== ==236074== Use of uninitialised value of size 8 ==236074== at 0x218323CF: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218341C7: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x400F9C2: _dl_init (in /usr/lib64/ld-2.17.so) ==236074== by 0x401459D: dl_open_worker (in /usr/lib64/ld-2.17.so) ==236074== by 0x400F7D3: _dl_catch_error (in /usr/lib64/ld-2.17.so) ==236074== by 0x4013B8A: _dl_open (in /usr/lib64/ld-2.17.so) ==236074== by 0x1AEA4FAA: dlopen_doit (in /usr/lib64/libdl-2.17.so) ==236074== by 0x400F7D3: _dl_catch_error (in /usr/lib64/ld-2.17.so) ==236074== by 0x1AEA55AC: _dlerror_run (in /usr/lib64/libdl-2.17.so) ==236074== by 0x1AEA5040: dlopen@@GLIBC_2.2.5 (in /usr/lib64/libdl-2.17.so) ==236074== by 0x1B8198DC: MPID_nem_ofi_init (ofi_init.c:158) ==236074== by 0x1B7B5C55: ??? 
(mpid_nem_init.c:231) ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) ==236074== by 0x40219D: main (one.c:335) ==236074== ==236074== Use of uninitialised value of size 8 ==236074== at 0x218323E5: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218341C7: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x400F9C2: _dl_init (in /usr/lib64/ld-2.17.so) ==236074== by 0x401459D: dl_open_worker (in /usr/lib64/ld-2.17.so) ==236074== by 0x400F7D3: _dl_catch_error (in /usr/lib64/ld-2.17.so) ==236074== by 0x4013B8A: _dl_open (in /usr/lib64/ld-2.17.so) ==236074== by 0x1AEA4FAA: dlopen_doit (in /usr/lib64/libdl-2.17.so) ==236074== by 0x400F7D3: _dl_catch_error (in /usr/lib64/ld-2.17.so) ==236074== by 0x1AEA55AC: _dlerror_run (in /usr/lib64/libdl-2.17.so) ==236074== by 0x1AEA5040: dlopen@@GLIBC_2.2.5 (in /usr/lib64/libdl-2.17.so) ==236074== by 0x1B8198DC: MPID_nem_ofi_init (ofi_init.c:158) ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) ==236074== by 0x40219D: main (one.c:335) ==236074== ==236074== Conditional jump or move depends on uninitialised value(s) ==236074== at 0x2183B805: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218323C1: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218343EC: ??? 
(in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x2180E4F3: psm2_init (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x2112D7E6: psmx2_getinfo (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x210AC753: fi_getinfo@@FABRIC_1.2 (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x1B819AB7: MPID_nem_ofi_init (ofi_init.c:245) ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) ==236074== by 0x40219D: main (one.c:335) ==236074== ==236074== Conditional jump or move depends on uninitialised value(s) ==236074== at 0x2183B810: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218323C1: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218343EC: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x2180E4F3: psm2_init (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x2112D7E6: psmx2_getinfo (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x210AC753: fi_getinfo@@FABRIC_1.2 (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x1B819AB7: MPID_nem_ofi_init (ofi_init.c:245) ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) ==236074== by 0x40219D: main (one.c:335) ==236074== ==236074== Conditional jump or move depends on uninitialised value(s) ==236074== at 0x218323C8: ??? 
(in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218343EC: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x2180E4F3: psm2_init (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x2112D7E6: psmx2_getinfo (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x210AC753: fi_getinfo@@FABRIC_1.2 (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x1B819AB7: MPID_nem_ofi_init (ofi_init.c:245) ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) ==236074== by 0x40219D: main (one.c:335) ==236074== ==236074== Use of uninitialised value of size 8 ==236074== at 0x218323CF: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218343EC: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x2180E4F3: psm2_init (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x2112D7E6: psmx2_getinfo (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x210AC753: fi_getinfo@@FABRIC_1.2 (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x1B819AB7: MPID_nem_ofi_init (ofi_init.c:245) ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) ==236074== by 0x40219D: main (one.c:335) ==236074== ==236074== Use of uninitialised value of size 8 ==236074== at 0x218323E5: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218343EC: ??? 
(in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x2180E4F3: psm2_init (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x2112D7E6: psmx2_getinfo (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x210AC753: fi_getinfo@@FABRIC_1.2 (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x1B819AB7: MPID_nem_ofi_init (ofi_init.c:245) ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) ==236074== by 0x40219D: main (one.c:335) ==236074== ==236074== Conditional jump or move depends on uninitialised value(s) ==236074== at 0x2183B69A: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218371EC: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x21837077: hfi_get_port_lid (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x21836F9A: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x21834872: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x217F7F5D: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x217FDC29: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x217FCA91: psm2_ep_open (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x2112FB19: psmx2_trx_ctxt_alloc (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x21138089: psmx2_ep_open (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x1B81A6A1: fi_endpoint (fi_endpoint.h:155) ==236074== by 0x1B81A6A1: MPID_nem_ofi_init (ofi_init.c:377) ==236074== by 0x1B7B5C55: ??? 
(mpid_nem_init.c:231) ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) ==236074== by 0x40219D: main (one.c:335) ==236074== ==236074== Conditional jump or move depends on uninitialised value(s) ==236074== at 0x2183B7B8: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218371EC: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x21837077: hfi_get_port_lid (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x21836F9A: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x21834872: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x217F7F5D: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x217FDC29: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x217FCA91: psm2_ep_open (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x2112FB19: psmx2_trx_ctxt_alloc (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x21138089: psmx2_ep_open (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x1B81A6A1: fi_endpoint (fi_endpoint.h:155) ==236074== by 0x1B81A6A1: MPID_nem_ofi_init (ofi_init.c:377) ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) ==236074== by 0x40219D: main (one.c:335) ==236074== ==236074== Conditional jump or move depends on uninitialised value(s) ==236074== at 0x2183B69A: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218371EC: ??? 
(in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x21837077: hfi_get_port_lid (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x217F88C8: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x217FDC29: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x217FCA91: psm2_ep_open (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x2112FB19: psmx2_trx_ctxt_alloc (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x21138089: psmx2_ep_open (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x1B81A6A1: fi_endpoint (fi_endpoint.h:155) ==236074== by 0x1B81A6A1: MPID_nem_ofi_init (ofi_init.c:377) ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) ==236074== by 0x40219D: main (one.c:335) ==236074== ==236074== Conditional jump or move depends on uninitialised value(s) ==236074== at 0x2183B7B8: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218371EC: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x21837077: hfi_get_port_lid (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x217F88C8: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x217FDC29: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x217FCA91: psm2_ep_open (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x2112FB19: psmx2_trx_ctxt_alloc (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x21138089: psmx2_ep_open (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x1B81A6A1: fi_endpoint (fi_endpoint.h:155) ==236074== by 0x1B81A6A1: MPID_nem_ofi_init (ofi_init.c:377) ==236074== by 0x1B7B5C55: ??? 
(mpid_nem_init.c:231) ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) ==236074== by 0x40219D: main (one.c:335) ==236074== ==236074== Conditional jump or move depends on uninitialised value(s) ==236074== at 0x2183B69A: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218371EC: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x21837077: hfi_get_port_lid (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x217F916B: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x217FDC29: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x217FCA91: psm2_ep_open (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x2112FB19: psmx2_trx_ctxt_alloc (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x21138089: psmx2_ep_open (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x1B81A6A1: fi_endpoint (fi_endpoint.h:155) ==236074== by 0x1B81A6A1: MPID_nem_ofi_init (ofi_init.c:377) ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) ==236074== by 0x40219D: main (one.c:335) ==236074== ==236074== Conditional jump or move depends on uninitialised value(s) ==236074== at 0x2183B7B8: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218371EC: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x21837077: hfi_get_port_lid (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x217F916B: ??? 
(in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x217FDC29: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x217FCA91: psm2_ep_open (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x2112FB19: psmx2_trx_ctxt_alloc (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x21138089: psmx2_ep_open (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x1B81A6A1: fi_endpoint (fi_endpoint.h:155) ==236074== by 0x1B81A6A1: MPID_nem_ofi_init (ofi_init.c:377) ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) ==236074== by 0x40219D: main (one.c:335) ==236074== ==236074== Conditional jump or move depends on uninitialised value(s) ==236074== at 0x1B1DA260: __I_MPI___intel_sse2_strncmp (in /opt/intel/compilers_and_libraries_2018.2.199/linux/mpi/intel64/lib/libmpifort.so.12.0) ==236074== by 0x1B8CFBA1: ??? (simple_pmi.c:2376) ==236074== by 0x1B8CBDAD: PMIi_InitIfSingleton (simple_pmi.c:2883) ==236074== by 0x1B8CBDAD: iPMI_KVS_Get (simple_pmi.c:751) ==236074== by 0x1B7CCC1E: ??? 
(mpidi_pg.c:949) ==236074== by 0x1B817EAA: MPID_nem_ofi_post_init (ofi_init.c:1736) ==236074== by 0x1B7B3575: MPID_nem_init_post (mpid_nem_init.c:1421) ==236074== by 0x1B5806E3: MPIDI_CH3_Init (ch3_init.c:146) ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) ==236074== by 0x40219D: main (one.c:335) ==236074== ==236074== Conditional jump or move depends on uninitialised value(s) ==236074== at 0x1B1DA383: __I_MPI___intel_sse2_strncmp (in /opt/intel/compilers_and_libraries_2018.2.199/linux/mpi/intel64/lib/libmpifort.so.12.0) ==236074== by 0x1B8CFBA1: ??? (simple_pmi.c:2376) ==236074== by 0x1B8CBDAD: PMIi_InitIfSingleton (simple_pmi.c:2883) ==236074== by 0x1B8CBDAD: iPMI_KVS_Get (simple_pmi.c:751) ==236074== by 0x1B7CCC1E: ??? (mpidi_pg.c:949) ==236074== by 0x1B817EAA: MPID_nem_ofi_post_init (ofi_init.c:1736) ==236074== by 0x1B7B3575: MPID_nem_init_post (mpid_nem_init.c:1421) ==236074== by 0x1B5806E3: MPIDI_CH3_Init (ch3_init.c:146) ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) ==236074== by 0x40219D: main (one.c:335) ==236074== ==236074== Conditional jump or move depends on uninitialised value(s) ==236074== at 0x1E48032E: __intel_sse4_strcpy (in /opt/intel/compilers_and_libraries_2018.2.199/linux/compiler/lib/intel64_lin/libirc.so) ==236074== by 0x51FD8BE: PetscStrcpy (str.c:354) ==236074== by 0x51FD7A3: PetscStrallocpy (str.c:188) ==236074== by 0x52A39CE: PetscEventRegLogRegister (eventlog.c:313) ==236074== by 0x527D89A: PetscLogEventRegister (plog.c:693) ==236074== by 0x6A56A20: PCBDDCInitializePackage (bddc.c:3115) ==236074== by 0x6E1A515: PCInitializePackage (dlregisksp.c:59) ==236074== by 
0x6DB1A86: PCCreate (precon.c:382) ==236074== by 0x6E05167: KSPGetPC (itfunc.c:1837) ==236074== by 0x6E0FC5C: KSPSetDM (iterativ.c:1150) ==236074== by 0x6FDD27B: SNESSetDM (snes.c:5402) ==236074== by 0x70B85F7: TSGetSNES (ts.c:2914) ==236074== by 0x70BE430: TSSetDM (ts.c:4949) ==236074== by 0x402496: main (one.c:378) ==236074== ==236074== Conditional jump or move depends on uninitialised value(s) ==236074== at 0x1E4782BA: __intel_ssse3_strncpy (in /opt/intel/compilers_and_libraries_2018.2.199/linux/compiler/lib/intel64_lin/libirc.so) ==236074== by 0x51FFD24: PetscStrncpy (str.c:392) ==236074== by 0x51FEB03: PetscStrreplace (str.c:1142) ==236074== by 0x52C9958: PetscViewerFileSetName (filev.c:659) ==236074== by 0x52B743B: PetscViewerVTKOpen (vtkv.c:279) ==236074== by 0x70C76E6: TSMonitorSolutionVTK (ts.c:5580) ==236074== by 0x40313C: FormFunction (one.c:120) ==236074== by 0x7066531: TSComputeIFunction_DMDA (dmdats.c:82) ==236074== by 0x70BA5EF: TSComputeIFunction (ts.c:857) ==236074== by 0x711E2DC: SNESTSFormFunction_BDF (bdf.c:368) ==236074== by 0x70C6E46: SNESTSFormFunction (ts.c:5014) ==236074== by 0x6FDC8A6: SNESComputeFunction (snes.c:2383) ==236074== by 0x7023556: SNESSolve_NEWTONTR (tr.c:297) ==236074== by 0x6FD160F: SNESSolve (snes.c:4569) ==236074== by 0x711917E: PetscMemcpy (bdf.c:223) ==236074== by 0x711917E: PetscCitationsRegister (petscsys.h:2689) ==236074== by 0x711917E: TSStep_BDF.A (bdf.c:265) ==236074== by 0x70C363A: TSStep (ts.c:3757) ==236074== by 0x70C1999: TSSolve (ts.c:4154) ==236074== by 0x402594: main (one.c:391) ==236074== ==236074== Conditional jump or move depends on uninitialised value(s) ==236074== at 0x1E4782BA: __intel_ssse3_strncpy (in /opt/intel/compilers_and_libraries_2018.2.199/linux/compiler/lib/intel64_lin/libirc.so) ==236074== by 0x51FFD24: PetscStrncpy (str.c:392) ==236074== by 0x51FEB03: PetscStrreplace (str.c:1142) ==236074== by 0x5224E4B: PetscFOpen (mpiuopen.c:52) ==236074== by 0x63A074B: DMDAVTKWriteAll_VTS.A (grvtk.c:72) 
==236074== by 0x639A589: DMDAVTKWriteAll (grvtk.c:545) ==236074== by 0x52B66F3: PetscViewerFlush_VTK (vtkv.c:100) ==236074== by 0x52CFAAE: PetscViewerFlush (flush.c:26) ==236074== by 0x52CEA95: PetscViewerDestroy (view.c:113) ==236074== by 0x70C7717: TSMonitorSolutionVTK (ts.c:5582) ==236074== by 0x40313C: FormFunction (one.c:120) ==236074== by 0x7066531: TSComputeIFunction_DMDA (dmdats.c:82) ==236074== by 0x70BA5EF: TSComputeIFunction (ts.c:857) ==236074== by 0x711E2DC: SNESTSFormFunction_BDF (bdf.c:368) ==236074== by 0x70C6E46: SNESTSFormFunction (ts.c:5014) ==236074== by 0x6FDC8A6: SNESComputeFunction (snes.c:2383) ==236074== by 0x7023556: SNESSolve_NEWTONTR (tr.c:297) ==236074== by 0x6FD160F: SNESSolve (snes.c:4569) ==236074== by 0x711917E: PetscMemcpy (bdf.c:223) ==236074== by 0x711917E: PetscCitationsRegister (petscsys.h:2689) ==236074== by 0x711917E: TSStep_BDF.A (bdf.c:265) ==236074== by 0x70C363A: TSStep (ts.c:3757) ==236074== ==236074== Conditional jump or move depends on uninitialised value(s) ==236074== at 0x5F10977: MatFDColoringSetUpBlocked_AIJ_Private (fdaij.c:146) ==236074== by 0x5F10977: MatFDColoringSetUp_SeqXAIJ.A (fdaij.c:284) ==236074== by 0x585892A: MatFDColoringSetUp (fdmatrix.c:242) ==236074== by 0x6FE5037: SNESComputeJacobianDefaultColor (snesj2.c:79) ==236074== by 0x6FC8E4E: SNESComputeJacobian (snes.c:2717) ==236074== by 0x70236E7: SNESSolve_NEWTONTR (tr.c:324) ==236074== by 0x6FD160F: SNESSolve (snes.c:4569) ==236074== by 0x711917E: PetscMemcpy (bdf.c:223) ==236074== by 0x711917E: PetscCitationsRegister (petscsys.h:2689) ==236074== by 0x711917E: TSStep_BDF.A (bdf.c:265) ==236074== by 0x70C363A: TSStep (ts.c:3757) ==236074== by 0x70C1999: TSSolve (ts.c:4154) ==236074== by 0x402594: main (one.c:391) ==236074== ==236074== Invalid write of size 4 ==236074== at 0x5F10983: MatFDColoringSetUpBlocked_AIJ_Private (fdaij.c:150) ==236074== by 0x5F10983: MatFDColoringSetUp_SeqXAIJ.A (fdaij.c:284) ==236074== by 0x585892A: MatFDColoringSetUp 
(fdmatrix.c:242) ==236074== by 0x6FE5037: SNESComputeJacobianDefaultColor (snesj2.c:79) ==236074== by 0x6FC8E4E: SNESComputeJacobian (snes.c:2717) ==236074== by 0x70236E7: SNESSolve_NEWTONTR (tr.c:324) ==236074== by 0x6FD160F: SNESSolve (snes.c:4569) ==236074== by 0x711917E: PetscMemcpy (bdf.c:223) ==236074== by 0x711917E: PetscCitationsRegister (petscsys.h:2689) ==236074== by 0x711917E: TSStep_BDF.A (bdf.c:265) ==236074== by 0x70C363A: TSStep (ts.c:3757) ==236074== by 0x70C1999: TSSolve (ts.c:4154) ==236074== by 0x402594: main (one.c:391) ==236074== Address 0x3a94fa80 is 0 bytes after a block of size 73,960,000 alloc'd ==236074== at 0x4C2C480: memalign (vg_replace_malloc.c:909) ==236074== by 0x522FFE2: PetscMallocAlign (mal.c:52) ==236074== by 0x52305F9: PetscMallocA (mal.c:418) ==236074== by 0x5F10778: MatFDColoringSetUpBlocked_AIJ_Private (fdaij.c:125) ==236074== by 0x5F10778: MatFDColoringSetUp_SeqXAIJ.A (fdaij.c:284) ==236074== by 0x585892A: MatFDColoringSetUp (fdmatrix.c:242) ==236074== by 0x6FE5037: SNESComputeJacobianDefaultColor (snesj2.c:79) ==236074== by 0x6FC8E4E: SNESComputeJacobian (snes.c:2717) ==236074== by 0x70236E7: SNESSolve_NEWTONTR (tr.c:324) ==236074== by 0x6FD160F: SNESSolve (snes.c:4569) ==236074== by 0x711917E: PetscMemcpy (bdf.c:223) ==236074== by 0x711917E: PetscCitationsRegister (petscsys.h:2689) ==236074== by 0x711917E: TSStep_BDF.A (bdf.c:265) ==236074== by 0x70C363A: TSStep (ts.c:3757) ==236074== by 0x70C1999: TSSolve (ts.c:4154) ==236074== by 0x402594: main (one.c:391) ==236074== ==236074== Invalid write of size 8 ==236074== at 0x5F10991: MatFDColoringSetUpBlocked_AIJ_Private (fdaij.c:151) ==236074== by 0x5F10991: MatFDColoringSetUp_SeqXAIJ.A (fdaij.c:284) ==236074== by 0x585892A: MatFDColoringSetUp (fdmatrix.c:242) ==236074== by 0x6FE5037: SNESComputeJacobianDefaultColor (snesj2.c:79) ==236074== by 0x6FC8E4E: SNESComputeJacobian (snes.c:2717) ==236074== by 0x70236E7: SNESSolve_NEWTONTR (tr.c:324) ==236074== by 0x6FD160F: SNESSolve 
(snes.c:4569)
==236074==    by 0x711917E: PetscMemcpy (bdf.c:223)
==236074==    by 0x711917E: PetscCitationsRegister (petscsys.h:2689)
==236074==    by 0x711917E: TSStep_BDF.A (bdf.c:265)
==236074==    by 0x70C363A: TSStep (ts.c:3757)
==236074==    by 0x70C1999: TSSolve (ts.c:4154)
==236074==    by 0x402594: main (one.c:391)
==236074==  Address 0x3a94fa88 is 8 bytes after a block of size 73,960,000 alloc'd
==236074==    at 0x4C2C480: memalign (vg_replace_malloc.c:909)
==236074==    by 0x522FFE2: PetscMallocAlign (mal.c:52)
==236074==    by 0x52305F9: PetscMallocA (mal.c:418)
==236074==    by 0x5F10778: MatFDColoringSetUpBlocked_AIJ_Private (fdaij.c:125)
==236074==    by 0x5F10778: MatFDColoringSetUp_SeqXAIJ.A (fdaij.c:284)
==236074==    by 0x585892A: MatFDColoringSetUp (fdmatrix.c:242)
==236074==    by 0x6FE5037: SNESComputeJacobianDefaultColor (snesj2.c:79)
==236074==    by 0x6FC8E4E: SNESComputeJacobian (snes.c:2717)
==236074==    by 0x70236E7: SNESSolve_NEWTONTR (tr.c:324)
==236074==    by 0x6FD160F: SNESSolve (snes.c:4569)
==236074==    by 0x711917E: PetscMemcpy (bdf.c:223)
==236074==    by 0x711917E: PetscCitationsRegister (petscsys.h:2689)
==236074==    by 0x711917E: TSStep_BDF.A (bdf.c:265)
==236074==    by 0x70C363A: TSStep (ts.c:3757)
==236074==    by 0x70C1999: TSSolve (ts.c:4154)
==236074==    by 0x402594: main (one.c:391)
==236074==

Sent from Mail for Windows
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 
-------------- next part --------------
An embedded and charset-unspecified text was scrubbed...
Name: common.c
URL: 
-------------- next part --------------
An embedded and charset-unspecified text was scrubbed...
Name: common.h
URL: 
-------------- next part --------------
A non-text attachment was scrubbed...
Name: makefile
Type: application/octet-stream
Size: 239 bytes
Desc: makefile
URL: 
-------------- next part --------------
An embedded and charset-unspecified text was scrubbed...
Name: one.c
URL: 

From alexlindsay239 at gmail.com  Thu Nov  3 18:51:40 2022
From: alexlindsay239 at gmail.com (Alexander Lindsay)
Date: Thu, 3 Nov 2022 16:51:40 -0700
Subject: [petsc-users] Local columns of A10 do not equal local rows of A00
Message-ID: 

I am seeing errors like the following on quite a few (but not all) processes:

[1]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[1]PETSC ERROR: Nonconforming object sizes
[1]PETSC ERROR: Local columns of A10 4137 do not equal local rows of A00 4129

when performing field splits. We (MOOSE) have some code for identifying the index sets for each split; however, that code was written by authors who are no longer with us. Normally I would chase this down in a debugger, but the error only seems to crop up for fairly complex and large meshes. If anyone has an idea of what we might be doing wrong, it could help me track this down faster. Intuitively, I am perplexed that we could get ourselves into this pickle, as it almost appears that we have two different local dof index counts for a given block (block 0 in this case). More background, if helpful, can be found in https://github.com/idaholab/moose/issues/22359 as well as https://github.com/idaholab/moose/discussions/22468.

I should note that we are currently running with 3.16.6 as our PETSc submodule hash (we are talking about updating to 3.18 soon).
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From kavousi at mines.edu  Thu Nov  3 18:53:00 2022
From: kavousi at mines.edu (Sepideh Kavousi)
Date: Thu, 3 Nov 2022 23:53:00 +0000
Subject: [petsc-users] [External] Periodic boundary condition
In-Reply-To: <8213DE9B-FD24-42F5-A293-5F124764937D@petsc.dev>
References: <8213DE9B-FD24-42F5-A293-5F124764937D@petsc.dev>
Message-ID: 

I am wondering whether setting DM_BOUNDARY_PERIODIC in the x direction together with a stencil width of 3 in DMDACreate2d could be the reason for the segmentation error.
I just realized that if I set the stencil width to 2, I do not get the segmentation error. The reason I chose a stencil width of 3 is that I need information at i+1, i+2, i-1, and i-2 to solve the equations at node (i,j).

Best,
Sepideh

Sent from Mail for Windows

From: Barry Smith 
Sent: Thursday, November 3, 2022 12:18 PM
To: Sepideh Kavousi 
Cc: petsc-users at mcs.anl.gov
Subject: Re: [External] [petsc-users] Periodic boundary condition

Can you send the code that just does what you indicate below in the FormFunction() and crashes? Then I can run it directly and track down the issue.

Barry

On Nov 3, 2022, at 10:56 AM, Sepideh Kavousi wrote:

Barry,
Even for the case where I am not solving any PDE equations in the FormFunction (by setting:
aF[j][i].vx=aY[j][i].vx;
aF[j][i].vy=aY[j][i].vy;
aF[j][i].pp=aY[j][i].pp;
aF[j][i].U=aY[j][i].U;
aF[j][i].p=aY[j][i].p;
) I still run into a segmentation error. Let me just follow what you suggested in the following link:
https://lists.mcs.anl.gov/pipermail/petsc-users/2012-May/013476.html
src/ksp/ksp/tutorials/ex45.c runs perfectly, but when I change the bc along the x direction from DM_BOUNDARY_NONE to DM_BOUNDARY_PERIODIC and delete (i==0 || i==mx-1) from if (i==0 || j==0 || k==0 || i==mx-1 || j==my-1 || k==mz-1), I run into the following error. I am not sure how else I should implement a periodic bc in this problem.

[0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[0]PETSC ERROR: Object is in wrong state
[0]PETSC ERROR: Matrix is missing diagonal entry 5
[0]PETSC ERROR: See https://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
[1]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[1]PETSC ERROR: Object is in wrong state
[1]PETSC ERROR: Matrix is missing diagonal entry 5
[1]PETSC ERROR: See https://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
[0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[0]PETSC ERROR: Object is in wrong state
[0]PETSC ERROR: Matrix is missing diagonal entry 5
[0]PETSC ERROR: See https://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
[0]PETSC ERROR: Petsc Release Version 3.14.2, Dec 03, 2020
[0]PETSC ERROR: /scratch/07065/tg863649/convection/periodic_test/one.out on a skylake named c402-092.stampede2.tacc.utexas.edu by tg863649 Thu Nov 3 09:53:00 2022
[0]PETSC ERROR: Configure options --with-x=0 -with-pic --with-make-np=12 --download-petsc4py=1 --with-python-exec=/opt/apps/intel18/python2/2.7.16/bin/python2 --with-packages-build-dir=/tmp/petsc-3.14/skylake --with-mpi=1 --with-mpi-dir=/opt/intel/compilers_and_libraries_2018.2.199/linux/mpi/intel64 --with-scalar-type=real --with-shared-libraries=1 --with-precision=double --with-chaco=1 --download-chaco --with-hypre=1 --download-hypre --with-parmetis=1 --download-parmetis --with-metis=1 --download-metis --with-plapack=1 --download-plapack --with-spai=1 --download-spai --with-sundials=1 --download-sundials --with-elemental=1 --download-elemental --with-cxx-dialect=C++11 --with-fftw=1 --with-fftw-dir=/opt/apps/intel18/impi18_0/fftw3/3.3.8 --with-hdf5=1 --with-hdf5-dir=/opt/apps/intel18/impi18_0/phdf5/1.10.4/x86_64 --download-hpddm --download-slepc --with-mumps=1 --download-mumps --with-scalapack=1 --download-scalapack --with-blacs=1 --download-blacs --with-spooles=1 --download-spooles --with-suitesparse=1 --download-suitesparse --with-superlu_dist=1 --download-superlu_dist --with-superlu=1 --download-superlu --with-zoltan=1 --download-zoltan=1 --download-ptscotch=1 --with-debugging=no --LIBS= --with-blaslapack-dir=/opt/intel/compilers_and_libraries_2018.2.199/linux/mkl COPTFLAGS="-xCORE-AVX2 -axMIC-AVX512,CORE-AVX512 -O2 -g" FOPTFLAGS="-xCORE-AVX2 -axMIC-AVX512,CORE-AVX512 -O2 -g" CXXOPTFLAGS="-xCORE-AVX2 -axMIC-AVX512,CORE-AVX512 -O2 -g"
[0]PETSC ERROR: #1 MatILUFactorSymbolic_SeqAIJ() line 1686 in /home1/apps/intel18/impi18_0/petsc/3.14/src/mat/impls/aij/seq/aijfact.c
[0]PETSC ERROR: #2 MatILUFactorSymbolic() line 6710 in /home1/apps/intel18/impi18_0/petsc/3.14/src/mat/interface/matrix.c
[0]PETSC ERROR: #3 PCSetUp_ILU() line 141 in /home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/pc/impls/factor/ilu/ilu.c
[0]PETSC ERROR: #4 PCSetUp() line 1009 in /home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/pc/interface/precon.c
[0]PETSC ERROR: #5 KSPSetUp() line 406 in /home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/ksp/interface/itfunc.c
[0]PETSC ERROR: #6 KSPSolve_Private() line 658 in /home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/ksp/interface/itfunc.c
[0]PETSC ERROR: #7 KSPSolve() line 889 in /home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/ksp/interface/itfunc.c
[0]PETSC ERROR: #8 DMDAGetFaceInterpolation() line 493 in /home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/pc/impls/wb/wb.c
[0]PETSC ERROR: #9 PCSetUp_Exotic() line 667 in /home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/pc/impls/wb/wb.c
[0]PETSC ERROR: #10 PCSetUp() line 1009 in /home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/pc/interface/precon.c
[0]PETSC ERROR: #11 KSPSetUp() line 406 in /home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/ksp/interface/itfunc.c
[0]PETSC ERROR: #12 KSPSolve_Private() line 658 in /home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/ksp/interface/itfunc.c
[0]PETSC ERROR: #13 KSPSolve() line 889 in /home1/apps/intel18/impi18_0/petsc/3.14/src/ksp/ksp/interface/itfunc.c
[0]PETSC ERROR: #14 main() line 51 in /scratch/07065/tg863649/convection/periodic_test/one.c
[0]PETSC ERROR: PETSc Option Table entries:
[0]PETSC ERROR: -ksp_monitor_short
[0]PETSC ERROR: -ksp_type fgmres
[0]PETSC ERROR: -mg_levels_ksp_max_it 1
[0]PETSC ERROR: -mg_levels_ksp_type gmres
[0]PETSC ERROR: -mg_levels_pc_type bjacobi
[0]PETSC ERROR: -pc_type exotic
[0]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-maint at mcs.anl.gov----------
[ranks 1-3 reported the identical error, traceback, and option table; the interleaved output is shown once, de-interleaved]
application called MPI_Abort(MPI_COMM_SELF, 51073) - process 0
application called MPI_Abort(MPI_COMM_SELF, 51073) - process 0
application called MPI_Abort(MPI_COMM_SELF, 51073) - process 0
application called MPI_Abort(MPI_COMM_SELF, 51073) - process 0
TACC: MPI job exited with code: 129
TACC: Shutdown complete. Exiting.

Sent from Mail for Windows

From: Barry Smith
Sent: Tuesday, October 25, 2022 7:24 PM
To: Sepideh Kavousi
Cc: petsc-users at mcs.anl.gov
Subject: Re: [External] [petsc-users] Periodic boundary condition

   Sorry, I was not clear: at this point you need to type c for continue, and then, when it crashes in the debugger, type bt.

  Barry

On Oct 25, 2022, at 6:37 PM, Sepideh Kavousi > wrote:

Hello Barry,
When I ran with , the error is about the PetscInitialize line (line 333). When I type bt multiple times, it just keeps referring to this line.
#0 0x00002b701cfed9fd in nanosleep () from /lib64/libc.so.6
#1 0x00002b701cfed894 in sleep () from /lib64/libc.so.6
#2 0x00002b70035fb4ae in PetscSleep (s=1) at /home1/apps/intel18/impi18_0/petsc/3.14/src/sys/utils/psleep.c:46
#3 0x00002b700364b8bb in PetscAttachDebugger () at /home1/apps/intel18/impi18_0/petsc/3.14/src/sys/error/adebug.c:405
#4 0x00002b700366cfcd in PetscOptionsCheckInitial_Private (help=0x7ffec24c7940 "\t") at /home1/apps/intel18/impi18_0/petsc/3.14/src/sys/objects/init.c:608
#5 0x00002b7003674cd6 in PetscInitialize (argc=0x7ffec24c7940, args=0x7ffec24c7940, file=0x0, help=0xffffffffffffffff
) at /home1/apps/intel18/impi18_0/petsc/3.14/src/sys/objects/pinit.c:1025
#6 0x00000000004021ce in main (argc=24, argv=0x7ffec24d14e8) at /scratch/07065/tg863649/convection/test-a9-3-options_small_MAC_pressure_old/one.c:333
Best,
Sepideh

Sent from Mail for Windows

From: Barry Smith
Sent: Friday, October 21, 2022 10:54 AM
To: Sepideh Kavousi
Cc: petsc-users at mcs.anl.gov
Subject: Re: [External] [petsc-users] Periodic boundary condition

   The problem with the output below is that it does not give a clear indication of where the crash occurred:

#1 User provided function() line 0 in unknown file

   Run with the exact same options but also -start_in_debugger noxterm. It should then crash in the debugger, and you can type bt to see the backtrace of where it crashed; send that output.

  Barry

   Background: MatFDColoringSetUpBlocked_AIJ_Private() allocates the space that is used when evaluating the function multiple times to get the Jacobian entries. If the FormFunction writes into incorrect locations, then it will corrupt this memory that was allocated in MatFDColoringSetUpBlocked_AIJ_Private(). It does not necessarily mean that there is anything wrong in MatFDColoringSetUpBlocked_AIJ_Private().

On Oct 21, 2022, at 12:32 AM, Sepideh Kavousi > wrote:

Barry,
I ran the code with -malloc_debug and added CHKMEMQ for all the lines inside FormFunction. Following are the details of the error.
Best,
Sepideh

[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range
[0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
[0]PETSC ERROR: or see https://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
[0]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors
[0]PETSC ERROR: configure using --with-debugging=yes, recompile, link, and run
[0]PETSC ERROR: to get more information on the crash.
[0]PETSC ERROR: PetscMallocValidate: error detected at PetscError() line 401 in /home1/apps/intel18/impi18_0/petsc/3.14/src/sys/error/err.c
[0]PETSC ERROR: Memory [id=0(73960000)] at address 0x2b5aed6ab050 is corrupted (probably write past end of array)
[0]PETSC ERROR: Memory originally allocated in MatFDColoringSetUpBlocked_AIJ_Private() line 125 in /home1/apps/intel18/impi18_0/petsc/3.14/src/mat/impls/aij/seq/fdaij.c
[0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[0]PETSC ERROR: Signal received
[0]PETSC ERROR: See https://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
[0]PETSC ERROR: Petsc Release Version 3.14.2, Dec 03, 2020
[0]PETSC ERROR: ./one.out on a skylake named c415-063.stampede2.tacc.utexas.edu by tg863649 Thu Oct 20 23:30:05 2022
[0]PETSC ERROR: Configure options --with-x=0 -with-pic --with-make-np=12 --download-petsc4py=1 --with-python-exec=/opt/apps/intel18/python2/2.7.16/bin/python2 --with-packages-build-dir=/tmp/petsc-3.14/skylake --with-mpi=1 --with-mpi-dir=/opt/intel/compilers_and_libraries_2018.2.199/linux/mpi/intel64 --with-scalar-type=real --with-shared-libraries=1 --with-precision=double --with-chaco=1 --download-chaco --with-hypre=1 --download-hypre --with-parmetis=1 --download-parmetis --with-metis=1 --download-metis --with-plapack=1 --download-plapack --with-spai=1 --download-spai --with-sundials=1 --download-sundials --with-elemental=1 --download-elemental --with-cxx-dialect=C++11 --with-fftw=1 --with-fftw-dir=/opt/apps/intel18/impi18_0/fftw3/3.3.8 --with-hdf5=1 --with-hdf5-dir=/opt/apps/intel18/impi18_0/phdf5/1.10.4/x86_64 --download-hpddm --download-slepc --with-mumps=1 --download-mumps --with-scalapack=1 --download-scalapack --with-blacs=1 --download-blacs --with-spooles=1 --download-spooles --with-suitesparse=1 --download-suitesparse --with-superlu_dist=1 --download-superlu_dist --with-superlu=1 --download-superlu --with-zoltan=1 --download-zoltan=1 --download-ptscotch=1 --with-debugging=no --LIBS= --with-blaslapack-dir=/opt/intel/compilers_and_libraries_2018.2.199/linux/mkl COPTFLAGS="-xCORE-AVX2 -axMIC-AVX512,CORE-AVX512 -O2 -g" FOPTFLAGS="-xCORE-AVX2 -axMIC-AVX512,CORE-AVX512 -O2 -g" CXXOPTFLAGS="-xCORE-AVX2 -axMIC-AVX512,CORE-AVX512 -O2 -g"
[0]PETSC ERROR: #1 User provided function() line 0 in unknown file
[0]PETSC ERROR: Checking the memory for corruption.
[0]PETSC ERROR: PetscMallocValidate: error detected at PetscSignalHandlerDefault() line 170 in /home1/apps/intel18/impi18_0/petsc/3.14/src/sys/error/signal.c
[0]PETSC ERROR: Memory [id=0(73960000)] at address 0x2b5aed6ab050 is corrupted (probably write past end of array)
[0]PETSC ERROR: Memory originally allocated in MatFDColoringSetUpBlocked_AIJ_Private() line 125 in /home1/apps/intel18/impi18_0/petsc/3.14/src/mat/impls/aij/seq/fdaij.c
application called MPI_Abort(MPI_COMM_WORLD, 50176059) - process 0
[unset]: readline failed

Sent from Mail for Windows

From: Barry Smith
Sent: Thursday, October 20, 2022 10:27 PM
To: Sepideh Kavousi
Cc: petsc-users at mcs.anl.gov
Subject: [External] Re: [petsc-users] Periodic boundary condition

   Some of the valgrind information does not appear to make sense. PetscMemcpy() does not call SNESSolve(), so I suspect there must be some serious corruption of something to produce this impossible stack trace:

==236074== by 0x6FD160F: SNESSolve (snes.c:4569)
==236074== by 0x711917E: PetscMemcpy (bdf.c:223)

From

==236074== Conditional jump or move depends on uninitialised value(s)
==236074== at 0x5F10977: MatFDColoringSetUpBlocked_AIJ_Private (fdaij.c:146)
==236074== by 0x5F10977: MatFDColoringSetUp_SeqXAIJ.A (fdaij.c:284)
==236074== by 0x585892A: MatFDColoringSetUp (fdmatrix.c:242)
==236074== by 0x6FE5037: SNESComputeJacobianDefaultColor (snesj2.c:79)
==236074== by 0x6FC8E4E: SNESComputeJacobian (snes.c:2717)
==236074== by 0x70236E7: SNESSolve_NEWTONTR (tr.c:324)
==236074== by 0x6FD160F: SNESSolve (snes.c:4569)
==236074== by 0x711917E: PetscMemcpy (bdf.c:223)
==236074== by 0x711917E: PetscCitationsRegister (petscsys.h:2689)
==236074== by 0x711917E: TSStep_BDF.A (bdf.c:265)
==236074== by 0x70C363A: TSStep (ts.c:3757)
==236074== by 0x70C1999: TSSolve (ts.c:4154)
==236074== by 0x402594: main (one.c:391)

   I suggest you run with -malloc_debug instead of valgrind and see if any errors are reported. If so, you can add the macro CHKMEMQ; inside your function evaluation, wherever you write to memory, to see if anything is writing to the wrong location. For example, wherever you assign aF, such as

aF[j][i].vx = (x3+x4+x5+x6+x7+x8+x9-x1-x2)*user->hx;

this can help you determine the exact line number where you are writing to the wrong location, and from that what the cause might be.

On Oct 20, 2022, at 6:45 PM, Sepideh Kavousi > wrote:

Hello,
I want to solve my 5 PDEs with a finite difference method, using a periodic BC in the x-direction and a non-periodic BC in the y-direction, but I run into an error (Segmentation Violation, probably memory access out of range). I discretize my equations in the FormFunction routine. My PDE discretization at node (i,j) needs data at the (i+1,j), (i+2,j), (i-1,j), (i-2,j), (i,j+1), (i,j+2), (i,j-1), and (i,j-2) points. In my previous codes, where the x-direction had a non-periodic (no-flux) boundary condition, I:
i) implemented the no-flux BC for i=0 and i=Nx-1,
ii) set i+2 = Nx-1 when discretizing (Nx-2,j) and i-2 = 0 when discretizing (1,j),
iii) discretized my equations for i=1..Nx-2.
I am not sure how I should do the periodic BC.
From the following discussions (https://lists.mcs.anl.gov/pipermail/petsc-users/2012-May/013476.html and https://lists.mcs.anl.gov/pipermail/petsc-users/2016-May/029273.html), I guess I should not do step (i) (stated above) for the x-boundaries and should just do step (iii) for i=0..Nx-1. Even if I focus on solving just 2 of the PDEs, which do need data at the (i+2,j), (i-2,j), (i,j+2), (i,j-2) points when discretizing the equation at node (i,j), I still run into the error. Running with Valgrind (on just 1 processor) gave the following output. I did not find any information in it that hints at the source of the error. Can you please help me find the error?
Best,
Sepideh

==236074== Conditional jump or move depends on uninitialised value(s) ==236074== at 0x4C29E39: malloc (vg_replace_malloc.c:309) ==236074== by 0x1B79E59B: MPID_Init (mpid_init.c:1649) ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) ==236074== by 0x40219D: main (one.c:335) ==236074== ==236074== Conditional jump or move depends on uninitialised value(s) ==236074== at 0x2183B805: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218323C1: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218341C7: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x400F9C2: _dl_init (in /usr/lib64/ld-2.17.so) ==236074== by 0x401459D: dl_open_worker (in /usr/lib64/ld-2.17.so) ==236074== by 0x400F7D3: _dl_catch_error (in /usr/lib64/ld-2.17.so) ==236074== by 0x4013B8A: _dl_open (in /usr/lib64/ld-2.17.so) ==236074== by 0x1AEA4FAA: dlopen_doit (in /usr/lib64/libdl-2.17.so) ==236074== by 0x400F7D3: _dl_catch_error (in /usr/lib64/ld-2.17.so) ==236074== by 0x1AEA55AC: _dlerror_run (in /usr/lib64/libdl-2.17.so) ==236074== by 0x1AEA5040: dlopen@@GLIBC_2.2.5 (in /usr/lib64/libdl-2.17.so) ==236074== by 0x1B8198DC: MPID_nem_ofi_init (ofi_init.c:158) ==236074== by 0x1B7B5C55: ???
(mpid_nem_init.c:231) ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) ==236074== by 0x40219D: main (one.c:335) ==236074== ==236074== Conditional jump or move depends on uninitialised value(s) ==236074== at 0x2183B810: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218323C1: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218341C7: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x400F9C2: _dl_init (in /usr/lib64/ld-2.17.so) ==236074== by 0x401459D: dl_open_worker (in /usr/lib64/ld-2.17.so) ==236074== by 0x400F7D3: _dl_catch_error (in /usr/lib64/ld-2.17.so) ==236074== by 0x4013B8A: _dl_open (in /usr/lib64/ld-2.17.so) ==236074== by 0x1AEA4FAA: dlopen_doit (in /usr/lib64/libdl-2.17.so) ==236074== by 0x400F7D3: _dl_catch_error (in /usr/lib64/ld-2.17.so) ==236074== by 0x1AEA55AC: _dlerror_run (in /usr/lib64/libdl-2.17.so) ==236074== by 0x1AEA5040: dlopen@@GLIBC_2.2.5 (in /usr/lib64/libdl-2.17.so) ==236074== by 0x1B8198DC: MPID_nem_ofi_init (ofi_init.c:158) ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) ==236074== by 0x40219D: main (one.c:335) ==236074== ==236074== Conditional jump or move depends on uninitialised value(s) ==236074== at 0x218323C8: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218341C7: ??? 
(in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x400F9C2: _dl_init (in /usr/lib64/ld-2.17.so) ==236074== by 0x401459D: dl_open_worker (in /usr/lib64/ld-2.17.so) ==236074== by 0x400F7D3: _dl_catch_error (in /usr/lib64/ld-2.17.so) ==236074== by 0x4013B8A: _dl_open (in /usr/lib64/ld-2.17.so) ==236074== by 0x1AEA4FAA: dlopen_doit (in /usr/lib64/libdl-2.17.so) ==236074== by 0x400F7D3: _dl_catch_error (in /usr/lib64/ld-2.17.so) ==236074== by 0x1AEA55AC: _dlerror_run (in /usr/lib64/libdl-2.17.so) ==236074== by 0x1AEA5040: dlopen@@GLIBC_2.2.5 (in /usr/lib64/libdl-2.17.so) ==236074== by 0x1B8198DC: MPID_nem_ofi_init (ofi_init.c:158) ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) ==236074== by 0x40219D: main (one.c:335) ==236074== ==236074== Use of uninitialised value of size 8 ==236074== at 0x218323CF: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218341C7: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x400F9C2: _dl_init (in /usr/lib64/ld-2.17.so) ==236074== by 0x401459D: dl_open_worker (in /usr/lib64/ld-2.17.so) ==236074== by 0x400F7D3: _dl_catch_error (in /usr/lib64/ld-2.17.so) ==236074== by 0x4013B8A: _dl_open (in /usr/lib64/ld-2.17.so) ==236074== by 0x1AEA4FAA: dlopen_doit (in /usr/lib64/libdl-2.17.so) ==236074== by 0x400F7D3: _dl_catch_error (in /usr/lib64/ld-2.17.so) ==236074== by 0x1AEA55AC: _dlerror_run (in /usr/lib64/libdl-2.17.so) ==236074== by 0x1AEA5040: dlopen@@GLIBC_2.2.5 (in /usr/lib64/libdl-2.17.so) ==236074== by 0x1B8198DC: MPID_nem_ofi_init (ofi_init.c:158) ==236074== by 0x1B7B5C55: ??? 
(mpid_nem_init.c:231) ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) ==236074== by 0x40219D: main (one.c:335) ==236074== ==236074== Use of uninitialised value of size 8 ==236074== at 0x218323E5: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218341C7: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x400F9C2: _dl_init (in /usr/lib64/ld-2.17.so) ==236074== by 0x401459D: dl_open_worker (in /usr/lib64/ld-2.17.so) ==236074== by 0x400F7D3: _dl_catch_error (in /usr/lib64/ld-2.17.so) ==236074== by 0x4013B8A: _dl_open (in /usr/lib64/ld-2.17.so) ==236074== by 0x1AEA4FAA: dlopen_doit (in /usr/lib64/libdl-2.17.so) ==236074== by 0x400F7D3: _dl_catch_error (in /usr/lib64/ld-2.17.so) ==236074== by 0x1AEA55AC: _dlerror_run (in /usr/lib64/libdl-2.17.so) ==236074== by 0x1AEA5040: dlopen@@GLIBC_2.2.5 (in /usr/lib64/libdl-2.17.so) ==236074== by 0x1B8198DC: MPID_nem_ofi_init (ofi_init.c:158) ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) ==236074== by 0x40219D: main (one.c:335) ==236074== ==236074== Conditional jump or move depends on uninitialised value(s) ==236074== at 0x2183B805: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218323C1: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218343EC: ??? 
(in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x2180E4F3: psm2_init (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x2112D7E6: psmx2_getinfo (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x210AC753: fi_getinfo@@FABRIC_1.2 (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x1B819AB7: MPID_nem_ofi_init (ofi_init.c:245) ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) ==236074== by 0x40219D: main (one.c:335) ==236074== ==236074== Conditional jump or move depends on uninitialised value(s) ==236074== at 0x2183B810: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218323C1: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218343EC: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x2180E4F3: psm2_init (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x2112D7E6: psmx2_getinfo (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x210AC753: fi_getinfo@@FABRIC_1.2 (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x1B819AB7: MPID_nem_ofi_init (ofi_init.c:245) ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) ==236074== by 0x40219D: main (one.c:335) ==236074== ==236074== Conditional jump or move depends on uninitialised value(s) ==236074== at 0x218323C8: ??? 
(in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218343EC: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x2180E4F3: psm2_init (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x2112D7E6: psmx2_getinfo (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x210AC753: fi_getinfo@@FABRIC_1.2 (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x1B819AB7: MPID_nem_ofi_init (ofi_init.c:245) ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) ==236074== by 0x40219D: main (one.c:335) ==236074== ==236074== Use of uninitialised value of size 8 ==236074== at 0x218323CF: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218343EC: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x2180E4F3: psm2_init (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x2112D7E6: psmx2_getinfo (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x210AC753: fi_getinfo@@FABRIC_1.2 (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x1B819AB7: MPID_nem_ofi_init (ofi_init.c:245) ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) ==236074== by 0x40219D: main (one.c:335) ==236074== ==236074== Use of uninitialised value of size 8 ==236074== at 0x218323E5: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218343EC: ??? 
(in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x2180E4F3: psm2_init (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x2112D7E6: psmx2_getinfo (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x210AC753: fi_getinfo@@FABRIC_1.2 (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x1B819AB7: MPID_nem_ofi_init (ofi_init.c:245) ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) ==236074== by 0x40219D: main (one.c:335) ==236074== ==236074== Conditional jump or move depends on uninitialised value(s) ==236074== at 0x2183B69A: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218371EC: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x21837077: hfi_get_port_lid (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x21836F9A: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x21834872: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x217F7F5D: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x217FDC29: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x217FCA91: psm2_ep_open (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x2112FB19: psmx2_trx_ctxt_alloc (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x21138089: psmx2_ep_open (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x1B81A6A1: fi_endpoint (fi_endpoint.h:155) ==236074== by 0x1B81A6A1: MPID_nem_ofi_init (ofi_init.c:377) ==236074== by 0x1B7B5C55: ??? 
(mpid_nem_init.c:231) ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) ==236074== by 0x40219D: main (one.c:335) ==236074== ==236074== Conditional jump or move depends on uninitialised value(s) ==236074== at 0x2183B7B8: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218371EC: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x21837077: hfi_get_port_lid (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x21836F9A: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x21834872: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x217F7F5D: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x217FDC29: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x217FCA91: psm2_ep_open (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x2112FB19: psmx2_trx_ctxt_alloc (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x21138089: psmx2_ep_open (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x1B81A6A1: fi_endpoint (fi_endpoint.h:155) ==236074== by 0x1B81A6A1: MPID_nem_ofi_init (ofi_init.c:377) ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) ==236074== by 0x40219D: main (one.c:335) ==236074== ==236074== Conditional jump or move depends on uninitialised value(s) ==236074== at 0x2183B69A: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218371EC: ??? 
(in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x21837077: hfi_get_port_lid (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x217F88C8: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x217FDC29: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x217FCA91: psm2_ep_open (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x2112FB19: psmx2_trx_ctxt_alloc (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x21138089: psmx2_ep_open (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x1B81A6A1: fi_endpoint (fi_endpoint.h:155) ==236074== by 0x1B81A6A1: MPID_nem_ofi_init (ofi_init.c:377) ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) ==236074== by 0x40219D: main (one.c:335) ==236074== ==236074== Conditional jump or move depends on uninitialised value(s) ==236074== at 0x2183B7B8: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218371EC: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x21837077: hfi_get_port_lid (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x217F88C8: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x217FDC29: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x217FCA91: psm2_ep_open (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x2112FB19: psmx2_trx_ctxt_alloc (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x21138089: psmx2_ep_open (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x1B81A6A1: fi_endpoint (fi_endpoint.h:155) ==236074== by 0x1B81A6A1: MPID_nem_ofi_init (ofi_init.c:377) ==236074== by 0x1B7B5C55: ??? 
(mpid_nem_init.c:231) ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) ==236074== by 0x40219D: main (one.c:335) ==236074== ==236074== Conditional jump or move depends on uninitialised value(s) ==236074== at 0x2183B69A: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218371EC: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x21837077: hfi_get_port_lid (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x217F916B: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x217FDC29: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x217FCA91: psm2_ep_open (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x2112FB19: psmx2_trx_ctxt_alloc (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x21138089: psmx2_ep_open (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x1B81A6A1: fi_endpoint (fi_endpoint.h:155) ==236074== by 0x1B81A6A1: MPID_nem_ofi_init (ofi_init.c:377) ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) ==236074== by 0x40219D: main (one.c:335) ==236074== ==236074== Conditional jump or move depends on uninitialised value(s) ==236074== at 0x2183B7B8: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x218371EC: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x21837077: hfi_get_port_lid (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x217F916B: ??? 
(in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x217FDC29: ??? (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x217FCA91: psm2_ep_open (in /usr/lib64/libpsm2.so.2.2) ==236074== by 0x2112FB19: psmx2_trx_ctxt_alloc (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x21138089: psmx2_ep_open (in /opt/apps/libfabric/1.7.0/lib/libfabric.so.1.10.0) ==236074== by 0x1B81A6A1: fi_endpoint (fi_endpoint.h:155) ==236074== by 0x1B81A6A1: MPID_nem_ofi_init (ofi_init.c:377) ==236074== by 0x1B7B5C55: ??? (mpid_nem_init.c:231) ==236074== by 0x1B7B3F92: MPID_nem_init_ckpt (mpid_nem_init.c:954) ==236074== by 0x1B580640: MPIDI_CH3_Init (ch3_init.c:125) ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) ==236074== by 0x40219D: main (one.c:335) ==236074== ==236074== Conditional jump or move depends on uninitialised value(s) ==236074== at 0x1B1DA260: __I_MPI___intel_sse2_strncmp (in /opt/intel/compilers_and_libraries_2018.2.199/linux/mpi/intel64/lib/libmpifort.so.12.0) ==236074== by 0x1B8CFBA1: ??? (simple_pmi.c:2376) ==236074== by 0x1B8CBDAD: PMIi_InitIfSingleton (simple_pmi.c:2883) ==236074== by 0x1B8CBDAD: iPMI_KVS_Get (simple_pmi.c:751) ==236074== by 0x1B7CCC1E: ??? 
(mpidi_pg.c:949) ==236074== by 0x1B817EAA: MPID_nem_ofi_post_init (ofi_init.c:1736) ==236074== by 0x1B7B3575: MPID_nem_init_post (mpid_nem_init.c:1421) ==236074== by 0x1B5806E3: MPIDI_CH3_Init (ch3_init.c:146) ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) ==236074== by 0x40219D: main (one.c:335) ==236074== ==236074== Conditional jump or move depends on uninitialised value(s) ==236074== at 0x1B1DA383: __I_MPI___intel_sse2_strncmp (in /opt/intel/compilers_and_libraries_2018.2.199/linux/mpi/intel64/lib/libmpifort.so.12.0) ==236074== by 0x1B8CFBA1: ??? (simple_pmi.c:2376) ==236074== by 0x1B8CBDAD: PMIi_InitIfSingleton (simple_pmi.c:2883) ==236074== by 0x1B8CBDAD: iPMI_KVS_Get (simple_pmi.c:751) ==236074== by 0x1B7CCC1E: ??? (mpidi_pg.c:949) ==236074== by 0x1B817EAA: MPID_nem_ofi_post_init (ofi_init.c:1736) ==236074== by 0x1B7B3575: MPID_nem_init_post (mpid_nem_init.c:1421) ==236074== by 0x1B5806E3: MPIDI_CH3_Init (ch3_init.c:146) ==236074== by 0x1B79F02D: MPID_Init (mpid_init.c:1857) ==236074== by 0x1B73FAEA: MPIR_Init_thread (initthread.c:717) ==236074== by 0x1B73D795: PMPI_Init_thread (initthread.c:1061) ==236074== by 0x5264A94: PetscInitialize (pinit.c:907) ==236074== by 0x40219D: main (one.c:335) ==236074== ==236074== Conditional jump or move depends on uninitialised value(s) ==236074== at 0x1E48032E: __intel_sse4_strcpy (in /opt/intel/compilers_and_libraries_2018.2.199/linux/compiler/lib/intel64_lin/libirc.so) ==236074== by 0x51FD8BE: PetscStrcpy (str.c:354) ==236074== by 0x51FD7A3: PetscStrallocpy (str.c:188) ==236074== by 0x52A39CE: PetscEventRegLogRegister (eventlog.c:313) ==236074== by 0x527D89A: PetscLogEventRegister (plog.c:693) ==236074== by 0x6A56A20: PCBDDCInitializePackage (bddc.c:3115) ==236074== by 0x6E1A515: PCInitializePackage (dlregisksp.c:59) ==236074== by 
0x6DB1A86: PCCreate (precon.c:382) ==236074== by 0x6E05167: KSPGetPC (itfunc.c:1837) ==236074== by 0x6E0FC5C: KSPSetDM (iterativ.c:1150) ==236074== by 0x6FDD27B: SNESSetDM (snes.c:5402) ==236074== by 0x70B85F7: TSGetSNES (ts.c:2914) ==236074== by 0x70BE430: TSSetDM (ts.c:4949) ==236074== by 0x402496: main (one.c:378) ==236074== ==236074== Conditional jump or move depends on uninitialised value(s) ==236074== at 0x1E4782BA: __intel_ssse3_strncpy (in /opt/intel/compilers_and_libraries_2018.2.199/linux/compiler/lib/intel64_lin/libirc.so) ==236074== by 0x51FFD24: PetscStrncpy (str.c:392) ==236074== by 0x51FEB03: PetscStrreplace (str.c:1142) ==236074== by 0x52C9958: PetscViewerFileSetName (filev.c:659) ==236074== by 0x52B743B: PetscViewerVTKOpen (vtkv.c:279) ==236074== by 0x70C76E6: TSMonitorSolutionVTK (ts.c:5580) ==236074== by 0x40313C: FormFunction (one.c:120) ==236074== by 0x7066531: TSComputeIFunction_DMDA (dmdats.c:82) ==236074== by 0x70BA5EF: TSComputeIFunction (ts.c:857) ==236074== by 0x711E2DC: SNESTSFormFunction_BDF (bdf.c:368) ==236074== by 0x70C6E46: SNESTSFormFunction (ts.c:5014) ==236074== by 0x6FDC8A6: SNESComputeFunction (snes.c:2383) ==236074== by 0x7023556: SNESSolve_NEWTONTR (tr.c:297) ==236074== by 0x6FD160F: SNESSolve (snes.c:4569) ==236074== by 0x711917E: PetscMemcpy (bdf.c:223) ==236074== by 0x711917E: PetscCitationsRegister (petscsys.h:2689) ==236074== by 0x711917E: TSStep_BDF.A (bdf.c:265) ==236074== by 0x70C363A: TSStep (ts.c:3757) ==236074== by 0x70C1999: TSSolve (ts.c:4154) ==236074== by 0x402594: main (one.c:391) ==236074== ==236074== Conditional jump or move depends on uninitialised value(s) ==236074== at 0x1E4782BA: __intel_ssse3_strncpy (in /opt/intel/compilers_and_libraries_2018.2.199/linux/compiler/lib/intel64_lin/libirc.so) ==236074== by 0x51FFD24: PetscStrncpy (str.c:392) ==236074== by 0x51FEB03: PetscStrreplace (str.c:1142) ==236074== by 0x5224E4B: PetscFOpen (mpiuopen.c:52) ==236074== by 0x63A074B: DMDAVTKWriteAll_VTS.A (grvtk.c:72) 
==236074== by 0x639A589: DMDAVTKWriteAll (grvtk.c:545) ==236074== by 0x52B66F3: PetscViewerFlush_VTK (vtkv.c:100) ==236074== by 0x52CFAAE: PetscViewerFlush (flush.c:26) ==236074== by 0x52CEA95: PetscViewerDestroy (view.c:113) ==236074== by 0x70C7717: TSMonitorSolutionVTK (ts.c:5582) ==236074== by 0x40313C: FormFunction (one.c:120) ==236074== by 0x7066531: TSComputeIFunction_DMDA (dmdats.c:82) ==236074== by 0x70BA5EF: TSComputeIFunction (ts.c:857) ==236074== by 0x711E2DC: SNESTSFormFunction_BDF (bdf.c:368) ==236074== by 0x70C6E46: SNESTSFormFunction (ts.c:5014) ==236074== by 0x6FDC8A6: SNESComputeFunction (snes.c:2383) ==236074== by 0x7023556: SNESSolve_NEWTONTR (tr.c:297) ==236074== by 0x6FD160F: SNESSolve (snes.c:4569) ==236074== by 0x711917E: PetscMemcpy (bdf.c:223) ==236074== by 0x711917E: PetscCitationsRegister (petscsys.h:2689) ==236074== by 0x711917E: TSStep_BDF.A (bdf.c:265) ==236074== by 0x70C363A: TSStep (ts.c:3757) ==236074== ==236074== Conditional jump or move depends on uninitialised value(s) ==236074== at 0x5F10977: MatFDColoringSetUpBlocked_AIJ_Private (fdaij.c:146) ==236074== by 0x5F10977: MatFDColoringSetUp_SeqXAIJ.A (fdaij.c:284) ==236074== by 0x585892A: MatFDColoringSetUp (fdmatrix.c:242) ==236074== by 0x6FE5037: SNESComputeJacobianDefaultColor (snesj2.c:79) ==236074== by 0x6FC8E4E: SNESComputeJacobian (snes.c:2717) ==236074== by 0x70236E7: SNESSolve_NEWTONTR (tr.c:324) ==236074== by 0x6FD160F: SNESSolve (snes.c:4569) ==236074== by 0x711917E: PetscMemcpy (bdf.c:223) ==236074== by 0x711917E: PetscCitationsRegister (petscsys.h:2689) ==236074== by 0x711917E: TSStep_BDF.A (bdf.c:265) ==236074== by 0x70C363A: TSStep (ts.c:3757) ==236074== by 0x70C1999: TSSolve (ts.c:4154) ==236074== by 0x402594: main (one.c:391) ==236074== ==236074== Invalid write of size 4 ==236074== at 0x5F10983: MatFDColoringSetUpBlocked_AIJ_Private (fdaij.c:150) ==236074== by 0x5F10983: MatFDColoringSetUp_SeqXAIJ.A (fdaij.c:284) ==236074== by 0x585892A: MatFDColoringSetUp 
(fdmatrix.c:242) ==236074== by 0x6FE5037: SNESComputeJacobianDefaultColor (snesj2.c:79) ==236074== by 0x6FC8E4E: SNESComputeJacobian (snes.c:2717) ==236074== by 0x70236E7: SNESSolve_NEWTONTR (tr.c:324) ==236074== by 0x6FD160F: SNESSolve (snes.c:4569) ==236074== by 0x711917E: PetscMemcpy (bdf.c:223) ==236074== by 0x711917E: PetscCitationsRegister (petscsys.h:2689) ==236074== by 0x711917E: TSStep_BDF.A (bdf.c:265) ==236074== by 0x70C363A: TSStep (ts.c:3757) ==236074== by 0x70C1999: TSSolve (ts.c:4154) ==236074== by 0x402594: main (one.c:391) ==236074== Address 0x3a94fa80 is 0 bytes after a block of size 73,960,000 alloc'd ==236074== at 0x4C2C480: memalign (vg_replace_malloc.c:909) ==236074== by 0x522FFE2: PetscMallocAlign (mal.c:52) ==236074== by 0x52305F9: PetscMallocA (mal.c:418) ==236074== by 0x5F10778: MatFDColoringSetUpBlocked_AIJ_Private (fdaij.c:125) ==236074== by 0x5F10778: MatFDColoringSetUp_SeqXAIJ.A (fdaij.c:284) ==236074== by 0x585892A: MatFDColoringSetUp (fdmatrix.c:242) ==236074== by 0x6FE5037: SNESComputeJacobianDefaultColor (snesj2.c:79) ==236074== by 0x6FC8E4E: SNESComputeJacobian (snes.c:2717) ==236074== by 0x70236E7: SNESSolve_NEWTONTR (tr.c:324) ==236074== by 0x6FD160F: SNESSolve (snes.c:4569) ==236074== by 0x711917E: PetscMemcpy (bdf.c:223) ==236074== by 0x711917E: PetscCitationsRegister (petscsys.h:2689) ==236074== by 0x711917E: TSStep_BDF.A (bdf.c:265) ==236074== by 0x70C363A: TSStep (ts.c:3757) ==236074== by 0x70C1999: TSSolve (ts.c:4154) ==236074== by 0x402594: main (one.c:391) ==236074== ==236074== Invalid write of size 8 ==236074== at 0x5F10991: MatFDColoringSetUpBlocked_AIJ_Private (fdaij.c:151) ==236074== by 0x5F10991: MatFDColoringSetUp_SeqXAIJ.A (fdaij.c:284) ==236074== by 0x585892A: MatFDColoringSetUp (fdmatrix.c:242) ==236074== by 0x6FE5037: SNESComputeJacobianDefaultColor (snesj2.c:79) ==236074== by 0x6FC8E4E: SNESComputeJacobian (snes.c:2717) ==236074== by 0x70236E7: SNESSolve_NEWTONTR (tr.c:324) ==236074== by 0x6FD160F: SNESSolve 
(snes.c:4569) ==236074== by 0x711917E: PetscMemcpy (bdf.c:223) ==236074== by 0x711917E: PetscCitationsRegister (petscsys.h:2689) ==236074== by 0x711917E: TSStep_BDF.A (bdf.c:265) ==236074== by 0x70C363A: TSStep (ts.c:3757) ==236074== by 0x70C1999: TSSolve (ts.c:4154) ==236074== by 0x402594: main (one.c:391) ==236074== Address 0x3a94fa88 is 8 bytes after a block of size 73,960,000 alloc'd ==236074== at 0x4C2C480: memalign (vg_replace_malloc.c:909) ==236074== by 0x522FFE2: PetscMallocAlign (mal.c:52) ==236074== by 0x52305F9: PetscMallocA (mal.c:418) ==236074== by 0x5F10778: MatFDColoringSetUpBlocked_AIJ_Private (fdaij.c:125) ==236074== by 0x5F10778: MatFDColoringSetUp_SeqXAIJ.A (fdaij.c:284) ==236074== by 0x585892A: MatFDColoringSetUp (fdmatrix.c:242) ==236074== by 0x6FE5037: SNESComputeJacobianDefaultColor (snesj2.c:79) ==236074== by 0x6FC8E4E: SNESComputeJacobian (snes.c:2717) ==236074== by 0x70236E7: SNESSolve_NEWTONTR (tr.c:324) ==236074== by 0x6FD160F: SNESSolve (snes.c:4569) ==236074== by 0x711917E: PetscMemcpy (bdf.c:223) ==236074== by 0x711917E: PetscCitationsRegister (petscsys.h:2689) ==236074== by 0x711917E: TSStep_BDF.A (bdf.c:265) ==236074== by 0x70C363A: TSStep (ts.c:3757) ==236074== by 0x70C1999: TSSolve (ts.c:4154) ==236074== by 0x402594: main (one.c:391) ==236074== Sent from Mail for Windows -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From knepley at gmail.com Thu Nov 3 19:19:18 2022 From: knepley at gmail.com (Matthew Knepley) Date: Thu, 3 Nov 2022 20:19:18 -0400 Subject: [petsc-users] Local columns of A10 do not equal local rows of A00 In-Reply-To: References: Message-ID: On Thu, Nov 3, 2022 at 7:52 PM Alexander Lindsay wrote: > I have errors on quite a few (but not all) processes like the following > > [1]PETSC ERROR: --------------------- Error Message > -------------------------------------------------------------- > [1]PETSC ERROR: Nonconforming object sizes > [1]PETSC ERROR: Local columns of A10 4137 do not equal local rows of A00 > 4129 > > when performing field splits. We (MOOSE) have some code for identifying > the index sets for each split. However, the code was written by some > authors who are no longer with us. Normally I would chase this down in a > debugger, but this error only seems to crop up for pretty complex and large > meshes. If anyone has an idea of what we might be doing wrong, that might > help me chase this down faster. I guess intuitively I'm pretty perplexed > that we could get ourselves into this pickle, as it almost appears that we > have two different local dof index counts for a given block (0 in this > case). More background, if helpful, can be found in > https://github.com/idaholab/moose/issues/22359 as well as > https://github.com/idaholab/moose/discussions/22468. > How are you specifying the blocks? I would not have thought this was possible. Thanks, Matt > I should note that we are currently running with 3.16.6 as our PETSc > submodule hash (we are talking about updating to 3.18 soon). > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From knepley at gmail.com Thu Nov 3 19:36:12 2022 From: knepley at gmail.com (Matthew Knepley) Date: Thu, 3 Nov 2022 20:36:12 -0400 Subject: [petsc-users] locate DMSwarm particles with respect to a background DMDA mesh In-Reply-To: References: Message-ID: On Thu, Oct 27, 2022 at 11:57 AM Semplice Matteo < matteo.semplice at uninsubria.it> wrote: > Dear Petsc developers, > I am trying to use a DMSwarm to locate a cloud of points with respect > to a background mesh. In the real application the points will be loaded > from disk, but I have created a small demo in which > > - each processor creates Npart particles, all within the domain > covered by the mesh, but not all in the local portion of the mesh > - migrate the particles > > After migration most particles are no longer in the DMSwarm (how many > and which ones seem to depend on the number of cpus, but it never happens > that all particles survive the migration process). > > I am clearly missing some step, since I'd expect that a DMDA would be able > to locate particles without the need to go through a DMShell as it is done > in src/dm/tutorials/swarm_ex3.c.html > > > I attach my demo code. > > Could someone give me a hint? > Thanks for sending this. I found the problem. Someone has some overly fancy code inside DMDA to figure out the local bounding box from the coordinates. It is broken for DM_BOUNDARY_GHOSTED, but we never tested with this. I will fix it. Thanks, Matt > Best > Matteo > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From knepley at gmail.com Thu Nov 3 20:43:28 2022 From: knepley at gmail.com (Matthew Knepley) Date: Thu, 3 Nov 2022 21:43:28 -0400 Subject: [petsc-users] locate DMSwarm particles with respect to a background DMDA mesh In-Reply-To: References: Message-ID: On Thu, Nov 3, 2022 at 8:36 PM Matthew Knepley wrote: > On Thu, Oct 27, 2022 at 11:57 AM Semplice Matteo < > matteo.semplice at uninsubria.it> wrote: > >> Dear Petsc developers, >> I am trying to use a DMSwarm to locate a cloud of points with respect >> to a background mesh. In the real application the points will be loaded >> from disk, but I have created a small demo in which >> >> - each processor creates Npart particles, all within the domain >> covered by the mesh, but not all in the local portion of the mesh >> - migrate the particles >> >> After migration most particles are no longer in the DMSwarm (how many >> and which ones seem to depend on the number of cpus, but it never happens >> that all particles survive the migration process). >> >> I am clearly missing some step, since I'd expect that a DMDA would be >> able to locate particles without the need to go through a DMShell as it is >> done in src/dm/tutorials/swarm_ex3.c.html >> >> >> I attach my demo code. >> >> Could someone give me a hint? >> > > Thanks for sending this. I found the problem. Someone has some overly > fancy code inside DMDA to figure out the local bounding box from the > coordinates. > It is broken for DM_BOUNDARY_GHOSTED, but we never tested with this. I > will fix it. > Okay, I think this fix is correct https://gitlab.com/petsc/petsc/-/merge_requests/5802 I incorporated your test as src/dm/impls/da/tests/ex1.c. Can you take a look and see if this fixes your issue? Thanks, Matt > Thanks, > > Matt > > >> Best >> Matteo >> >> > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. 
> -- Norbert Wiener > > https://www.cse.buffalo.edu/~knepley/ > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From bsmith at petsc.dev Thu Nov 3 21:10:20 2022 From: bsmith at petsc.dev (Barry Smith) Date: Thu, 3 Nov 2022 22:10:20 -0400 Subject: [petsc-users] [petsc-maint] Issues linking petsc header files and lib from FORTRAN codes In-Reply-To: References: Message-ID: Please send your attempted makefile and we'll see if we can get it working. I am not sure if we can organize the include files as Fortran compiler include files easily. We've always used the preprocessor approach. The Intel compiler docs indicate the procedure for finding the Fortran compiler include files https://www.intel.com/content/www/us/en/develop/documentation/fortran-compiler-oneapi-dev-guide-and-reference/top/program-structure/use-include-files.html is the same as for the preprocessor include files, so I don't understand how using the Fortran compiler include file approach would make the makefiles any simpler for users. Barry > On Nov 3, 2022, at 8:58 PM, Jianbo Long wrote: > > Hello, > > I'm struggling to make my FORTRAN code work with petsc as I cannot link the required header files (e.g., petscksp.h) and compiled library files to my FORTRAN code. > > Compiling petsc was not a problem. However, even with the fortran examples (see those on https://petsc.org/main/docs/manual/fortran/) and the guide on using petsc in c++ and fortran codes (see Section "Writing C/C++ or Fortran Applications" at https://petsc.org/main/docs/manual/getting_started/), I still cannot make my FORTRAN code work. > > The Fortran test code is exactly the example code ex83f.F90 (see attached files). 
After following the 2nd method in the Guide (see the picture below), I still get errors: > > petsc/finclude/petscksp.h: No such file or directory > > Even if I set up the path of the header file correctly in my own makefile without using environment variables, I still can only find the file "petscksp.h" for my code. Of course, the trouble is that all other header files required by KSP are recursively included in this petscksp.h file, and I have no way to link them together for my Fortran code. > > So, here are my questions: > 1) in the Guide, how exactly are we supposed to set up the environment variables PETSC_DIR and PETSC_ARCH ? More details and examples would be extremely helpful ! > 2) Is there a way to get rid of the preprocessor statement > #include <petsc/finclude/petscksp.h> > when using c++/Fortran codes ? > > For example, when using MUMPS package in a Fortran code, we can simply use compiler 'include', rather than a preprocessor, to extract all required variables for the user's codes : > INCLUDE 'zmumps_struc.h' > where the header file zmumps_struc.h is already provided in the package. Similarly, I think it would be much more portable and easier to use petsc in other codes if this approach could be made to work. > > (Note: similar issues were discussed before, see https://lists.mcs.anl.gov/mailman/htdig/petsc-users/2019-January/037499.html. Unfortunately, I have no clue about the solution archived there ...) > > Any thoughts and solutions would be much appreciated ! > > Thanks, > Jianbo Long > > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From longtuteng249 at gmail.com Thu Nov 3 19:58:12 2022 From: longtuteng249 at gmail.com (Jianbo Long) Date: Fri, 4 Nov 2022 01:58:12 +0100 Subject: [petsc-users] Issues linking petsc header files and lib from FORTRAN codes Message-ID: Hello, I'm struggling to make my FORTRAN code work with petsc as I cannot link the required header files (e.g., petscksp.h) and compiled library files to my FORTRAN code. 
Compiling petsc was not a problem. However, even with the fortran examples (see those on https://petsc.org/main/docs/manual/fortran/) and the guide on using petsc in c++ and fortran codes (see Section "Writing C/C++ or Fortran Applications" at https://petsc.org/main/docs/manual/getting_started/), I still cannot make my FORTRAN code work. The Fortran test code is exactly the example code ex83f.F90 (see attached files). After following the 2nd method in the Guide (see the picture below), I still get errors: petsc/finclude/petscksp.h: No such file or directory Even if I set up the path of the header file correctly in my own makefile without using environment variables, I still can only find the file "petscksp.h" for my code. Of course, the trouble is that all other header files required by KSP are recursively included in this petscksp.h file, and I have no way to link them together for my Fortran code. So, here are my questions: 1) in the Guide, how exactly are we supposed to set up the environment variables PETSC_DIR and PETSC_ARCH ? More details and examples would be extremely helpful ! 2) Is there a way to get rid of the *preprocessor* statement #include <petsc/finclude/petscksp.h> when using c++/Fortran codes ? For example, when using MUMPS package in a Fortran code, we can simply use *compiler* 'include', rather than a preprocessor, to extract all required variables for the user's codes : INCLUDE 'zmumps_struc.h' where the header file zmumps_struc.h is already provided in the package. Similarly, I think it would be much more portable and easier to use petsc in other codes if this approach could be made to work. (Note: similar issues were discussed before, see https://lists.mcs.anl.gov/mailman/htdig/petsc-users/2019-January/037499.html. Unfortunately, I have no clue about the solution archived there ...) Any thoughts and solutions would be much appreciated ! Thanks, Jianbo Long [image: image.png] -------------- next part -------------- An HTML attachment was scrubbed... 
URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: image.png Type: image/png Size: 277283 bytes Desc: not available URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: ex83f.F90 Type: text/x-fortran Size: 4251 bytes Desc: not available URL: From balay at mcs.anl.gov Fri Nov 4 00:07:58 2022 From: balay at mcs.anl.gov (Satish Balay) Date: Fri, 4 Nov 2022 00:07:58 -0500 (CDT) Subject: [petsc-users] [petsc-maint] Issues linking petsc header files and lib from FORTRAN codes In-Reply-To: References: Message-ID: <38802c8b-fca9-a502-57d1-7d52062662ea@mcs.anl.gov> For ex83f.F90:

>>>>>
balay at p1 /home/balay/test $ ls ex83f.F90
balay at p1 /home/balay/test $ ls ex83f.F90
balay at p1 /home/balay/test $ export PETSC_DIR=$HOME/petsc
balay at p1 /home/balay/test $ cp $PETSC_DIR/src/ksp/ksp/tests/makefile .
balay at p1 /home/balay/test $ make ex83f
mpif90 -fPIC -Wall -ffree-line-length-none -ffree-line-length-0 -Wno-lto-type-mismatch -Wno-unused-dummy-argument -g -O0 -I/home/balay/petsc/include -I/home/balay/petsc/arch-linux-c-debug/include ex83f.F90 -Wl,-rpath,/home/balay/petsc/arch-linux-c-debug/lib -L/home/balay/petsc/arch-linux-c-debug/lib -Wl,-rpath,/home/balay/soft/mpich-4.0.1/lib -L/home/balay/soft/mpich-4.0.1/lib -Wl,-rpath,/usr/lib/gcc/x86_64-redhat-linux/12 -L/usr/lib/gcc/x86_64-redhat-linux/12 -lpetsc -llapack -lblas -lm -lX11 -lstdc++ -ldl -lmpifort -lmpi -lgfortran -lm -lgfortran -lm -lgcc_s -lquadmath -lstdc++ -ldl -o ex83f
balay at p1 /home/balay/test $
<<<<<<

Also when you are adding PETSc to your current project - are you using source files with .f or .f90 suffix? If so rename them to .F or .F90 suffix. If you still have issues send more details - As Barry indicated - the makefile [with the sources compiled by this makefile] - and the compile log when you attempt to build these sources with this makefile. 
Satish On Thu, 3 Nov 2022, Barry Smith wrote: > > Please send your attempted makefile and we'll see if we can get it working. > > I am not sure if we can organize the include files as Fortran compiler include files easily. We've always used the preprocessor approach. The Intel compiler docs indicate the procedure for finding the Fortran compiler include files https://www.intel.com/content/www/us/en/develop/documentation/fortran-compiler-oneapi-dev-guide-and-reference/top/program-structure/use-include-files.html is the same as for the preprocessor include files, so I don't understand how using the Fortran compiler include file approach would make the makefiles any simpler for users. > > > Barry > > > > On Nov 3, 2022, at 8:58 PM, Jianbo Long wrote: > > > > Hello, > > > > I'm struggling to make my FORTRAN code work with petsc as I cannot link the required header files (e.g., petscksp.h) and compiled library files to my FORTRAN code. > > > > Compiling petsc was not a problem. However, even with the fortran examples (see those on https://petsc.org/main/docs/manual/fortran/) and the guide on using petsc in c++ and fortran codes (see Section "Writing C/C++ or Fortran Applications" at https://petsc.org/main/docs/manual/getting_started/), I still cannot make my FORTRAN code work. > > > > The Fortran test code is exactly the example code ex83f.F90 (see attached files). After following the 2nd method in the Guide (see the picture below), I still get errors: > > > > petsc/finclude/petscksp.h: No such file or directory > > > > Even if I set up the path of the header file correctly in my own makefile without using environment variables, I still can only find the file "petscksp.h" for my code. Of course, the trouble is that all other header files required by KSP are recursively included in this petscksp.h file, and I have no way to link them together for my Fortran code. 
> > > > So, here are my questions: > > 1) in the Guide, how exactly are we supposed to set up the environment variables PETSC_DIR and PETSC_ARCH ? More details and examples would be extremely helpful ! > > 2) Is there a way to get rid of the preprocessor statement > > #include > > when using c++/Fortran codes ? > > > > For example, when using the MUMPS package in a Fortran code, we can simply use a compiler 'include', rather than a preprocessor, to extract all required variables for the user's codes : > > INCLUDE 'zmumps_struc.h' > > where the header file zmumps_struc.h is already provided in the package. Similarly, I think it would be much more portable and easier to use petsc in other codes if we could make this approach work. > > > > (Note: similar issues were discussed before, see https://lists.mcs.anl.gov/mailman/htdig/petsc-users/2019-January/037499.html. Unfortunately, I have no clue about the solution archived there ...) > > > > Any thoughts and solutions would be much appreciated ! > > > > Thanks, > > Jianbo Long > > > > > > > > From edoardo.alinovi at gmail.com Fri Nov 4 02:31:12 2022 From: edoardo.alinovi at gmail.com (Edoardo alinovi) Date: Fri, 4 Nov 2022 08:31:12 +0100 Subject: [petsc-users] On the usage of MatSetValuesBlocked In-Reply-To: References: <166EF709-8936-422D-A9BE-58D0072E718B@petsc.dev> <0D3C529A-3B69-4F2D-8CE9-ED87BB482F2F@petsc.dev> Message-ID: Barry, Thanks, yes, I should pass 1 and not 3.... For some reason I had misunderstood the wording in the documentation and interpreted m and n as the number of rows and columns of the block to insert, while I need to think of everything as divided by bs... Now I am with you!!! Many thanks and sorry to have been so dumb! -------------- next part -------------- An HTML attachment was scrubbed... 
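The block/scalar bookkeeping Edoardo tripped over is worth spelling out: with block size bs, the blocked routines count rows, columns, and indices in blocks, so one bs-by-bs block is m = n = 1, and block index b covers scalar indices b*bs through b*bs+bs-1. A sketch of that arithmetic (plain Python, not PETSc code; the bs = 3 "u, v, p per cell" layout is just an example):

```python
def block_to_scalar_indices(block_index, bs):
    """Scalar (unblocked) indices covered by one block index."""
    return [block_index * bs + k for k in range(bs)]

bs = 3  # e.g. one u, v, p unknown per cell

# One bs x bs block at block row 4 means m = n = 1 in the blocked
# calls, but the same insertion via the unblocked MatSetValues/
# VecSetValues would need all of these scalar rows spelled out:
print(block_to_scalar_indices(4, bs))  # [12, 13, 14]
```

This is also why the unblocked Vec/Mat routines remain usable everywhere: they just need the expanded index list instead of the block index.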
URL: From edoardo.alinovi at gmail.com Fri Nov 4 03:32:13 2022 From: edoardo.alinovi at gmail.com (Edoardo alinovi) Date: Fri, 4 Nov 2022 09:32:13 +0100 Subject: [petsc-users] On the usage of MatSetValuesBlocked In-Reply-To: References: <166EF709-8936-422D-A9BE-58D0072E718B@petsc.dev> <0D3C529A-3B69-4F2D-8CE9-ED87BB482F2F@petsc.dev> Message-ID: It is working like a charm now! Is it mandatory to use VecSetValuesBlocked to assemble the rhs? Does the Vec need to be of any other type than VECMPI? I am assembling it like this: brhs(1:3-bdim) = this%Ueqn%bC(iElement,1:3-bdim) brhs(4-bdim) = this%Peqn%bC(iElement,1) call VecSetValuesBlocked(this%rhs, 1, mesh%cellGlobalAddr(iElement)-1, brhs, INSERT_VALUES, ierr) But I am getting into troubles: [0]PETSC ERROR: --------------------- Error Message -------------------------------------------------------------- [0]PETSC ERROR: *PetscSegBufferAlloc_Private* [0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting. [0]PETSC ERROR: Petsc Release Version 3.18.0, Sep 30, 2022 [0]PETSC ERROR: flubio_coupled on a gnu named alienware by edo Fri Nov 4 09:31:03 2022 [0]PETSC ERROR: Configure options PETSC_ARCH=gnu FOPTFLAGS=-O3 COPTFLAGS=-O3 CXXOPTFLAGS=-O3 -with-debugging=no -download-fblaslapack=1 -download-superlu_dist -download-mumps -download-hypre -download-metis -download-parmetis -download-scalapack -download-ml -download-slepc -download-hpddm -download-cmake -with-mpi-dir=/home/edo/software/openmpi-4.1.1/build/ [0]PETSC ERROR: #1 PetscMallocAlign() at /home/edo/software/petsc-3.18.0/src/sys/memory/mal.c:55 [0]PETSC ERROR: #2 PetscSegBufferAlloc_Private() at /home/edo/software/petsc-3.18.0/src/sys/utils/segbuffer.c:31 [0]PETSC ERROR: #3 PetscSegBufferGet() at /home/edo/software/petsc-3.18.0/src/sys/utils/segbuffer.c:94 [1]PETSC ERROR: --------------------- Error Message -------------------------------------------------------------- [1]PETSC ERROR: General MPI error [1]PETSC ERROR: MPI error 1 MPI_ERR_BUFFER: 
invalid buffer pointer [1]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting. [1]PETSC ERROR: Petsc Release Version 3.18.0, Sep 30, 2022 [1]PETSC ERROR: flubio_coupled on a gnu named alienware by edo Fri Nov 4 09:31:03 2022 [1]PETSC ERROR: Configure options PETSC_ARCH=gnu FOPTFLAGS=-O3 COPTFLAGS=-O3 CXXOPTFLAGS=-O3 -with-debugging=no -download-fblaslapack=1 -download-superlu_dist -download-mumps -download-hypre -download-metis -download-parmetis -download-scalapack -download-ml -download-slepc -download-hpddm -download-cmake -with-mpi-dir=/home/edo/software/openmpi-4.1.1/build/ [1]PETSC ERROR: #1 VecAssemblySend_MPI_Private() at /home/edo/software/petsc-3.18.0/src/vec/vec/impls/mpi/pbvec.c:133 [1]PETSC ERROR: #2 PetscCommBuildTwoSidedFReq_Reference() at /home/edo/software/petsc-3.18.0/src/sys/utils/mpits.c:314 [1]PETSC ERROR: #3 PetscCommBuildTwoSidedFReq() at /home/edo/software/petsc-3.18.0/src/sys/utils/mpits.c:526 [1]PETSC ERROR: [0]PETSC ERROR: #4 VecAssemblyRecv_MPI_Private() at /home/edo/software/petsc-3.18.0/src/vec/vec/impls/mpi/pbvec.c:164 [0]PETSC ERROR: #5 PetscCommBuildTwoSidedFReq_Reference() at /home/edo/software/petsc-3.18.0/src/sys/utils/mpits.c:320 [0]PETSC ERROR: #6 PetscCommBuildTwoSidedFReq() at /home/edo/software/petsc-3.18.0/src/sys/utils/mpits.c:526 [0]PETSC ERROR: #7 VecAssemblyBegin_MPI_BTS() at /home/edo/software/petsc-3.18.0/src/vec/vec/impls/mpi/pbvec.c:238 #4 VecAssemblyBegin_MPI_BTS() at /home/edo/software/petsc-3.18.0/src/vec/vec/impls/mpi/pbvec.c:238 [1]PETSC ERROR: #5 VecAssemblyBegin() at /home/edo/software/petsc-3.18.0/src/vec/vec/interface/vector.c:124 [1]PETSC ERROR: #6 VecAssemblyEnd_MPI_BTS() at /home/edo/software/petsc-3.18.0/src/vec/vec/impls/mpi/pbvec.c:337 [1]PETSC ERROR: #7 VecAssemblyEnd() at /home/edo/software/petsc-3.18.0/src/vec/vec/interface/vector.c:158 [1]PETSC ERROR: #8 VecView() at /home/edo/software/petsc-3.18.0/src/vec/vec/interface/vector.c:719 [0]PETSC ERROR: #8 VecAssemblyBegin() at 
/home/edo/software/petsc-3.18.0/src/vec/vec/interface/vector.c:124 Vec Object: 2 MPI processes -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Fri Nov 4 05:51:42 2022 From: knepley at gmail.com (Matthew Knepley) Date: Fri, 4 Nov 2022 06:51:42 -0400 Subject: [petsc-users] On the usage of MatSetValuesBlocked In-Reply-To: References: <166EF709-8936-422D-A9BE-58D0072E718B@petsc.dev> <0D3C529A-3B69-4F2D-8CE9-ED87BB482F2F@petsc.dev> Message-ID: On Fri, Nov 4, 2022 at 4:32 AM Edoardo alinovi wrote: > It is working like a charm now! > > Is it mandatory to use VecSetValuesBlocked to assemble the rhs? Does the > Vec need to be of any other type than VECMPI? > SetValuesBlocked() is never required. You can always use the normal versions, but you would have to supply all the indices, not just the block indices. Thanks, Matt > I am assembling it like this: > brhs(1:3-bdim) = this%Ueqn%bC(iElement,1:3-bdim) > brhs(4-bdim) = this%Peqn%bC(iElement,1) > call VecSetValuesBlocked(this%rhs, 1, > mesh%cellGlobalAddr(iElement)-1, brhs, INSERT_VALUES, ierr) > > But I am getting into troubles: > > [0]PETSC ERROR: --------------------- Error Message > -------------------------------------------------------------- > [0]PETSC ERROR: *PetscSegBufferAlloc_Private* > [0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting. 
> [0]PETSC ERROR: Petsc Release Version 3.18.0, Sep 30, 2022 > [0]PETSC ERROR: flubio_coupled on a gnu named alienware by edo Fri Nov 4 > 09:31:03 2022 > [0]PETSC ERROR: Configure options PETSC_ARCH=gnu FOPTFLAGS=-O3 > COPTFLAGS=-O3 CXXOPTFLAGS=-O3 -with-debugging=no -download-fblaslapack=1 > -download-superlu_dist -download-mumps -download-hypre -download-metis > -download-parmetis -download-scalapack -download-ml -download-slepc > -download-hpddm -download-cmake > -with-mpi-dir=/home/edo/software/openmpi-4.1.1/build/ > [0]PETSC ERROR: #1 PetscMallocAlign() at > /home/edo/software/petsc-3.18.0/src/sys/memory/mal.c:55 > [0]PETSC ERROR: #2 PetscSegBufferAlloc_Private() at > /home/edo/software/petsc-3.18.0/src/sys/utils/segbuffer.c:31 > [0]PETSC ERROR: #3 PetscSegBufferGet() at > /home/edo/software/petsc-3.18.0/src/sys/utils/segbuffer.c:94 > [1]PETSC ERROR: --------------------- Error Message > -------------------------------------------------------------- > [1]PETSC ERROR: General MPI error > [1]PETSC ERROR: MPI error 1 MPI_ERR_BUFFER: invalid buffer pointer > [1]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting. 
> [1]PETSC ERROR: Petsc Release Version 3.18.0, Sep 30, 2022 > [1]PETSC ERROR: flubio_coupled on a gnu named alienware by edo Fri Nov 4 > 09:31:03 2022 > [1]PETSC ERROR: Configure options PETSC_ARCH=gnu FOPTFLAGS=-O3 > COPTFLAGS=-O3 CXXOPTFLAGS=-O3 -with-debugging=no -download-fblaslapack=1 > -download-superlu_dist -download-mumps -download-hypre -download-metis > -download-parmetis -download-scalapack -download-ml -download-slepc > -download-hpddm -download-cmake > -with-mpi-dir=/home/edo/software/openmpi-4.1.1/build/ > [1]PETSC ERROR: #1 VecAssemblySend_MPI_Private() at > /home/edo/software/petsc-3.18.0/src/vec/vec/impls/mpi/pbvec.c:133 > [1]PETSC ERROR: #2 PetscCommBuildTwoSidedFReq_Reference() at > /home/edo/software/petsc-3.18.0/src/sys/utils/mpits.c:314 > [1]PETSC ERROR: #3 PetscCommBuildTwoSidedFReq() at > /home/edo/software/petsc-3.18.0/src/sys/utils/mpits.c:526 > [1]PETSC ERROR: [0]PETSC ERROR: #4 VecAssemblyRecv_MPI_Private() at > /home/edo/software/petsc-3.18.0/src/vec/vec/impls/mpi/pbvec.c:164 > [0]PETSC ERROR: #5 PetscCommBuildTwoSidedFReq_Reference() at > /home/edo/software/petsc-3.18.0/src/sys/utils/mpits.c:320 > [0]PETSC ERROR: #6 PetscCommBuildTwoSidedFReq() at > /home/edo/software/petsc-3.18.0/src/sys/utils/mpits.c:526 > [0]PETSC ERROR: #7 VecAssemblyBegin_MPI_BTS() at > /home/edo/software/petsc-3.18.0/src/vec/vec/impls/mpi/pbvec.c:238 > #4 VecAssemblyBegin_MPI_BTS() at > /home/edo/software/petsc-3.18.0/src/vec/vec/impls/mpi/pbvec.c:238 > [1]PETSC ERROR: #5 VecAssemblyBegin() at > /home/edo/software/petsc-3.18.0/src/vec/vec/interface/vector.c:124 > [1]PETSC ERROR: #6 VecAssemblyEnd_MPI_BTS() at > /home/edo/software/petsc-3.18.0/src/vec/vec/impls/mpi/pbvec.c:337 > [1]PETSC ERROR: #7 VecAssemblyEnd() at > /home/edo/software/petsc-3.18.0/src/vec/vec/interface/vector.c:158 > [1]PETSC ERROR: #8 VecView() at > /home/edo/software/petsc-3.18.0/src/vec/vec/interface/vector.c:719 > [0]PETSC ERROR: #8 VecAssemblyBegin() at > 
/home/edo/software/petsc-3.18.0/src/vec/vec/interface/vector.c:124 > Vec Object: 2 MPI processes > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From edoardo.alinovi at gmail.com Fri Nov 4 05:55:01 2022 From: edoardo.alinovi at gmail.com (Edoardo alinovi) Date: Fri, 4 Nov 2022 11:55:01 +0100 Subject: [petsc-users] On the usage of MatSetValuesBlocked In-Reply-To: References: <166EF709-8936-422D-A9BE-58D0072E718B@petsc.dev> <0D3C529A-3B69-4F2D-8CE9-ED87BB482F2F@petsc.dev> Message-ID: Thanks Matt, I have found out that setValuesBlocked will work if I do: call MatCreateVecs(A, x, y, ierr) call setValuesBlocked(x, nblocks, varray, ierr) However, there is no getValuesBlocked. Not the end of the world, it is handy to set and get stuff by block and not by single entry :) Cheers -------------- next part -------------- An HTML attachment was scrubbed... URL: From stephan.koehler at math.tu-freiberg.de Fri Nov 4 06:43:32 2022 From: stephan.koehler at math.tu-freiberg.de (=?UTF-8?Q?Stephan_K=c3=b6hler?=) Date: Fri, 4 Nov 2022 12:43:32 +0100 Subject: [petsc-users] Report Bug TaoALMM class In-Reply-To: <5E53FE56-5C68-4F06-8A48-54ACBDC800C7@petsc.dev> References: <4eec06f9-d534-7a02-9abe-6d1415f663f0@math.tu-freiberg.de> <14f2cdd6-9cbe-20a6-0c7d-3006b2ee4dc1@math.tu-freiberg.de> <5E53FE56-5C68-4F06-8A48-54ACBDC800C7@petsc.dev> Message-ID: <892a51c2-17f7-ac1f-f55d-05981978a4f4@math.tu-freiberg.de> Barry, this is non-artificial code. This is a problem in the ALMM subsolver. I want to solve a problem with a TaoALMM solver; what then happens is:

TaoSolve(tao)    /* TaoALMM solver */
   |
   |
   |-------->  This calls the TaoALMM subsolver routine
                 TaoSolve(subsolver)
                       |
                       |
                       |----------->  The subsolver does not correctly work, at least with an Armijo line search, since the solution is overwritten within the line search.

In my case, the subsolver does not make any progress although it is possible. To get to my real problem you can simply change line 268 to if(0) (from if(1) -----> if(0)) and line 317 from // ierr = TaoSolve(tao); CHKERRQ(ierr); -------> ierr = TaoSolve(tao); CHKERRQ(ierr); What you can see is that the solver does not make any progress, but it should make progress. To be honest, I do not really know why the option -tao_almm_subsolver_tao_ls_monitor has no effect if the ALMM solver is called and not the subsolver. I also do not know why -tao_almm_subsolver_tao_view prints as termination reason for the subsolver Solution converged: ||g(X)|| <= gatol This is obviously not the case. I set the tolerances -tao_almm_subsolver_tao_gatol 1e-8 \ -tao_almm_subsolver_tao_grtol 1e-8 \ I encountered this and then I looked into the ALMM class, and therefore I tried to call the subsolver (previous example). I attach the updated program and also the options. Stephan On 03.11.22 22:15, Barry Smith wrote: > > Thanks for your response and the code. I understand the potential > problem and how your code demonstrates a bug if the > TaoALMMSubsolverObjective() is used in the manner you use in the > example where you directly call TaoComputeObjective() multiple times > like a line search code might. > > What I don't have or understand is how to reproduce the problem in a > real code that uses Tao. That is where the Tao Armijo line search code > has a problem when it is used (somehow) in a Tao solver with ALMM. You > suggest "If you have an example for your own, you can switch the > Armijo line search by the option -tao_ls_type armijo. The thing is > that it will cause no problems if the line search accepts the steps > with step length one." 
I don't see how to do this if I use -tao_type > almm I cannot use -tao_ls_type armijo; that is, the option -tao_ls_type > doesn't seem to me to be usable in the context of almm (since almm > internally directly does its own trust region approach for > globalization). If we remove the if (1) code from your example, are > there some Tao options I can use to get the bug to appear inside the > Tao solve? > > I'll try to explain again. I agree that the fact that the Tao solution > is aliased (within the ALMM solver) is a problem with repeated calls > to TaoComputeObjective(), but I cannot see how these repeated calls > could ever happen in the use of TaoSolve() with the ALMM solver. That > is, when is this "design problem" a true problem, as opposed to just a > potential problem that can be demonstrated in artificial code? > > The reason I need to understand the non-artificial situation in which it breaks > things is to come up with an appropriate correction for the current code. > > Barry > > > > > > > >> On Nov 3, 2022, at 12:46 PM, Stephan Köhler >> wrote: >> >> Barry, >> >> so far, I have not experimented with trust-region methods, but I can >> imagine that this "design feature" causes no problem for trust-region >> methods, if the old point is saved and after the trust-region check >> fails the old point is copied to the actual point. But the >> implementation of the Armijo line search method does not work that >> way. Here, the actual point will always be overwritten. Only if the >> line search fails, then the old point is restored, but then the >> TaoSolve method ends with a line search failure. >> >> If you have an example for your own, you can switch the Armijo line >> search by the option -tao_ls_type armijo. The thing is that it will >> cause no problems if the line search accepts the steps with step >> length one. 
>> It is also possible that, by luck, it will cause no problems, if the >> "excessive" step brings a reduction of the objective. >> >> Otherwise, I attach my example, which is not minimal, but here you >> can see that it causes problems. You need to set the paths to the >> PETSc library in the makefile. You find the options for this problem >> in the run_test_tao_neohooke.sh script. >> The important part begins at line 292 in test_tao_neohooke.cpp >> >> Stephan >> >> On 02.11.22 19:04, Barry Smith wrote: >>> Stephan, >>> >>> I have located the troublesome line in TaoSetUp_ALMM(): it has the line >>> >>> auglag->Px = tao->solution; >>> >>> and in almm.h it has >>> >>> Vec Px, LgradX, Ce, Ci, G; /* aliased vectors (do not destroy!) */ >>> >>> Now auglag->Px in some situations aliases auglag->P, and in some cases auglag->Px serves to hold a portion of auglag->P. So then in TaoALMMSubsolverObjective_Private() >>> the lines >>> >>> PetscCall(VecCopy(P, auglag->P)); >>> PetscCall((*auglag->sub_obj)(auglag->parent)); >>> >>> cause, just as you said, tao->solution to be overwritten by the P at which the objective function is being computed. In other words, the solution of the outer Tao is aliased with the solution of the inner Tao, by design. >>> >>> You are definitely correct, the use of TaoALMMSubsolverObjective_Private and TaoALMMSubsolverObjectiveAndGradient_Private in a line search would be problematic. >>> >>> I am not an expert at these methods or their implementations. Could you point to an actual use case within Tao that triggers the problem? Is there a set of command line options or code calls to Tao that fail due to this "design feature"? Within the standard use of ALMM I do not see how the objective function would be used within a line search. The TaoSolve_ALMM() code is self-correcting in that if a trust region check fails it automatically rolls back the solution. 
>>> >>> Barry >>> >>> >>> >>>> On Oct 28, 2022, at 4:27 AM, Stephan Köhler wrote: >>>> >>>> Dear PETSc/Tao team, >>>> >>>> it seems that there is a bug in the TaoALMM class: >>>> >>>> In the methods TaoALMMSubsolverObjective_Private and TaoALMMSubsolverObjectiveAndGradient_Private, the vector at which the function value for the augmented Lagrangian is evaluated >>>> is copied into the current solution, see, e.g., https://petsc.org/release/src/tao/constrained/impls/almm/almm.c.html line 672 or 682. This causes the subsolver routine to not converge if the line search for the subsolver rejects the step length 1. for some >>>> update. In detail: >>>> >>>> Suppose the current iterate is xk and the current update is dxk. The line search evaluates the augmented Lagrangian now at (xk + dxk). This causes the value (xk + dxk) to be copied into the current solution. If the point (xk + dxk) is rejected, the line search should >>>> try the point (xk + alpha * dxk), where alpha < 1. But due to the copying, what happens is that the point ((xk + dxk) + alpha * dxk) is evaluated, see, e.g., https://petsc.org/release/src/tao/linesearch/impls/armijo/armijo.c.html line 191. >>>> >>>> Best regards >>>> Stephan Köhler >>>> >>>> -- >>>> Stephan Köhler >>>> TU Bergakademie Freiberg >>>> Institut für numerische Mathematik und Optimierung >>>> >>>> Akademiestraße 6 >>>> 09599 Freiberg >>>> Gebäudeteil Mittelbau, Zimmer 2.07 >>>> >>>> Telefon: +49 (0)3731 39-3173 (Büro) >>>> >>>> >> >> -- >> Stephan Köhler >> TU Bergakademie Freiberg >> Institut für numerische Mathematik und Optimierung >> >> Akademiestraße 6 >> 09599 Freiberg >> Gebäudeteil Mittelbau, Zimmer 2.07 >> >> Telefon: +49 (0)3731 39-3173 (Büro) >> > -- Stephan Köhler TU Bergakademie Freiberg Institut für numerische Mathematik und Optimierung Akademiestraße 6 09599 Freiberg Gebäudeteil Mittelbau, Zimmer 2.07 Telefon: +49 (0)3731 39-3173 (Büro) -------------- next part -------------- An HTML attachment was scrubbed... 
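The overwrite Stephan describes can be reproduced in miniature outside PETSc. The following is a toy Python sketch, not Tao's actual code: `state["solution"]` stands in for the aliased tao->solution, the objective callback copies its argument into it the way the subsolver objective copies P, and a backtracking step that rebuilds its trial point from the solution vector then evaluates ((xk + dxk) + alpha*dxk) instead of the intended (xk + alpha*dxk):

```python
def make_objective(state):
    # Mimics the problematic pattern: evaluating the objective at p
    # first copies p into the (aliased) solution, then computes f(p).
    def objective(p):
        state["solution"] = p       # the aliasing copy
        return (p - 3.0) ** 2       # toy objective, minimum at x = 3
    return objective

def backtracking_trial(state, objective, dx, alpha):
    # A backtracking step that (incorrectly) rebuilds its trial point
    # from the solution vector, which the previous trial overwrote.
    objective(state["solution"] + dx)      # full step; assume it is rejected
    return state["solution"] + alpha * dx  # BUG: the base point has moved

state = {"solution": 0.0}   # current iterate xk = 0
obj = make_objective(state)
dx, alpha = 10.0, 0.5

trial = backtracking_trial(state, obj, dx, alpha)
print(trial)   # 15.0 = (xk + dx) + alpha*dx, not the intended xk + alpha*dx = 5.0
```

A fix in this model is for the line search to keep its own copy of the base point xk rather than re-reading the aliased solution, which mirrors the kind of correction Barry is discussing for the real code.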
URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: run_test_tao_neohooke.sh Type: application/x-shellscript Size: 712 bytes Desc: not available URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: test_tao_neohooke.cpp Type: text/x-c++src Size: 19042 bytes Desc: not available URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: OpenPGP_0xC9BF2C20DFE9F713.asc Type: application/pgp-keys Size: 758 bytes Desc: OpenPGP public key URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: OpenPGP_signature Type: application/pgp-signature Size: 236 bytes Desc: OpenPGP digital signature URL: From matteo.semplice at uninsubria.it Fri Nov 4 06:46:21 2022 From: matteo.semplice at uninsubria.it (Matteo Semplice) Date: Fri, 4 Nov 2022 12:46:21 +0100 Subject: [petsc-users] locate DMSwarm particles with respect to a background DMDA mesh In-Reply-To: References: Message-ID: <8a853d0e-b856-5dc2-5439-d25911d672e4@uninsubria.it> On 04/11/2022 02:43, Matthew Knepley wrote: > On Thu, Nov 3, 2022 at 8:36 PM Matthew Knepley wrote: > > On Thu, Oct 27, 2022 at 11:57 AM Semplice Matteo > wrote: > > Dear Petsc developers, > I am trying to use a DMSwarm to locate a cloud of points with > respect to a background mesh. In the real application the > points will be loaded from disk, but I have created a small > demo in which > > * each processor creates Npart particles, all within the > domain covered by the mesh, but not all in the local > portion of the mesh > * migrate the particles > > After migration most particles are not any more in the DMSwarm > (how many and which ones seems to depend on the number of > cpus, but it never happens that all particle survive the > migration process). 
> > I am clearly missing some step, since I'd expect that a DMDA > would be able to locate particles without the need to go > through a DMShell as it is done in > src/dm/tutorials/swarm_ex3.c.html > > > I attach my demo code. > > Could someone give me a hint? > > > Thanks for sending this. I found the problem. Someone has some > overly fancy code inside DMDA to figure out the local bounding box > from the coordinates. > It is broken for DM_BOUNDARY_GHOSTED, but we never tested with > this. I will fix it. > > > Okay, I think this fix is correct > > https://gitlab.com/petsc/petsc/-/merge_requests/5802 > > > I incorporated your test as src/dm/impls/da/tests/ex1.c. Can you take > a look and see if this fixes your issue? Yes, we have tested 2d and 3d, with various combinations of DM_BOUNDARY_* along different directions, and it works like a charm. On a side note, neither DMSwarmViewXDMF nor DMSwarmMigrate seem to be implemented for 1d: I get [0]PETSC ERROR: No support for this operation for this object type [0]PETSC ERROR: Support not provided for 1D However, currently I have no need for this feature. Finally, if the test is meant to stay in the source, you may remove the call to DMSwarmRegisterPetscDatatypeField as in the attached patch. Thanks a lot!! Matteo and Silvia > > Thanks, > > Matt > > Thanks, > > Matt > > Best > Matteo > > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to > which their experiments lead. > -- Norbert Wiener > > https://www.cse.buffalo.edu/~knepley/ > > > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which > their experiments lead. > -- Norbert Wiener > > https://www.cse.buffalo.edu/~knepley/ > -- Prof. Matteo Semplice Università degli Studi dell'Insubria Dipartimento di Scienza e Alta Tecnologia – DiSAT Professore Associato Via Valleggio, 11 – 
22100 Como (CO) ? Italia tel.: +39 031 2386316 -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: patch_on_dmdatestlocateparticles.patch Type: text/x-patch Size: 608 bytes Desc: not available URL: From knepley at gmail.com Fri Nov 4 06:48:17 2022 From: knepley at gmail.com (Matthew Knepley) Date: Fri, 4 Nov 2022 07:48:17 -0400 Subject: [petsc-users] locate DMSwarm particles with respect to a background DMDA mesh In-Reply-To: <8a853d0e-b856-5dc2-5439-d25911d672e4@uninsubria.it> References: <8a853d0e-b856-5dc2-5439-d25911d672e4@uninsubria.it> Message-ID: On Fri, Nov 4, 2022 at 7:46 AM Matteo Semplice < matteo.semplice at uninsubria.it> wrote: > On 04/11/2022 02:43, Matthew Knepley wrote: > > On Thu, Nov 3, 2022 at 8:36 PM Matthew Knepley wrote: > >> On Thu, Oct 27, 2022 at 11:57 AM Semplice Matteo < >> matteo.semplice at uninsubria.it> wrote: >> >>> Dear Petsc developers, >>> I am trying to use a DMSwarm to locate a cloud of points with >>> respect to a background mesh. In the real application the points will be >>> loaded from disk, but I have created a small demo in which >>> >>> - each processor creates Npart particles, all within the domain >>> covered by the mesh, but not all in the local portion of the mesh >>> - migrate the particles >>> >>> After migration most particles are not any more in the DMSwarm (how many >>> and which ones seems to depend on the number of cpus, but it never happens >>> that all particle survive the migration process). >>> >>> I am clearly missing some step, since I'd expect that a DMDA would be >>> able to locate particles without the need to go through a DMShell as it is >>> done in src/dm/tutorials/swarm_ex3.c.html >>> >>> >>> I attach my demo code. >>> >>> Could someone give me a hint? >>> >> >> Thanks for sending this. I found the problem. 
Someone has some overly >> fancy code inside DMDA to figure out the local bounding box from the >> coordinates. >> It is broken for DM_BOUNDARY_GHOSTED, but we never tested with this. I >> will fix it. >> > > Okay, I think this fix is correct > > https://gitlab.com/petsc/petsc/-/merge_requests/5802 > > > I incorporated your test as src/dm/impls/da/tests/ex1.c. Can you take a > look and see if this fixes your issue? > > Yes, we have tested 2d and 3d, with various combinations of DM_BOUNDARY_* > along different directions and it works like a charm. > > On a side note, neither DMSwarmViewXDMF nor DMSwarmMigrate seem to be > implemented for 1d: I get > > [0]PETSC ERROR: No support for this operation for this object type > [0]PETSC > ERROR: Support not provided for 1D > > However, currently I have no need for this feature. > > Finally, if the test is meant to stay in the source, you may remove the > call to DMSwarmRegisterPetscDatatypeField as in the attached patch. > > Thanks a lot!! > > Thanks! Glad it works. Matt > Matteo and Silvia > > > Thanks, > > Matt > > >> Thanks, >> >> Matt >> >> >>> Best >>> Matteo >>> >>> >> >> -- >> What most experimenters take for granted before they begin their >> experiments is infinitely more interesting than any results to which their >> experiments lead. >> -- Norbert Wiener >> >> https://www.cse.buffalo.edu/~knepley/ >> >> > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > > https://www.cse.buffalo.edu/~knepley/ > > > -- > Prof. Matteo Semplice > Universit? degli Studi dell?Insubria > Dipartimento di Scienza e Alta Tecnologia ? DiSAT > Professore Associato > Via Valleggio, 11 ? 22100 Como (CO) ? Italia > tel.: +39 031 2386316 > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. 
-- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From dargaville.steven at gmail.com Fri Nov 4 09:50:31 2022 From: dargaville.steven at gmail.com (Steven Dargaville) Date: Fri, 4 Nov 2022 14:50:31 +0000 Subject: [petsc-users] PCMGSetResidual and fortran Message-ID: Hi all I have a quick question regarding the use of PCMGSetResidual within fortran code. I'm calling PCMGSetResidual from within fortran: call PCMGSetResidual(pc_mg, petsc_level, mg_residual, coarse_matrix ierr) and just for testing purposes I've written a trivial residual evaluation routine: subroutine mg_residual(mat, b, x, r, ierr) !!< Compute the residual ! ~~~~~~ type(tMat) :: mat type(tVec) :: b, x, r PetscErrorCode :: ierr ! ~~~~~~ print *, "inside residual evaluation" call MatResidual(mat, b, x, r, ierr) end subroutine The problem I am having is that this segfaults when the residual routine is called. Valgrind shows that it is failing in the fortran interface in ftn-custom/zmgfuncf.c, with the message: ==24742== Invalid read of size 8 ==24742== at 0x5B7CBC0: ourresidualfunction (in /home/projects/dependencies/petsc_main/arch-linux-c-opt/lib/libpetsc.so.3.015.0) ==24742== by 0x5B6D804: PCMGMCycle_Private (in /home/projects/dependencies/petsc_main/arch-linux-c-opt/lib/libpetsc.so.3.015.0) ==24742== Process terminating with default action of signal 11 (SIGSEGV) ==24742== Access not within mapped region at address 0x0 ==24742== at 0x5B7CBC0: ourresidualfunction (in /home/sdargavi/projects/dependencies/petsc_main/arch-linux-c-opt/lib/libpetsc.so.3.015.0) ==24742== by 0x5B6D804: PCMGMCycle_Private (in /home/sdargavi/projects/dependencies/petsc_main/arch-linux-c-opt/lib/libpetsc.so.3.015.0) I'm guessing this is because the fortran_func_pointers isn't pointing to the mg_residual routine, but I am not sure why. 
I noticed that in the C code of PCMGSetResidual that it calls MatDestroy on the A matrix in mg_levels and replaces it with the mat passed in: if (mat) PetscObjectReference((PetscObject)mat); MatDestroy(&mglevels[l]->A); mglevels[l]->A = mat so I modified my code to call PCMGSetResidual either before the operators are set, or after but passing in an extra copy, but this doesn't seem to help. I'm guessing I'm doing something silly, but just wondering if anyone had any ideas? Thanks for your help Steven -------------- next part -------------- An HTML attachment was scrubbed... URL: From bsmith at petsc.dev Fri Nov 4 10:44:55 2022 From: bsmith at petsc.dev (Barry Smith) Date: Fri, 4 Nov 2022 11:44:55 -0400 Subject: [petsc-users] On the usage of MatSetValuesBlocked In-Reply-To: References: <166EF709-8936-422D-A9BE-58D0072E718B@petsc.dev> <0D3C529A-3B69-4F2D-8CE9-ED87BB482F2F@petsc.dev> Message-ID: <9119F0A7-F279-45E6-9213-06D9907EC26F@petsc.dev> > On Nov 4, 2022, at 6:55 AM, Edoardo alinovi wrote: > > Thanks Matt, > > I have found out that setValuesblocked will work if I do: > > call MatCreateVecs(A, x, y, ierr) > call setValuesBlocked(x, nblocks, varray, ierr) Ah, likely the block size for the vector was not correct, leading to the memory corruption. MatCreateVecs() creates a vector compatible with the matrix, same block size and parallel layout so you don't need to worry about setting those values yourself. Barry > > However, there is nogetValuesBlocked. 
Not the end of the world, it is handy to set and get stuff by block and not by single entry :) > > Cheers From edoardo.alinovi at gmail.com Fri Nov 4 10:51:06 2022 From: edoardo.alinovi at gmail.com (Edoardo alinovi) Date: Fri, 4 Nov 2022 16:51:06 +0100 Subject: [petsc-users] On the usage of MatSetValuesBlocked In-Reply-To: <9119F0A7-F279-45E6-9213-06D9907EC26F@petsc.dev> References: <166EF709-8936-422D-A9BE-58D0072E718B@petsc.dev> <0D3C529A-3B69-4F2D-8CE9-ED87BB482F2F@petsc.dev> <9119F0A7-F279-45E6-9213-06D9907EC26F@petsc.dev> Message-ID: Yes, I did not set the block size for the vector... Missed it! I think I have nailed the way to handle block matrix/vectors, I am moving now on to solve the next facy error which is a ksp_diverged_its ?? thanks gents for the support with this block madness :) -------------- next part -------------- An HTML attachment was scrubbed... URL: From mi.mike1021 at gmail.com Fri Nov 4 12:37:45 2022 From: mi.mike1021 at gmail.com (Mike Michell) Date: Fri, 4 Nov 2022 12:37:45 -0500 Subject: [petsc-users] Installing Triangle for PETSc by not using --download Message-ID: Hi, I need to install PETSc and its dependencies to a linux system, which does not allow git clone from online. Thus I need to install all the dependencies by having their source files. First I downloaded and installed all the dependencies by relying on PETSc on my local linux (which means I used --download=triangle), then tar all of them, and brought them to the cluster. There is an issue with Triangle. I can do make (or cmake using the CMakeList file that Triangle 1.6 provides, although PETSc --download provides Triangle 1.3) and can get "libtriangle.a". 
But during the PETSc configure step, it fails with the error message below: {$Triangle_Home}/build/libtriangle.a(triangle.c.o): In function `poolinit': triangle.c:(.text+0x15aa): undefined reference to `PetscTrMalloc' {$Triangle_Home}/build/libtriangle.a(triangle.c.o): In function `pooldeinit': triangle.c:(.text+0x1708): undefined reference to `PetscTrFree' ... It seems that I need to let Triangle know that it will be used by PETSc to enable those functions when I install Triangle. Is this understanding correct? If so, how can I configure, install, and link Triangle with PETSc? I have seen some PETSc-related commands in configure.py in the root directory of the Triangle downloaded by PETSc, but it is not clear how this Python script is related or can be used by a user like me. Thanks, Mike -------------- next part -------------- An HTML attachment was scrubbed... URL: From bsmith at petsc.dev Fri Nov 4 12:38:10 2022 From: bsmith at petsc.dev (Barry Smith) Date: Fri, 4 Nov 2022 13:38:10 -0400 Subject: [petsc-users] PCMGSetResidual and fortran In-Reply-To: References: Message-ID: <7098E534-6376-46E4-B8F8-1E00F3EA7441@petsc.dev> Steven, Could you please send your test code? It is possible there is a bug in our Fortran interface, since we do not test it for this functionality. Barry > On Nov 4, 2022, at 10:50 AM, Steven Dargaville wrote: > > Hi all > > I have a quick question regarding the use of PCMGSetResidual within Fortran code. I'm calling PCMGSetResidual from within Fortran: > > call PCMGSetResidual(pc_mg, petsc_level, mg_residual, coarse_matrix, ierr) > > and just for testing purposes I've written a trivial residual evaluation routine: > > subroutine mg_residual(mat, b, x, r, ierr) > !!< Compute the residual > > ! ~~~~~~ > type(tMat) :: mat > type(tVec) :: b, x, r > PetscErrorCode :: ierr > !
~~~~~~ > > print *, "inside residual evaluation" > call MatResidual(mat, b, x, r, ierr) > > end subroutine > > The problem I am having is that this segfaults when the residual routine is called. Valgrind shows that it is failing in the fortran interface in ftn-custom/zmgfuncf.c, with the message: > > ==24742== Invalid read of size 8 > ==24742== at 0x5B7CBC0: ourresidualfunction (in /home/projects/dependencies/petsc_main/arch-linux-c-opt/lib/libpetsc.so.3.015.0) > ==24742== by 0x5B6D804: PCMGMCycle_Private (in /home/projects/dependencies/petsc_main/arch-linux-c-opt/lib/libpetsc.so.3.015.0) > > ==24742== Process terminating with default action of signal 11 (SIGSEGV) > ==24742== Access not within mapped region at address 0x0 > ==24742== at 0x5B7CBC0: ourresidualfunction (in /home/sdargavi/projects/dependencies/petsc_main/arch-linux-c-opt/lib/libpetsc.so.3.015.0) > ==24742== by 0x5B6D804: PCMGMCycle_Private (in /home/sdargavi/projects/dependencies/petsc_main/arch-linux-c-opt/lib/libpetsc.so.3.015.0) > > I'm guessing this is because the fortran_func_pointers isn't pointing to the mg_residual routine, but I am not sure why. I noticed that in the C code of PCMGSetResidual that it calls MatDestroy on the A matrix in mg_levels and replaces it with the mat passed in: > > if (mat) PetscObjectReference((PetscObject)mat); > MatDestroy(&mglevels[l]->A); > mglevels[l]->A = mat > > so I modified my code to call PCMGSetResidual either before the operators are set, or after but passing in an extra copy, but this doesn't seem to help. > > I'm guessing I'm doing something silly, but just wondering if anyone had any ideas? Thanks for your help > Steven From balay at mcs.anl.gov Fri Nov 4 12:47:59 2022 From: balay at mcs.anl.gov (Satish Balay) Date: Fri, 4 Nov 2022 12:47:59 -0500 (CDT) Subject: [petsc-users] Installing Triangle for PETSc by not using --download In-Reply-To: References: Message-ID: <98275576-90c4-165a-af8d-108f11abdb47@mcs.anl.gov> Can you try the --with-packages-download-dir option? It tells you the URL to download - and then PETSc configure does the install. Satish -------- balay at p1 /home/balay/petsc (release =) $ ./configure --with-packages-download-dir=$HOME/tmp --download-triangle Download the following packages to /home/balay/tmp triangle ['git://https://bitbucket.org/petsc/pkg-triangle', 'https://bitbucket.org/petsc/pkg-triangle/get/v1.3-p2.tar.gz'] Then run the script again balay at p1 /home/balay/petsc (release =) $ pushd $HOME/tmp ~/tmp ~/petsc balay at p1 /home/balay/tmp $ git clone -q https://bitbucket.org/petsc/pkg-triangle balay at p1 /home/balay/tmp $ popd ~/petsc balay at p1 /home/balay/petsc (release =) $ ./configure --with-packages-download-dir=$HOME/tmp --download-triangle ============================================================================================= Configuring PETSc to compile on your system ============================================================================================= ============================================================================================= Running /home/balay/soft/sowing-1.1.26-p1/bin/bfort to generate Fortran
stubs ============================================================================================= ============================================================================================= Trying to download /home/balay/tmp/pkg-triangle for TRIANGLE ============================================================================================= ============================================================================================= Compiling Triangle; this may take several minutes ============================================================================================= ============================================================================================= Installing Triangle; this may take several minutes ============================================================================================= TESTING: checklsame from config.packages.BlasLapack(config/BuildSystem/config/packages/BlasLapack.py:691) ... .. On Fri, 4 Nov 2022, Mike Michell wrote: > Hi, > > I need to install PETSc and its dependencies to a linux system, which does > not allow git clone from online. Thus I need to install all the > dependencies by having their source files. First I downloaded and installed > all the dependencies by relying on PETSc on my local linux (which means I > used --download=triangle), then tar all of them, and brought them to the > cluster. > > There is an issue with Triangle. I can do make (or cmake using the > CMakeList file that Triangle 1.6 provides, although PETSc --download > provides Triangle 1.3) and can get "libtriangle.a". But during the PETSc > configure step, it fails with the error message below: > > {$Triangle_Home}/build/libtriangle.a(triangle.c.o): In function `poolinit': > triangle.c:(.text+0x15aa): undefined reference to `PetscTrMalloc' > {$Triangle_Home}/build/libtriangle.a(triangle.c.o): In function > `pooldeinit': > triangle.c:(.text+0x1708): undefined reference to `PetscTrFree' > ... 
> > It seems that I need to let Triangle know that it will be used by PETSc to > enable those functions when I install Triangle. Is this understanding > correct? If so, how can I configure, install, and link Triangle with > PETSc? > > I have seen some PETSc-related commands in configure.py in the root > directory of Triangle downloaded by PETSc, but it is not clear how this > Python script is related or can be used by a user like me. > > Thanks, > Mike > From mi.mike1021 at gmail.com Fri Nov 4 14:24:17 2022 From: mi.mike1021 at gmail.com (Mike Michell) Date: Fri, 4 Nov 2022 14:24:17 -0500 Subject: [petsc-users] Installing Triangle for PETSc by not using --download In-Reply-To: <98275576-90c4-165a-af8d-108f11abdb47@mcs.anl.gov> References: <98275576-90c4-165a-af8d-108f11abdb47@mcs.anl.gov> Message-ID: Thank you very much. It seems that --download-package={downloaded_dir} is the right way to let PETSc know about the source files and let it configure & install everything as it wants. If I try --with-packages-download-dir, PETSc again tries to connect and get the external packages from online, which I cannot do. Thanks, > Can you try the --with-packages-download-dir option? > > It tells you the URL to download - and then PETSc configure does the install.
> > Satish > > -------- > > balay at p1 /home/balay/petsc (release =) > $ ./configure --with-packages-download-dir=$HOME/tmp --download-triangle > Download the following packages to /home/balay/tmp > > triangle ['git://https://bitbucket.org/petsc/pkg-triangle', ' > https://bitbucket.org/petsc/pkg-triangle/get/v1.3-p2.tar.gz'] > > Then run the script again > > balay at p1 /home/balay/petsc (release =) > $ pushd $HOME/tmp > ~/tmp ~/petsc > balay at p1 /home/balay/tmp > $ git clone -q https://bitbucket.org/petsc/pkg-triangle > balay at p1 /home/balay/tmp > $ popd > ~/petsc > balay at p1 /home/balay/petsc (release =) > $ ./configure --with-packages-download-dir=$HOME/tmp --download-triangle > > ============================================================================================= > Configuring PETSc to compile on your system > > ============================================================================================= > > ============================================================================================= > Running /home/balay/soft/sowing-1.1.26-p1/bin/bfort to generate > Fortran stubs > > ============================================================================================= > > ============================================================================================= > Trying to download /home/balay/tmp/pkg-triangle for > TRIANGLE > > ============================================================================================= > > ============================================================================================= > Compiling Triangle; this may take several minutes > > ============================================================================================= > > ============================================================================================= > Installing Triangle; this may take several minutes > > ============================================================================================= > TESTING: checklsame 
from > config.packages.BlasLapack(config/BuildSystem/config/packages/BlasLapack.py:691) > > ... > .. > > > On Fri, 4 Nov 2022, Mike Michell wrote: > > > Hi, > > > > I need to install PETSc and its dependencies to a linux system, which > does > > not allow git clone from online. Thus I need to install all the > > dependencies by having their source files. First I downloaded and > installed > > all the dependencies by relying on PETSc on my local linux (which means I > > used --download=triangle), then tar all of them, and brought them to the > > cluster. > > > > There is an issue with Triangle. I can do make (or cmake using the > > CMakeList file that Triangle 1.6 provides, although PETSc --download > > provides Triangle 1.3) and can get "libtriangle.a". But during the PETSc > > configure step, it fails with the error message below: > > > > {$Triangle_Home}/build/libtriangle.a(triangle.c.o): In function > `poolinit': > > triangle.c:(.text+0x15aa): undefined reference to `PetscTrMalloc' > > {$Triangle_Home}/build/libtriangle.a(triangle.c.o): In function > > `pooldeinit': > > triangle.c:(.text+0x1708): undefined reference to `PetscTrFree' > > ... > > > > It seems that I need to let Triangle know that it will be used by PETSc > to > > enable those functions when I install Triangle. Is this correct > > understanding? If so, how can I configure, install, and link Triangle > with > > PETSc? > > > > I have seen some PETSc related commands in configure.py in the root > > directory of Triangle downloaded by PETSc, but it is not clear how this > > python script is related or can be used by user like me. > > > > Thanks, > > Mike > > > > -------------- next part -------------- An HTML attachment was scrubbed... 
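Pulling Satish's instructions together, the air-gapped workflow is: fetch the package on a machine that has network access, copy it to the cluster, then point configure at the local copy. A sketch of those commands (the package URL comes from Satish's output above; the local paths are illustrative, adjust to wherever you copied the files):

```sh
# On a machine WITH network access: grab what configure would have downloaded
git clone https://bitbucket.org/petsc/pkg-triangle
# or fetch the tarball: https://bitbucket.org/petsc/pkg-triangle/get/v1.3-p2.tar.gz
# ...copy pkg-triangle (or the tarball) to the cluster with rsync/scp...

# On the cluster: point --download-triangle at the local directory or tarball
./configure --download-triangle=$HOME/pkgs/pkg-triangle
```

Either form (git clone directory or tarball path) works as the argument; PETSc's configure then patches and builds Triangle itself, which is what provides the PetscTrMalloc/PetscTrFree hooks a stock Triangle build lacks.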
URL: From balay at mcs.anl.gov Fri Nov 4 14:36:55 2022 From: balay at mcs.anl.gov (Satish Balay) Date: Fri, 4 Nov 2022 14:36:55 -0500 (CDT) Subject: [petsc-users] Installing Triangle for PETSc by not using --download In-Reply-To: References: <98275576-90c4-165a-af8d-108f11abdb47@mcs.anl.gov> Message-ID: On Fri, 4 Nov 2022, Mike Michell wrote: > Thank you very much. It seems that --download-package={downloaded_dir} is > the right way to let PETSc know the source files and let it configures & > installs everything as it wants. > If I try --with-packages-download-dir, PETSc again tries to connect and get > the external packages from online, which I cannot do. No - it tells you what to download - and where to place it locally. In my example below - I used 'git clone' but you can use rsync or any other mechanism that you are currently using to copy stuff over. And sure --download-package=/local/location/[tarball,git,dir] also works . Satish > > Thanks, > > > > Can you try the --with-packages-download-dir option? > > > > It tells you the URL to download - and then PETSc configure does the > > install. 
> > > > Satish > > > > -------- > > > > balay at p1 /home/balay/petsc (release =) > > $ ./configure --with-packages-download-dir=$HOME/tmp --download-triangle > > Download the following packages to /home/balay/tmp > > > > triangle ['git://https://bitbucket.org/petsc/pkg-triangle', ' > > https://bitbucket.org/petsc/pkg-triangle/get/v1.3-p2.tar.gz'] > > > > Then run the script again > > > > balay at p1 /home/balay/petsc (release =) > > $ pushd $HOME/tmp > > ~/tmp ~/petsc > > balay at p1 /home/balay/tmp > > $ git clone -q https://bitbucket.org/petsc/pkg-triangle > > balay at p1 /home/balay/tmp > > $ popd > > ~/petsc > > balay at p1 /home/balay/petsc (release =) > > $ ./configure --with-packages-download-dir=$HOME/tmp --download-triangle > > > > ============================================================================================= > > Configuring PETSc to compile on your system > > > > ============================================================================================= > > > > ============================================================================================= > > Running /home/balay/soft/sowing-1.1.26-p1/bin/bfort to generate > > Fortran stubs > > > > ============================================================================================= > > > > ============================================================================================= > > Trying to download /home/balay/tmp/pkg-triangle for > > TRIANGLE > > > > ============================================================================================= > > > > ============================================================================================= > > Compiling Triangle; this may take several minutes > > > > ============================================================================================= > > > > ============================================================================================= > > Installing Triangle; this may take several minutes > > > > 
============================================================================================= > > TESTING: checklsame from > > config.packages.BlasLapack(config/BuildSystem/config/packages/BlasLapack.py:691) > > > > ... > > .. > > > > > > On Fri, 4 Nov 2022, Mike Michell wrote: > > > > > Hi, > > > > > > I need to install PETSc and its dependencies to a linux system, which > > does > > > not allow git clone from online. Thus I need to install all the > > > dependencies by having their source files. First I downloaded and > > installed > > > all the dependencies by relying on PETSc on my local linux (which means I > > > used --download=triangle), then tar all of them, and brought them to the > > > cluster. > > > > > > There is an issue with Triangle. I can do make (or cmake using the > > > CMakeList file that Triangle 1.6 provides, although PETSc --download > > > provides Triangle 1.3) and can get "libtriangle.a". But during the PETSc > > > configure step, it fails with the error message below: > > > > > > {$Triangle_Home}/build/libtriangle.a(triangle.c.o): In function > > `poolinit': > > > triangle.c:(.text+0x15aa): undefined reference to `PetscTrMalloc' > > > {$Triangle_Home}/build/libtriangle.a(triangle.c.o): In function > > > `pooldeinit': > > > triangle.c:(.text+0x1708): undefined reference to `PetscTrFree' > > > ... > > > > > > It seems that I need to let Triangle know that it will be used by PETSc > > to > > > enable those functions when I install Triangle. Is this correct > > > understanding? If so, how can I configure, install, and link Triangle > > with > > > PETSc? > > > > > > I have seen some PETSc related commands in configure.py in the root > > > directory of Triangle downloaded by PETSc, but it is not clear how this > > > python script is related or can be used by user like me. 
> > > Python script is related or can be used by a user like me. > > > > > > Thanks, > > > Mike > > > > > > > > From edoardo.alinovi at gmail.com Sat Nov 5 04:30:46 2022 From: edoardo.alinovi at gmail.com (Edoardo alinovi) Date: Sat, 5 Nov 2022 10:30:46 +0100 Subject: [petsc-users] On the usage of MatSetValuesBlocked In-Reply-To: References: <166EF709-8936-422D-A9BE-58D0072E718B@petsc.dev> <0D3C529A-3B69-4F2D-8CE9-ED87BB482F2F@petsc.dev> <9119F0A7-F279-45E6-9213-06D9907EC26F@petsc.dev> Message-ID: Matt, Barry, Should I do any particular trick to solve block matrices in KSP? I am doing a silly 3x3 cavity test case and I am struggling to converge using CG+bjacobi. It might be that I have an error in the matrix, but just to be sure I am not missing something fundamental in the setup. This is my log, which shows how the residual norm is far from happy: 10000 KSP unpreconditioned resid norm 2.273088479279e+03 true resid norm 2.273088479279e+03 ||r(i)||/||b|| 6.561841227018e+02 Reason = -3 ERROR: KSP has not converged. Simulations stopped. [0]PETSC ERROR: --------------------- Error Message -------------------------------------------------------------- [0]PETSC ERROR: KSPSolve has not converged, reason DIVERGED_ITS Thank you -------------- next part -------------- An HTML attachment was scrubbed... URL: From pierre at joliv.et Sat Nov 5 04:38:05 2022 From: pierre at joliv.et (Pierre Jolivet) Date: Sat, 5 Nov 2022 10:38:05 +0100 Subject: [petsc-users] On the usage of MatSetValuesBlocked In-Reply-To: References: Message-ID: <03E5A1AA-CF83-491C-92B6-DE88F7BD88A1@joliv.et> > On 5 Nov 2022, at 10:31 AM, Edoardo alinovi wrote: > > Matt, Barry, > > Should I do any particular trick to solve block matrices in KSP? > > I am doing a silly 3x3 cavity test case and I am struggling to converge using CG+bjacobi. This is far from the ideal preconditioner. First, you should check that your assembly is correct and stick to PCLU. Does the KSP report an error in this case? Does the solution "look" acceptable?
Thanks, Pierre > It might be that I have an error in the matrix, but just to be sure I am not missing something fundamental in the setup. > > This is my log, which shows how the residual norm is far from happy: > > 10000 KSP unpreconditioned resid norm 2.273088479279e+03 true resid norm 2.273088479279e+03 ||r(i)||/||b|| 6.561841227018e+02 > Reason = -3 > ERROR: KSP has not converged. Simulations stopped. > [0]PETSC ERROR: --------------------- Error Message -------------------------------------------------------------- > [0]PETSC ERROR: KSPSolve has not converged, reason DIVERGED_ITS > > Thank you > -------------- next part -------------- An HTML attachment was scrubbed... URL: From edoardo.alinovi at gmail.com Sat Nov 5 04:52:12 2022 From: edoardo.alinovi at gmail.com (Edoardo alinovi) Date: Sat, 5 Nov 2022 10:52:12 +0100 Subject: [petsc-users] On the usage of MatSetValuesBlocked In-Reply-To: <03E5A1AA-CF83-491C-92B6-DE88F7BD88A1@joliv.et> References: <03E5A1AA-CF83-491C-92B6-DE88F7BD88A1@joliv.et> Message-ID: Hello Pierre, Thank you for the suggestion. However, this is one of the cases where the error is made by the guy sitting in front of the screen! My matrix is no longer symmetric and I was pretending to solve it with CG! At least the theory works! -------------- next part -------------- An HTML attachment was scrubbed... URL: From mhyaqteen at sju.ac.kr Sun Nov 6 03:29:14 2022 From: mhyaqteen at sju.ac.kr (Mohammad Ali Yaqteen) Date: Sun, 6 Nov 2022 09:29:14 +0000 Subject: [petsc-users] PETSc Windows Installation Message-ID: Dear Sir/Madam, I am installing PETSc on Windows but it keeps giving me unexpected errors. I want to use it on MS Visual Studio or Codeblocks.
When I use the command on your webpage (./configure --with-cc='win32fe cl' --with-fc='win32fe ifort' --with-cxx='win32fe cl' --with-mpi=0 --download-fblaslapack), I get the following error message: $ ./configure --with-cc='win32fe cl' --with-fc='win32fe ifort' --with-cxx='win32fe cl' --with-mpi=0 --download-fblaslapack ============================================================================================= Configuring PETSc to compile on your system ============================================================================================= TESTING: checkCCompiler from config.setCompilers(config/BuildSystem/config/setCompilers.py:1341)******************************************************************************* UNABLE to CONFIGURE with GIVEN OPTIONS (see configure.log for details): ------------------------------------------------------------------------------- C compiler you provided with -with-cc=win32fe cl cannot be found or does not work. Cannot compile/link C with /home/SEJONG/petsc-3.18.1/lib/petsc/bin/win32fe/win32fe cl. Kindly look into this problem! Your prompt response will highly be appreciated Thank you Ali -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Sun Nov 6 07:41:38 2022 From: knepley at gmail.com (Matthew Knepley) Date: Sun, 6 Nov 2022 08:41:38 -0500 Subject: [petsc-users] PETSc Windows Installation In-Reply-To: References: Message-ID: We need to see configure.log to see what is going on. Can you send it? Thanks, Matt On Sun, Nov 6, 2022 at 4:29 AM Mohammad Ali Yaqteen wrote: > Dear Sir/Madam, > > > > I am installing PETSc on windows but it keeps giving me unexpected errors. > I want to use it on MS Visual Studio or Codeblocks. 
When I use the command > on your webpage (./configure --with-cc='win32fe cl' --with-fc='win32fe > ifort' --with-cxx='win32fe cl' --with-mpi=0 --download-fblaslapack), I get > the following error message: > > > > $ ./configure --with-cc='win32fe cl' --with-fc='win32fe ifort' > --with-cxx='win32fe cl' --with-mpi=0 --download-fblaslapack > > > ============================================================================================= > > Configuring PETSc to compile on your system > > > ============================================================================================= > > TESTING: checkCCompiler from > config.setCompilers(config/BuildSystem/config/setCompilers.py:1341)******************************************************************************* > > UNABLE to CONFIGURE with GIVEN OPTIONS (see configure.log for > details): > > > ------------------------------------------------------------------------------- > > C compiler you provided with -with-cc=win32fe cl cannot be found or does > not work. > > Cannot compile/link C with > /home/SEJONG/petsc-3.18.1/lib/petsc/bin/win32fe/win32fe cl. > > > > Kindly look into this problem! Your prompt response will highly be > appreciated > > > > Thank you > > Ali > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From balay at mcs.anl.gov Sun Nov 6 09:00:15 2022 From: balay at mcs.anl.gov (Satish Balay) Date: Sun, 6 Nov 2022 09:00:15 -0600 (CST) Subject: [petsc-users] PETSc Windows Installation In-Reply-To: References: Message-ID: <2d1a6386-1c0c-2c52-4696-40efbd3b8a11@mcs.anl.gov> Likely the compilers are not setup correctly as per instructions. 
https://petsc.org/release/install/windows/#installation-with-microsoft-intel-windows-compilers And if you do not have a specific windows need - and only need IDE - perhaps a WSL2 (aka linux) install with VSCode(linux) might be the way to go. Satish On Sun, 6 Nov 2022, Matthew Knepley wrote: > We need to see configure.log to see what is going on. Can you send it? > > Thanks, > > Matt > > On Sun, Nov 6, 2022 at 4:29 AM Mohammad Ali Yaqteen > wrote: > > > Dear Sir/Madam, > > > > > > > > I am installing PETSc on windows but it keeps giving me unexpected errors. > > I want to use it on MS Visual Studio or Codeblocks. When I use the command > > on your webpage (./configure --with-cc='win32fe cl' --with-fc='win32fe > > ifort' --with-cxx='win32fe cl' --with-mpi=0 --download-fblaslapack), I get > > the following error message: > > > > > > > > $ ./configure --with-cc='win32fe cl' --with-fc='win32fe ifort' > > --with-cxx='win32fe cl' --with-mpi=0 --download-fblaslapack > > > > > > ============================================================================================= > > > > Configuring PETSc to compile on your system > > > > > > ============================================================================================= > > > > TESTING: checkCCompiler from > > config.setCompilers(config/BuildSystem/config/setCompilers.py:1341)******************************************************************************* > > > > UNABLE to CONFIGURE with GIVEN OPTIONS (see configure.log for > > details): > > > > > > ------------------------------------------------------------------------------- > > > > C compiler you provided with -with-cc=win32fe cl cannot be found or does > > not work. > > > > Cannot compile/link C with > > /home/SEJONG/petsc-3.18.1/lib/petsc/bin/win32fe/win32fe cl. > > > > > > > > Kindly look into this problem! 
Your prompt response will highly be > > appreciated > > > > > > > > Thank you > > > > Ali > > > > > From alexlindsay239 at gmail.com Sun Nov 6 16:31:09 2022 From: alexlindsay239 at gmail.com (Alexander Lindsay) Date: Sun, 6 Nov 2022 14:31:09 -0800 Subject: [petsc-users] Determining maximum number of columns in sparse matrix Message-ID: We sometimes overallocate our sparsity pattern. Matrix assembly will squeeze out allocations that we never added into/set. Is there a convenient way to determine the size of the densest row post-assembly? I know that we could iterate over rows and call `MatGetRow` and figure it out that way. But I'm wondering if there is a better way? Alex -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Sun Nov 6 16:35:02 2022 From: knepley at gmail.com (Matthew Knepley) Date: Sun, 6 Nov 2022 17:35:02 -0500 Subject: [petsc-users] Determining maximum number of columns in sparse matrix In-Reply-To: References: Message-ID: On Sun, Nov 6, 2022 at 5:31 PM Alexander Lindsay wrote: > We sometimes overallocate our sparsity pattern. Matrix assembly will > squeeze out allocations that we never added into/set. Is there a convenient > way to determine the size of the densest row post-assembly? I know that we > could iterate over rows and call `MatGetRow` and figure it out that way. > But I'm wondering if there is a better way? > You could use https://petsc.org/main/docs/manualpages/Mat/MatGetRowIJ/ which gives all the row lengths at once. Thanks, Matt > Alex > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From alexlindsay239 at gmail.com Sun Nov 6 17:30:58 2022 From: alexlindsay239 at gmail.com (Alexander Lindsay) Date: Sun, 6 Nov 2022 15:30:58 -0800 Subject: [petsc-users] Determining maximum number of columns in sparse matrix In-Reply-To: References: Message-ID: This is great. Thanks Matt! On Sun, Nov 6, 2022 at 2:35 PM Matthew Knepley wrote: > On Sun, Nov 6, 2022 at 5:31 PM Alexander Lindsay > wrote: > >> We sometimes overallocate our sparsity pattern. Matrix assembly will >> squeeze out allocations that we never added into/set. Is there a convenient >> way to determine the size of the densest row post-assembly? I know that we >> could iterate over rows and call `MatGetRow` and figure it out that way. >> But I'm wondering if there is a better way? >> > > You could use https://petsc.org/main/docs/manualpages/Mat/MatGetRowIJ/ > which gives all the row lengths at once. > > Thanks, > > Matt > > >> Alex >> > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > > https://www.cse.buffalo.edu/~knepley/ > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From mhyaqteen at sju.ac.kr Sun Nov 6 21:01:34 2022 From: mhyaqteen at sju.ac.kr (Mohammad Ali Yaqteen) Date: Mon, 7 Nov 2022 03:01:34 +0000 Subject: [petsc-users] PETSc Windows Installation In-Reply-To: References: Message-ID: I used the following command to install PETSc again (I have also attached the configure.log file with this email): ./configure --with-cc=mpicc --with-cxx=mpicxx --with-fc=mpif90 --download-f2blaslapack --download-scalapack --download-mumps It runs successfully, and then I run make all and make check after that.
It gives me the following: $ make PETSC_DIR=/home/SEJONG/petsc-3.18.1 PETSC_ARCH=arch-mswin-c-debug check Running check examples to verify correct installation Using PETSC_DIR=/home/SEJONG/petsc-3.18.1 and PETSC_ARCH=arch-mswin-c-debug Possible error running C/C++ src/snes/tutorials/ex19 with 1 MPI process See https://petsc.org/release/faq/ -------------------------------------------------------------------------- Primary job terminated normally, but 1 process returned a non-zero exit code. Per user-direction, the job has been aborted. -------------------------------------------------------------------------- [DESKTOP-R1C768B:29360] [[36285,0],0] unable to open debugger attach fifo -------------------------------------------------------------------------- mpiexec detected that one or more processes exited with non-zero status, thus causing the job to be terminated. The first process to do so was: Process name: [[36285,1],0] Exit code: 127 -------------------------------------------------------------------------- Possible error running C/C++ src/snes/tutorials/ex19 with 2 MPI processes See https://petsc.org/release/faq/ -------------------------------------------------------------------------- Primary job terminated normally, but 1 process returned a non-zero exit code. Per user-direction, the job has been aborted. -------------------------------------------------------------------------- [DESKTOP-R1C768B:29383] [[36298,0],0] unable to open debugger attach fifo 1,9c1,5 < lid velocity = 0.0625, prandtl # = 1., grashof # = 1. < 0 SNES Function norm 0.239155 < 0 KSP Residual norm 0.235858 < 1 KSP Residual norm < 1.e-11 < 1 SNES Function norm 6.81968e-05 < 0 KSP Residual norm 2.30906e-05 < 1 KSP Residual norm < 1.e-11 < 2 SNES Function norm < 1.e-11 < Number of SNES iterations = 2 --- > -------------------------------------------------------------------------- > Primary job terminated normally, but 1 process returned > a non-zero exit code. 
Per user-direction, the job has been aborted. > -------------------------------------------------------------------------- > [DESKTOP-R1C768B:29409] [[36332,0],0] unable to open debugger attach fifo /home/SEJONG/petsc-3.18.1/src/snes/tutorials Possible problem with ex19 running with mumps, diffs above ========================================= Possible error running Fortran example src/snes/tutorials/ex5f with 1 MPI process See https://petsc.org/release/faq/ -------------------------------------------------------------------------- Primary job terminated normally, but 1 process returned a non-zero exit code. Per user-direction, the job has been aborted. -------------------------------------------------------------------------- [DESKTOP-R1C768B:29494] [[35899,0],0] unable to open debugger attach fifo -------------------------------------------------------------------------- mpiexec detected that one or more processes exited with non-zero status, thus causing the job to be terminated. The first process to do so was: Process name: [[35899,1],0] Exit code: 127 -------------------------------------------------------------------------- Completed test examples Error while running make check make[1]: *** [makefile:149: check] Error 1 make: *** [GNUmakefile:17: check] Error 2 From: Matthew Knepley Sent: Sunday, November 6, 2022 10:42 PM To: Mohammad Ali Yaqteen Cc: petsc-users at mcs.anl.gov Subject: Re: [petsc-users] PETSc Windows Installation We need to see configure.log to see what is going on. Can you send it? Thanks, Matt On Sun, Nov 6, 2022 at 4:29 AM Mohammad Ali Yaqteen > wrote: Dear Sir/Madam, I am installing PETSc on windows but it keeps giving me unexpected errors. I want to use it on MS Visual Studio or Codeblocks. 
When I use the command on your webpage (./configure --with-cc='win32fe cl' --with-fc='win32fe ifort' --with-cxx='win32fe cl' --with-mpi=0 --download-fblaslapack), I get the following error message: $ ./configure --with-cc='win32fe cl' --with-fc='win32fe ifort' --with-cxx='win32fe cl' --with-mpi=0 --download-fblaslapack ============================================================================================= Configuring PETSc to compile on your system ============================================================================================= TESTING: checkCCompiler from config.setCompilers(config/BuildSystem/config/setCompilers.py:1341)******************************************************************************* UNABLE to CONFIGURE with GIVEN OPTIONS (see configure.log for details): ------------------------------------------------------------------------------- C compiler you provided with -with-cc=win32fe cl cannot be found or does not work. Cannot compile/link C with /home/SEJONG/petsc-3.18.1/lib/petsc/bin/win32fe/win32fe cl. Kindly look into this problem! Your prompt response will highly be appreciated Thank you Ali -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: configure.log Type: application/octet-stream Size: 3251725 bytes Desc: configure.log URL: From mhyaqteen at sju.ac.kr Sun Nov 6 23:21:50 2022 From: mhyaqteen at sju.ac.kr (Mohammad Ali Yaqteen) Date: Mon, 7 Nov 2022 05:21:50 +0000 Subject: [petsc-users] PETSc Windows Installation In-Reply-To: <2d1a6386-1c0c-2c52-4696-40efbd3b8a11@mcs.anl.gov> References: <2d1a6386-1c0c-2c52-4696-40efbd3b8a11@mcs.anl.gov> Message-ID: I have written backend code for a software company. 
If WSL2 and VSCode(Linux) can be called through a command line and executed at the backend, then it will be great. But if I have to install WSL2 and other required things on every other PC that will run that software, then I think I will be at a disadvantage. What do you suggest? Thank you Ali -----Original Message----- From: Satish Balay Sent: Monday, November 7, 2022 12:00 AM To: Matthew Knepley Cc: Mohammad Ali Yaqteen ; petsc-users at mcs.anl.gov Subject: Re: [petsc-users] PETSc Windows Installation Likely the compilers are not setup correctly as per instructions. https://petsc.org/release/install/windows/#installation-with-microsoft-intel-windows-compilers And if you do not have a specific windows need - and only need IDE - perhaps a WSL2 (aka linux) install with VSCode(linux) might be the way to go. Satish On Sun, 6 Nov 2022, Matthew Knepley wrote: > We need to see configure.log to see what is going on. Can you send it? > > Thanks, > > Matt > > On Sun, Nov 6, 2022 at 4:29 AM Mohammad Ali Yaqteen > > wrote: > > > Dear Sir/Madam, > > > > > > > > I am installing PETSc on windows but it keeps giving me unexpected errors. > > I want to use it on MS Visual Studio or Codeblocks. 
When I use the > > command on your webpage (./configure --with-cc='win32fe cl' > > --with-fc='win32fe ifort' --with-cxx='win32fe cl' --with-mpi=0 > > --download-fblaslapack), I get the following error message: > > > > > > > > $ ./configure --with-cc='win32fe cl' --with-fc='win32fe ifort' > > --with-cxx='win32fe cl' --with-mpi=0 --download-fblaslapack > > > > > > ==================================================================== > > ========================= > > > > Configuring PETSc to compile on your system > > > > > > ==================================================================== > > ========================= > > > > TESTING: checkCCompiler from > > config.setCompilers(config/BuildSystem/config/setCompilers.py:1341)* > > ******************************************************************** > > ********** > > > > UNABLE to CONFIGURE with GIVEN OPTIONS (see configure.log for > > details): > > > > > > -------------------------------------------------------------------- > > ----------- > > > > C compiler you provided with -with-cc=win32fe cl cannot be found or > > does not work. > > > > Cannot compile/link C with > > /home/SEJONG/petsc-3.18.1/lib/petsc/bin/win32fe/win32fe cl. > > > > > > > > Kindly look into this problem! Your prompt response will highly be > > appreciated > > > > > > > > Thank you > > > > Ali > > > > > From knepley at gmail.com Mon Nov 7 04:13:00 2022 From: knepley at gmail.com (Matthew Knepley) Date: Mon, 7 Nov 2022 05:13:00 -0500 Subject: [petsc-users] PETSc Windows Installation In-Reply-To: References: <2d1a6386-1c0c-2c52-4696-40efbd3b8a11@mcs.anl.gov> Message-ID: On Mon, Nov 7, 2022 at 12:21 AM Mohammad Ali Yaqteen wrote: > I have written backend code for a software company. If WSL2 and > VSCode(Linux) can be called through a command line and executed at the > backend, then it will be great. But if I have to install WSL2 and other > required things on every other PC that will run that software, then I think > I will be at a disadvantage. 
What do you suggest? > As long as you do not change the architecture and the compiler libraries are available, you can run the executable. Thanks, Matt > Thank you > Ali > > -----Original Message----- > From: Satish Balay > Sent: Monday, November 7, 2022 12:00 AM > To: Matthew Knepley > Cc: Mohammad Ali Yaqteen ; petsc-users at mcs.anl.gov > Subject: Re: [petsc-users] PETSc Windows Installation > > Likely the compilers are not setup correctly as per instructions. > > > https://petsc.org/release/install/windows/#installation-with-microsoft-intel-windows-compilers > > And if you do not have a specific windows need - and only need IDE - > perhaps a WSL2 (aka linux) install with VSCode(linux) might be the way to > go. > > Satish > > On Sun, 6 Nov 2022, Matthew Knepley wrote: > > > We need to see configure.log to see what is going on. Can you send it? > > > > Thanks, > > > > Matt > > > > On Sun, Nov 6, 2022 at 4:29 AM Mohammad Ali Yaqteen > > > > wrote: > > > > > Dear Sir/Madam, > > > > > > > > > > > > I am installing PETSc on windows but it keeps giving me unexpected > errors. > > > I want to use it on MS Visual Studio or Codeblocks. 
When I use the > > > command on your webpage (./configure --with-cc='win32fe cl' > > > --with-fc='win32fe ifort' --with-cxx='win32fe cl' --with-mpi=0 > > > --download-fblaslapack), I get the following error message: > > > > > > > > > > > > $ ./configure --with-cc='win32fe cl' --with-fc='win32fe ifort' > > > --with-cxx='win32fe cl' --with-mpi=0 --download-fblaslapack > > > > > > > > > ==================================================================== > > > ========================= > > > > > > Configuring PETSc to compile on your system > > > > > > > > > ==================================================================== > > > ========================= > > > > > > TESTING: checkCCompiler from > > > config.setCompilers(config/BuildSystem/config/setCompilers.py:1341)* > > > ******************************************************************** > > > ********** > > > > > > UNABLE to CONFIGURE with GIVEN OPTIONS (see configure.log > for > > > details): > > > > > > > > > -------------------------------------------------------------------- > > > ----------- > > > > > > C compiler you provided with -with-cc=win32fe cl cannot be found or > > > does not work. > > > > > > Cannot compile/link C with > > > /home/SEJONG/petsc-3.18.1/lib/petsc/bin/win32fe/win32fe cl. > > > > > > > > > > > > Kindly look into this problem! Your prompt response will highly be > > > appreciated > > > > > > > > > > > > Thank you > > > > > > Ali > > > > > > > > > > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From mhyaqteen at sju.ac.kr Mon Nov 7 06:11:40 2022 From: mhyaqteen at sju.ac.kr (Mohammad Ali Yaqteen) Date: Mon, 7 Nov 2022 12:11:40 +0000 Subject: [petsc-users] PETSc Windows Installation In-Reply-To: References: <2d1a6386-1c0c-2c52-4696-40efbd3b8a11@mcs.anl.gov> Message-ID: Once I finish writing the code, the .exe file will not change. Can I make an .exe file using WSL2 and VScode? Thanks, Ali From: Matthew Knepley Sent: Monday, November 7, 2022 7:13 PM To: Mohammad Ali Yaqteen Cc: petsc-users Subject: Re: [petsc-users] PETSc Windows Installation On Mon, Nov 7, 2022 at 12:21 AM Mohammad Ali Yaqteen > wrote: I have written backend code for a software company. If WSL2 and VSCode(Linux) can be called through a command line and executed at the backend, then it will be great. But if I have to install WSL2 and other required things on every other PC that will run that software, then I think I will be at a disadvantage. What do you suggest? As long as you do not change the architecture and the compiler libraries are available, you can run the executable. Thanks, Matt Thank you Ali -----Original Message----- From: Satish Balay > Sent: Monday, November 7, 2022 12:00 AM To: Matthew Knepley > Cc: Mohammad Ali Yaqteen >; petsc-users at mcs.anl.gov Subject: Re: [petsc-users] PETSc Windows Installation Likely the compilers are not setup correctly as per instructions. https://petsc.org/release/install/windows/#installation-with-microsoft-intel-windows-compilers And if you do not have a specific windows need - and only need IDE - perhaps a WSL2 (aka linux) install with VSCode(linux) might be the way to go. Satish On Sun, 6 Nov 2022, Matthew Knepley wrote: > We need to see configure.log to see what is going on. Can you send it? > > Thanks, > > Matt > > On Sun, Nov 6, 2022 at 4:29 AM Mohammad Ali Yaqteen > > > wrote: > > > Dear Sir/Madam, > > > > > > > > I am installing PETSc on windows but it keeps giving me unexpected errors. 
> > I want to use it on MS Visual Studio or Codeblocks. When I use the > > command on your webpage (./configure --with-cc='win32fe cl' > > --with-fc='win32fe ifort' --with-cxx='win32fe cl' --with-mpi=0 > > --download-fblaslapack), I get the following error message: > > > > > > > > $ ./configure --with-cc='win32fe cl' --with-fc='win32fe ifort' > > --with-cxx='win32fe cl' --with-mpi=0 --download-fblaslapack > > > > > > ==================================================================== > > ========================= > > > > Configuring PETSc to compile on your system > > > > > > ==================================================================== > > ========================= > > > > TESTING: checkCCompiler from > > config.setCompilers(config/BuildSystem/config/setCompilers.py:1341)* > > ******************************************************************** > > ********** > > > > UNABLE to CONFIGURE with GIVEN OPTIONS (see configure.log for > > details): > > > > > > -------------------------------------------------------------------- > > ----------- > > > > C compiler you provided with -with-cc=win32fe cl cannot be found or > > does not work. > > > > Cannot compile/link C with > > /home/SEJONG/petsc-3.18.1/lib/petsc/bin/win32fe/win32fe cl. > > > > > > > > Kindly look into this problem! Your prompt response will highly be > > appreciated > > > > > > > > Thank you > > > > Ali > > > > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From knepley at gmail.com Mon Nov 7 06:29:53 2022 From: knepley at gmail.com (Matthew Knepley) Date: Mon, 7 Nov 2022 07:29:53 -0500 Subject: [petsc-users] PETSc Windows Installation In-Reply-To: References: <2d1a6386-1c0c-2c52-4696-40efbd3b8a11@mcs.anl.gov> Message-ID: On Mon, Nov 7, 2022 at 7:11 AM Mohammad Ali Yaqteen wrote: > Once I finish writing the code, the .exe file will not change. Can I make > an .exe file using WSL2 and VScode? > If you build in WSL2, it will link to system libraries. You would probably need to run in WSL2 after that. If you are planning on running on native Windows, you likely need to build there. Thanks, Matt > Thanks, > > Ali > > > > *From:* Matthew Knepley > *Sent:* Monday, November 7, 2022 7:13 PM > *To:* Mohammad Ali Yaqteen > *Cc:* petsc-users > *Subject:* Re: [petsc-users] PETSc Windows Installation > > > > On Mon, Nov 7, 2022 at 12:21 AM Mohammad Ali Yaqteen > wrote: > > I have written backend code for a software company. If WSL2 and > VSCode(Linux) can be called through a command line and executed at the > backend, then it will be great. But if I have to install WSL2 and other > required things on every other PC that will run that software, then I think > I will be at a disadvantage. What do you suggest? > > > > As long as you do not change the architecture and the compiler libraries > are available, you can run the executable. > > > > Thanks, > > > > Matt > > > > Thank you > Ali > > -----Original Message----- > From: Satish Balay > Sent: Monday, November 7, 2022 12:00 AM > To: Matthew Knepley > Cc: Mohammad Ali Yaqteen ; petsc-users at mcs.anl.gov > Subject: Re: [petsc-users] PETSc Windows Installation > > Likely the compilers are not setup correctly as per instructions. > > > https://petsc.org/release/install/windows/#installation-with-microsoft-intel-windows-compilers > > And if you do not have a specific windows need - and only need IDE - > perhaps a WSL2 (aka linux) install with VSCode(linux) might be the way to > go. 
> > Satish > > On Sun, 6 Nov 2022, Matthew Knepley wrote: > > > We need to see configure.log to see what is going on. Can you send it? > > > > Thanks, > > > > Matt > > > > On Sun, Nov 6, 2022 at 4:29 AM Mohammad Ali Yaqteen > > > > wrote: > > > > > Dear Sir/Madam, > > > > > > > > > > > > I am installing PETSc on windows but it keeps giving me unexpected > errors. > > > I want to use it on MS Visual Studio or Codeblocks. When I use the > > > command on your webpage (./configure --with-cc='win32fe cl' > > > --with-fc='win32fe ifort' --with-cxx='win32fe cl' --with-mpi=0 > > > --download-fblaslapack), I get the following error message: > > > > > > > > > > > > $ ./configure --with-cc='win32fe cl' --with-fc='win32fe ifort' > > > --with-cxx='win32fe cl' --with-mpi=0 --download-fblaslapack > > > > > > > > > ==================================================================== > > > ========================= > > > > > > Configuring PETSc to compile on your system > > > > > > > > > ==================================================================== > > > ========================= > > > > > > TESTING: checkCCompiler from > > > config.setCompilers(config/BuildSystem/config/setCompilers.py:1341)* > > > ******************************************************************** > > > ********** > > > > > > UNABLE to CONFIGURE with GIVEN OPTIONS (see configure.log > for > > > details): > > > > > > > > > -------------------------------------------------------------------- > > > ----------- > > > > > > C compiler you provided with -with-cc=win32fe cl cannot be found or > > > does not work. > > > > > > Cannot compile/link C with > > > /home/SEJONG/petsc-3.18.1/lib/petsc/bin/win32fe/win32fe cl. > > > > > > > > > > > > Kindly look into this problem! 
Your prompt response will highly be > > > appreciated > > > > > > > > > > > > Thank you > > > > > > Ali > > > > > > > > > > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From badi.hamid at gmail.com Mon Nov 7 06:38:40 2022 From: badi.hamid at gmail.com (hamid badi) Date: Mon, 7 Nov 2022 13:38:40 +0100 Subject: [petsc-users] PETSc Windows Installation In-Reply-To: References: <2d1a6386-1c0c-2c52-4696-40efbd3b8a11@mcs.anl.gov> Message-ID: You can try gcc/clang cross-compilers; it's a little bit tricky, I had to change some PETSc code, but it works fine. On Mon, 7 Nov 2022 at 13:30, Matthew Knepley wrote: > On Mon, Nov 7, 2022 at 7:11 AM Mohammad Ali Yaqteen > wrote: > >> Once I finish writing the code, the .exe file will not change. Can I make >> an .exe file using WSL2 and VScode? >> > > If you build in WSL2, it will link to system libraries. You would probably > need to run in WSL2 after that. If you are planning > on running on native Windows, you likely need to build there. > > Thanks, > > Matt > > >> Thanks, >> >> Ali >> >> >> >> *From:* Matthew Knepley >> *Sent:* Monday, November 7, 2022 7:13 PM >> *To:* Mohammad Ali Yaqteen >> *Cc:* petsc-users >> *Subject:* Re: [petsc-users] PETSc Windows Installation >> >> >> >> On Mon, Nov 7, 2022 at 12:21 AM Mohammad Ali Yaqteen >> wrote: >> >> I have written backend code for a software company. If WSL2 and >> VSCode(Linux) can be called through a command line and executed at the >> backend, then it will be great.
But if I have to install WSL2 and other >> required things on every other PC that will run that software, then I think >> I will be at a disadvantage. What do you suggest? >> >> >> >> As long as you do not change the architecture and the compiler libraries >> are available, you can run the executable. >> >> >> >> Thanks, >> >> >> >> Matt >> >> >> >> Thank you >> Ali >> >> -----Original Message----- >> From: Satish Balay >> Sent: Monday, November 7, 2022 12:00 AM >> To: Matthew Knepley >> Cc: Mohammad Ali Yaqteen ; petsc-users at mcs.anl.gov >> Subject: Re: [petsc-users] PETSc Windows Installation >> >> Likely the compilers are not setup correctly as per instructions. >> >> >> https://petsc.org/release/install/windows/#installation-with-microsoft-intel-windows-compilers >> >> And if you do not have a specific windows need - and only need IDE - >> perhaps a WSL2 (aka linux) install with VSCode(linux) might be the way to >> go. >> >> Satish >> >> On Sun, 6 Nov 2022, Matthew Knepley wrote: >> >> > We need to see configure.log to see what is going on. Can you send it? >> > >> > Thanks, >> > >> > Matt >> > >> > On Sun, Nov 6, 2022 at 4:29 AM Mohammad Ali Yaqteen >> > >> > wrote: >> > >> > > Dear Sir/Madam, >> > > >> > > >> > > >> > > I am installing PETSc on windows but it keeps giving me unexpected >> errors. >> > > I want to use it on MS Visual Studio or Codeblocks. 
When I use the >> > > command on your webpage (./configure --with-cc='win32fe cl' >> > > --with-fc='win32fe ifort' --with-cxx='win32fe cl' --with-mpi=0 >> > > --download-fblaslapack), I get the following error message: >> > > >> > > >> > > >> > > $ ./configure --with-cc='win32fe cl' --with-fc='win32fe ifort' >> > > --with-cxx='win32fe cl' --with-mpi=0 --download-fblaslapack >> > > >> > > >> > > ==================================================================== >> > > ========================= >> > > >> > > Configuring PETSc to compile on your system >> > > >> > > >> > > ==================================================================== >> > > ========================= >> > > >> > > TESTING: checkCCompiler from >> > > config.setCompilers(config/BuildSystem/config/setCompilers.py:1341)* >> > > ******************************************************************** >> > > ********** >> > > >> > > UNABLE to CONFIGURE with GIVEN OPTIONS (see configure.log >> for >> > > details): >> > > >> > > >> > > -------------------------------------------------------------------- >> > > ----------- >> > > >> > > C compiler you provided with -with-cc=win32fe cl cannot be found or >> > > does not work. >> > > >> > > Cannot compile/link C with >> > > /home/SEJONG/petsc-3.18.1/lib/petsc/bin/win32fe/win32fe cl. >> > > >> > > >> > > >> > > Kindly look into this problem! Your prompt response will highly be >> > > appreciated >> > > >> > > >> > > >> > > Thank you >> > > >> > > Ali >> > > >> > >> > >> > >> >> >> >> >> -- >> >> What most experimenters take for granted before they begin their >> experiments is infinitely more interesting than any results to which their >> experiments lead. >> -- Norbert Wiener >> >> >> >> https://www.cse.buffalo.edu/~knepley/ >> >> > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. 
> -- Norbert Wiener > > https://www.cse.buffalo.edu/~knepley/ > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From pierre at joliv.et Mon Nov 7 06:50:13 2022 From: pierre at joliv.et (Pierre Jolivet) Date: Mon, 7 Nov 2022 13:50:13 +0100 Subject: [petsc-users] PETSc Windows Installation In-Reply-To: References: <2d1a6386-1c0c-2c52-4696-40efbd3b8a11@mcs.anl.gov> Message-ID: Or you can use MinGW, it's not tricky, you don't need to change any PETSc code, and you can ship .exe for either x86_64 (mingw-w64-x86_64-gcc) or ARM (mingw-w64-clang-aarch64-clang, without MPI). Thanks, Pierre > On 7 Nov 2022, at 1:38 PM, hamid badi wrote: > > You can try gcc/clang cross-compilers; it's a little bit tricky, I had to change some PETSc code, but it works fine. > > On Mon, 7 Nov 2022 at 13:30, Matthew Knepley > wrote: >> On Mon, Nov 7, 2022 at 7:11 AM Mohammad Ali Yaqteen > wrote: >>> Once I finish writing the code, the .exe file will not change. Can I make an .exe file using WSL2 and VScode? >>> >> >> If you build in WSL2, it will link to system libraries. You would probably need to run in WSL2 after that. If you are planning >> on running on native Windows, you likely need to build there. >> >> Thanks, >> >> Matt >> >>> Thanks, >>> >>> Ali >>> >>> >>> >>> From: Matthew Knepley > >>> Sent: Monday, November 7, 2022 7:13 PM >>> To: Mohammad Ali Yaqteen > >>> Cc: petsc-users > >>> Subject: Re: [petsc-users] PETSc Windows Installation >>> >>> >>> >>> On Mon, Nov 7, 2022 at 12:21 AM Mohammad Ali Yaqteen > wrote: >>> >>> I have written backend code for a software company. If WSL2 and VSCode(Linux) can be called through a command line and executed at the backend, then it will be great. But if I have to install WSL2 and other required things on every other PC that will run that software, then I think I will be at a disadvantage. What do you suggest?
>>> >>> >>> >>> As long as you do not change the architecture and the compiler libraries are available, you can run the executable. >>> >>> >>> >>> Thanks, >>> >>> >>> >>> Matt >>> >>> >>> >>> Thank you >>> Ali >>> >>> -----Original Message----- >>> From: Satish Balay > >>> Sent: Monday, November 7, 2022 12:00 AM >>> To: Matthew Knepley > >>> Cc: Mohammad Ali Yaqteen >; petsc-users at mcs.anl.gov >>> Subject: Re: [petsc-users] PETSc Windows Installation >>> >>> Likely the compilers are not setup correctly as per instructions. >>> >>> https://petsc.org/release/install/windows/#installation-with-microsoft-intel-windows-compilers >>> >>> And if you do not have a specific windows need - and only need IDE - perhaps a WSL2 (aka linux) install with VSCode(linux) might be the way to go. >>> >>> Satish >>> >>> On Sun, 6 Nov 2022, Matthew Knepley wrote: >>> >>> > We need to see configure.log to see what is going on. Can you send it? >>> > >>> > Thanks, >>> > >>> > Matt >>> > >>> > On Sun, Nov 6, 2022 at 4:29 AM Mohammad Ali Yaqteen >>> > > >>> > wrote: >>> > >>> > > Dear Sir/Madam, >>> > > >>> > > >>> > > >>> > > I am installing PETSc on windows but it keeps giving me unexpected errors. >>> > > I want to use it on MS Visual Studio or Codeblocks. 
When I use the >>> > > command on your webpage (./configure --with-cc='win32fe cl' >>> > > --with-fc='win32fe ifort' --with-cxx='win32fe cl' --with-mpi=0 >>> > > --download-fblaslapack), I get the following error message: >>> > > >>> > > >>> > > >>> > > $ ./configure --with-cc='win32fe cl' --with-fc='win32fe ifort' >>> > > --with-cxx='win32fe cl' --with-mpi=0 --download-fblaslapack >>> > > >>> > > >>> > > ==================================================================== >>> > > ========================= >>> > > >>> > > Configuring PETSc to compile on your system >>> > > >>> > > >>> > > ==================================================================== >>> > > ========================= >>> > > >>> > > TESTING: checkCCompiler from >>> > > config.setCompilers(config/BuildSystem/config/setCompilers.py:1341)* >>> > > ******************************************************************** >>> > > ********** >>> > > >>> > > UNABLE to CONFIGURE with GIVEN OPTIONS (see configure.log for >>> > > details): >>> > > >>> > > >>> > > -------------------------------------------------------------------- >>> > > ----------- >>> > > >>> > > C compiler you provided with -with-cc=win32fe cl cannot be found or >>> > > does not work. >>> > > >>> > > Cannot compile/link C with >>> > > /home/SEJONG/petsc-3.18.1/lib/petsc/bin/win32fe/win32fe cl. >>> > > >>> > > >>> > > >>> > > Kindly look into this problem! Your prompt response will highly be >>> > > appreciated >>> > > >>> > > >>> > > >>> > > Thank you >>> > > >>> > > Ali >>> > > >>> > >>> > >>> > >>> >>> >>> >>> >>> >>> -- >>> >>> What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. >>> -- Norbert Wiener >>> >>> >>> >>> https://www.cse.buffalo.edu/~knepley/ >> >> -- >> What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. 
>> -- Norbert Wiener >> >> https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From pierre at joliv.et Mon Nov 7 07:06:11 2022 From: pierre at joliv.et (Pierre Jolivet) Date: Mon, 7 Nov 2022 14:06:11 +0100 Subject: [petsc-users] PETSc Windows Installation In-Reply-To: References: <2d1a6386-1c0c-2c52-4696-40efbd3b8a11@mcs.anl.gov> Message-ID: Please, keep the list in copy. You can get MSYS2 from https://www.msys2.org/ Then install the following packages: https://github.com/FreeFem/FreeFem-sources/tree/master/etc/jenkins/deployRelease#windows-system Also install MS-MPI: https://www.microsoft.com/en-us/download/details.aspx?id=100593 Configure and compile PETSc under a MSYS2 MinGW x64 shell. Compile your code, and copy the binary. Notice in my screenshot that there are two shells: the MinGW one for building PETSc, and the Microsoft (native) one for launching the binary. Thanks, Pierre > On 7 Nov 2022, at 1:53 PM, Mohammad Ali Yaqteen wrote: > > Is there a guide for it? That would be very useful! Because I have been trying a lot of things but every now and then there is a little step that is either outdated or can't run! > > Your help will be highly appreciated > > Thanks > Ali > > From: Pierre Jolivet > Sent: Monday, November 7, 2022 9:50 PM > To: Mohammad Ali Yaqteen > Cc: petsc-users > Subject: Re: [petsc-users] PETSc Windows Installation > > Or you can use MinGW, it's not tricky, you don't need to change any PETSc code, and you can ship .exe for either x86_64 (mingw-w64-x86_64-gcc) or ARM (mingw-w64-clang-aarch64-clang, without MPI). > > Thanks, > Pierre > > > On 7 Nov 2022, at 1:38 PM, hamid badi > wrote: > > You can try gcc/clang cross-compilers; it's a little bit tricky, I had to change some PETSc code, but it works fine. > > On Mon, 7 Nov 2022 at
13:30, Matthew Knepley > wrote: > On Mon, Nov 7, 2022 at 7:11 AM Mohammad Ali Yaqteen > wrote: > Once I finish writing the code, the .exe file will not change. Can I make an .exe file using WSL2 and VScode? > > If you build in WSL2, it will link to system libraries. You would probably need to run in WSL2 after that. If you are planning > on running on native Windows, you likely need to build there. > > Thanks, > > Matt > > Thanks, > Ali > > From: Matthew Knepley > > Sent: Monday, November 7, 2022 7:13 PM > To: Mohammad Ali Yaqteen > > Cc: petsc-users > > Subject: Re: [petsc-users] PETSc Windows Installation > > On Mon, Nov 7, 2022 at 12:21 AM Mohammad Ali Yaqteen > wrote: > I have written backend code for a software company. If WSL2 and VSCode(Linux) can be called through a command line and executed at the backend, then it will be great. But if I have to install WSL2 and other required things on every other PC that will run that software, then I think I will be at a disadvantage. What do you suggest? > > As long as you do not change the architecture and the compiler libraries are available, you can run the executable. > > Thanks, > > Matt > > Thank you > Ali > > -----Original Message----- > From: Satish Balay > > Sent: Monday, November 7, 2022 12:00 AM > To: Matthew Knepley > > Cc: Mohammad Ali Yaqteen >; petsc-users at mcs.anl.gov > Subject: Re: [petsc-users] PETSc Windows Installation > > Likely the compilers are not setup correctly as per instructions. > > https://petsc.org/release/install/windows/#installation-with-microsoft-intel-windows-compilers > > And if you do not have a specific windows need - and only need IDE - perhaps a WSL2 (aka linux) install with VSCode(linux) might be the way to go. > > Satish > > On Sun, 6 Nov 2022, Matthew Knepley wrote: > > > We need to see configure.log to see what is going on. Can you send it?
> > > > Thanks, > > > > Matt > > > > On Sun, Nov 6, 2022 at 4:29 AM Mohammad Ali Yaqteen > > > > > wrote: > > > > > Dear Sir/Madam, > > > > > > > > > > > > I am installing PETSc on windows but it keeps giving me unexpected errors. > > > I want to use it on MS Visual Studio or Codeblocks. When I use the > > > command on your webpage (./configure --with-cc='win32fe cl' > > > --with-fc='win32fe ifort' --with-cxx='win32fe cl' --with-mpi=0 > > > --download-fblaslapack), I get the following error message: > > > > > > > > > > > > $ ./configure --with-cc='win32fe cl' --with-fc='win32fe ifort' > > > --with-cxx='win32fe cl' --with-mpi=0 --download-fblaslapack > > > > > > > > > ==================================================================== > > > ========================= > > > > > > Configuring PETSc to compile on your system > > > > > > > > > ==================================================================== > > > ========================= > > > > > > TESTING: checkCCompiler from > > > config.setCompilers(config/BuildSystem/config/setCompilers.py:1341)* > > > ******************************************************************** > > > ********** > > > > > > UNABLE to CONFIGURE with GIVEN OPTIONS (see configure.log for > > > details): > > > > > > > > > -------------------------------------------------------------------- > > > ----------- > > > > > > C compiler you provided with -with-cc=win32fe cl cannot be found or > > > does not work. > > > > > > Cannot compile/link C with > > > /home/SEJONG/petsc-3.18.1/lib/petsc/bin/win32fe/win32fe cl. > > > > > > > > > > > > Kindly look into this problem! Your prompt response will highly be > > > appreciated > > > > > > > > > > > > Thank you > > > > > > Ali > > > > > > > > > > > > > -- > What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. 
> -- Norbert Wiener > > https://www.cse.buffalo.edu/~knepley/ > > > -- > What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. > -- Norbert Wiener > > https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: Screenshot 2022-11-07 at 2.03.59 PM.png Type: image/png Size: 531601 bytes Desc: not available URL: From mhyaqteen at sju.ac.kr Mon Nov 7 07:29:16 2022 From: mhyaqteen at sju.ac.kr (Mohammad Ali Yaqteen) Date: Mon, 7 Nov 2022 13:29:16 +0000 Subject: [petsc-users] PETSc Windows Installation In-Reply-To: References: <2d1a6386-1c0c-2c52-4696-40efbd3b8a11@mcs.anl.gov> Message-ID: Do I have to follow all the steps in the first link as it says the following in step 6: $ pacman -S mingw-w64-ucrt-x86_64-gcc resolving dependencies... looking for conflicting packages... Packages (15) mingw-w64-ucrt-x86_64-binutils-2.39-2 mingw-w64-ucrt-x86_64-crt-git-10.0.0.r68.g6eb571448-1 mingw-w64-ucrt-x86_64-gcc-libs-12.2.0-1 mingw-w64-ucrt-x86_64-gmp-6.2.1-3 mingw-w64-ucrt-x86_64-headers-git-10.0.0.r68.g6eb571448-1 mingw-w64-ucrt-x86_64-isl-0.25-1 mingw-w64-ucrt-x86_64-libiconv-1.17-1 mingw-w64-ucrt-x86_64-libwinpthread-git-10.0.0.r68.g6eb571448-1 mingw-w64-ucrt-x86_64-mpc-1.2.1-1 mingw-w64-ucrt-x86_64-mpfr-4.1.0.p13-1 mingw-w64-ucrt-x86_64-windows-default-manifest-6.4-4 mingw-w64-ucrt-x86_64-winpthreads-git-10.0.0.r68.g6eb571448-1 mingw-w64-ucrt-x86_64-zlib-1.2.12-1 mingw-w64-ucrt-x86_64-zstd-1.5.2-2 mingw-w64-ucrt-x86_64-gcc-12.2.0-1 Total Installed Size: 397.59 MiB :: Proceed with installation? [Y/n] [... downloading and installation continues ...] 
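
The pacman transcript above installs the UCRT64 toolchain. Pierre's instructions target the plain MinGW x64 environment, whose packages use a different prefix; a rough sketch (package names assumed from MSYS2's standard prefix scheme — verify against the linked FreeFem list):

```shell
# Run inside the "MSYS2 MinGW x64" shell, not the UCRT64 one.
# mingw-w64-x86_64-* packages target the MINGW64 environment,
# while the mingw-w64-ucrt-x86_64-* packages above target UCRT64.
pacman -S mingw-w64-x86_64-gcc mingw-w64-x86_64-gcc-fortran mingw-w64-x86_64-make
```

The package prefix determines which shell (MINGW64 vs. UCRT64) the installed compilers are usable from.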
Thanks Ali From: Pierre Jolivet Sent: Monday, November 7, 2022 10:06 PM To: Mohammad Ali Yaqteen Cc: petsc-users Subject: Re: [petsc-users] PETSc Windows Installation Please, keep the list in copy. You can get MSYS2 from https://www.msys2.org/ Then install the following packages: https://github.com/FreeFem/FreeFem-sources/tree/master/etc/jenkins/deployRelease#windows-system Also install MS-MPI: https://www.microsoft.com/en-us/download/details.aspx?id=100593 Configure and compile PETSc under a MSYS2 MinGW x64 shell. Compile your code, and copy the binary. Notice in my screenshot that there are two shells, the MinGW one for building PETSc. The Microsoft (native one) for launching the binary. Thanks, Pierre [cid:image001.png at 01D8F2F8.5D863F80] On 7 Nov 2022, at 1:53 PM, Mohammad Ali Yaqteen > wrote: Is there a guide for it? That would be very useful! Because I have been trying a lot of things but every now and then there is a little step that is either outdated or can?t run! Your help will be highly appreciated Thanks Ali From: Pierre Jolivet > Sent: Monday, November 7, 2022 9:50 PM To: Mohammad Ali Yaqteen > Cc: petsc-users > Subject: Re: [petsc-users] PETSc Windows Installation Or you can use MinGW, it?s not tricky, you don?t need to change any PETSc code, and you can ship .exe for either x86_64 (mingw-w64-x86_64-gcc) or ARM (mingw-w64-clang-aarch64-clang, without MPI). Thanks, Pierre On 7 Nov 2022, at 1:38 PM, hamid badi > wrote: You can try gcc/clang cross-compilers, it's a little but tricky, i had to change some petsc codes but it works fine. Le lun. 7 nov. 2022 ? 13:30, Matthew Knepley > a ?crit : On Mon, Nov 7, 2022 at 7:11 AM Mohammad Ali Yaqteen > wrote: Once I finish writing the code, the .exe file will not change. Can I make an .exe file using WSL2 and VScode? If you build in WSL2, it will link to system libraries. You would probably need to run in WSL2 after that. If you are planning on running on native Windows, you likely need to build there. 
Thanks, Matt Thanks, Ali From: Matthew Knepley > Sent: Monday, November 7, 2022 7:13 PM To: Mohammad Ali Yaqteen > Cc: petsc-users > Subject: Re: [petsc-users] PETSc Windows Installation On Mon, Nov 7, 2022 at 12:21 AM Mohammad Ali Yaqteen > wrote: I have written backend code for a software company. If WSL2 and VSCode(Linux) can be called through a command line and executed at the backend, then it will be great. But if I have to install WSL2 and other required things on every other PC that will run that software, then I think I will be at a disadvantage. What do you suggest? As long as you do not change the architecture and the compiler libraries are available, you can run the executable. Thanks, Matt Thank you Ali -----Original Message----- From: Satish Balay > Sent: Monday, November 7, 2022 12:00 AM To: Matthew Knepley > Cc: Mohammad Ali Yaqteen >; petsc-users at mcs.anl.gov Subject: Re: [petsc-users] PETSc Windows Installation Likely the compilers are not setup correctly as per instructions. https://petsc.org/release/install/windows/#installation-with-microsoft-intel-windows-compilers And if you do not have a specific windows need - and only need IDE - perhaps a WSL2 (aka linux) install with VSCode(linux) might be the way to go. Satish On Sun, 6 Nov 2022, Matthew Knepley wrote: > We need to see configure.log to see what is going on. Can you send it? > > Thanks, > > Matt > > On Sun, Nov 6, 2022 at 4:29 AM Mohammad Ali Yaqteen > > > wrote: > > > Dear Sir/Madam, > > > > > > > > I am installing PETSc on windows but it keeps giving me unexpected errors. > > I want to use it on MS Visual Studio or Codeblocks. 
When I use the > > command on your webpage (./configure --with-cc='win32fe cl' > > --with-fc='win32fe ifort' --with-cxx='win32fe cl' --with-mpi=0 > > --download-fblaslapack), I get the following error message: > > > > > > > > $ ./configure --with-cc='win32fe cl' --with-fc='win32fe ifort' > > --with-cxx='win32fe cl' --with-mpi=0 --download-fblaslapack > > > > > > ==================================================================== > > ========================= > > > > Configuring PETSc to compile on your system > > > > > > ==================================================================== > > ========================= > > > > TESTING: checkCCompiler from > > config.setCompilers(config/BuildSystem/config/setCompilers.py:1341)* > > ******************************************************************** > > ********** > > > > UNABLE to CONFIGURE with GIVEN OPTIONS (see configure.log for > > details): > > > > > > -------------------------------------------------------------------- > > ----------- > > > > C compiler you provided with -with-cc=win32fe cl cannot be found or > > does not work. > > > > Cannot compile/link C with > > /home/SEJONG/petsc-3.18.1/lib/petsc/bin/win32fe/win32fe cl. > > > > > > > > Kindly look into this problem! Your prompt response will highly be > > appreciated > > > > > > > > Thank you > > > > Ali > > > > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... 
Name: image001.png Type: image/png Size: 531601 bytes Desc: image001.png URL: From pierre at joliv.et Mon Nov 7 07:38:06 2022 From: pierre at joliv.et (Pierre Jolivet) Date: Mon, 7 Nov 2022 14:38:06 +0100 Subject: [petsc-users] PETSc Windows Installation In-Reply-To: References: <2d1a6386-1c0c-2c52-4696-40efbd3b8a11@mcs.anl.gov> Message-ID: <992C4377-8755-4EEA-83BC-4C1606C18AE9@joliv.et> You are not running under a MinGW x64 shell, but a MinGW UCRT (Universal C Runtime) x64 shell instead. This may work, but I?ve never tried it myself. Thanks, Pierre > On 7 Nov 2022, at 2:29 PM, Mohammad Ali Yaqteen wrote: > > Do I have to follow all the steps in the first link as it says the following in step 6: > > $ pacman -S mingw-w64-ucrt-x86_64-gcc > resolving dependencies... > looking for conflicting packages... > > Packages (15) mingw-w64-ucrt-x86_64-binutils-2.39-2 > mingw-w64-ucrt-x86_64-crt-git-10.0.0.r68.g6eb571448-1 > mingw-w64-ucrt-x86_64-gcc-libs-12.2.0-1 mingw-w64-ucrt-x86_64-gmp-6.2.1-3 > mingw-w64-ucrt-x86_64-headers-git-10.0.0.r68.g6eb571448-1 > mingw-w64-ucrt-x86_64-isl-0.25-1 mingw-w64-ucrt-x86_64-libiconv-1.17-1 > mingw-w64-ucrt-x86_64-libwinpthread-git-10.0.0.r68.g6eb571448-1 > mingw-w64-ucrt-x86_64-mpc-1.2.1-1 mingw-w64-ucrt-x86_64-mpfr-4.1.0.p13-1 > mingw-w64-ucrt-x86_64-windows-default-manifest-6.4-4 > mingw-w64-ucrt-x86_64-winpthreads-git-10.0.0.r68.g6eb571448-1 > mingw-w64-ucrt-x86_64-zlib-1.2.12-1 mingw-w64-ucrt-x86_64-zstd-1.5.2-2 > mingw-w64-ucrt-x86_64-gcc-12.2.0-1 > > Total Installed Size: 397.59 MiB > > :: Proceed with installation? [Y/n] > [... downloading and installation continues ...] > > Thanks > Ali > > > From: Pierre Jolivet > > Sent: Monday, November 7, 2022 10:06 PM > To: Mohammad Ali Yaqteen > > Cc: petsc-users > > Subject: Re: [petsc-users] PETSc Windows Installation > > Please, keep the list in copy. 
> You can get MSYS2 from https://www.msys2.org/ > Then install the following packages: https://github.com/FreeFem/FreeFem-sources/tree/master/etc/jenkins/deployRelease#windows-system > Also install MS-MPI: https://www.microsoft.com/en-us/download/details.aspx?id=100593 > Configure and compile PETSc under a MSYS2 MinGW x64 shell. > Compile your code, and copy the binary. > Notice in my screenshot that there are two shells, the MinGW one for building PETSc. > The Microsoft (native one) for launching the binary. > > Thanks, > Pierre > > > > > On 7 Nov 2022, at 1:53 PM, Mohammad Ali Yaqteen > wrote: > > Is there a guide for it? That would be very useful! Because I have been trying a lot of things but every now and then there is a little step that is either outdated or can?t run! > > Your help will be highly appreciated > > Thanks > Ali > > From: Pierre Jolivet > > Sent: Monday, November 7, 2022 9:50 PM > To: Mohammad Ali Yaqteen > > Cc: petsc-users > > Subject: Re: [petsc-users] PETSc Windows Installation > > Or you can use MinGW, it?s not tricky, you don?t need to change any PETSc code, and you can ship .exe for either x86_64 (mingw-w64-x86_64-gcc) or ARM (mingw-w64-clang-aarch64-clang, without MPI). > > Thanks, > Pierre > > > > On 7 Nov 2022, at 1:38 PM, hamid badi > wrote: > > You can try gcc/clang cross-compilers, it's a little but tricky, i had to change some petsc codes but it works fine. > > Le lun. 7 nov. 2022 ? 13:30, Matthew Knepley > a ?crit : > On Mon, Nov 7, 2022 at 7:11 AM Mohammad Ali Yaqteen > wrote: > Once I finish writing the code, the .exe file will not change. Can I make an .exe file using WSL2 and VScode? > > If you build in WSL2, it will link to system libraries. You would probably need to run in WSL2 after that. If you are planning > on running on native Windows, you likely need to build there. 
> > Thanks, > > Matt > > Thanks, > Ali > > From: Matthew Knepley > > Sent: Monday, November 7, 2022 7:13 PM > To: Mohammad Ali Yaqteen > > Cc: petsc-users > > Subject: Re: [petsc-users] PETSc Windows Installation > > On Mon, Nov 7, 2022 at 12:21 AM Mohammad Ali Yaqteen > wrote: > I have written backend code for a software company. If WSL2 and VSCode(Linux) can be called through a command line and executed at the backend, then it will be great. But if I have to install WSL2 and other required things on every other PC that will run that software, then I think I will be at a disadvantage. What do you suggest? > > As long as you do not change the architecture and the compiler libraries are available, you can run the executable. > > Thanks, > > Matt > > Thank you > Ali > > -----Original Message----- > From: Satish Balay > > Sent: Monday, November 7, 2022 12:00 AM > To: Matthew Knepley > > Cc: Mohammad Ali Yaqteen >; petsc-users at mcs.anl.gov > Subject: Re: [petsc-users] PETSc Windows Installation > > Likely the compilers are not setup correctly as per instructions. > > https://petsc.org/release/install/windows/#installation-with-microsoft-intel-windows-compilers > > And if you do not have a specific windows need - and only need IDE - perhaps a WSL2 (aka linux) install with VSCode(linux) might be the way to go. > > Satish > > On Sun, 6 Nov 2022, Matthew Knepley wrote: > > > We need to see configure.log to see what is going on. Can you send it? > > > > Thanks, > > > > Matt > > > > On Sun, Nov 6, 2022 at 4:29 AM Mohammad Ali Yaqteen > > > > > wrote: > > > > > Dear Sir/Madam, > > > > > > > > > > > > I am installing PETSc on windows but it keeps giving me unexpected errors. > > > I want to use it on MS Visual Studio or Codeblocks. 
When I use the > > > command on your webpage (./configure --with-cc='win32fe cl' > > > --with-fc='win32fe ifort' --with-cxx='win32fe cl' --with-mpi=0 > > > --download-fblaslapack), I get the following error message: > > > > > > > > > > > > $ ./configure --with-cc='win32fe cl' --with-fc='win32fe ifort' > > > --with-cxx='win32fe cl' --with-mpi=0 --download-fblaslapack > > > > > > > > > ==================================================================== > > > ========================= > > > > > > Configuring PETSc to compile on your system > > > > > > > > > ==================================================================== > > > ========================= > > > > > > TESTING: checkCCompiler from > > > config.setCompilers(config/BuildSystem/config/setCompilers.py:1341)* > > > ******************************************************************** > > > ********** > > > > > > UNABLE to CONFIGURE with GIVEN OPTIONS (see configure.log for > > > details): > > > > > > > > > -------------------------------------------------------------------- > > > ----------- > > > > > > C compiler you provided with -with-cc=win32fe cl cannot be found or > > > does not work. > > > > > > Cannot compile/link C with > > > /home/SEJONG/petsc-3.18.1/lib/petsc/bin/win32fe/win32fe cl. > > > > > > > > > > > > Kindly look into this problem! Your prompt response will highly be > > > appreciated > > > > > > > > > > > > Thank you > > > > > > Ali > > > > > > > > > > > > > -- > What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. > -- Norbert Wiener > > https://www.cse.buffalo.edu/~knepley/ > > > -- > What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. > -- Norbert Wiener > > https://www.cse.buffalo.edu/~knepley/ > -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From gongding at cn.cogenda.com Mon Nov 7 08:25:54 2022 From: gongding at cn.cogenda.com (Gong Ding) Date: Mon, 7 Nov 2022 22:25:54 +0800 Subject: [petsc-users] petsc crash with float128 Message-ID: <36ca926b-705c-3635-ee1a-d377456d4b3d@cn.cogenda.com> Dear petsc developer, The PETSc linear solver crashes with the following report: [1]PETSC ERROR: ------------------------------------------------------------------------ [1]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range [1]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger [1]PETSC ERROR: or see https://petsc.org/release/faq/#valgrind and https://petsc.org/release/faq/ [1]PETSC ERROR: --------------------- Stack Frames ------------------------------------ [1]PETSC ERROR: The line numbers in the error traceback are not always exact. [1]PETSC ERROR: #1 MatLUFactorNumeric_SeqAIJ_Inode() at /usr/local/petsc-3.18.0/src/mat/impls/aij/seq/inode.c:1259 [1]PETSC ERROR: #2 MatLUFactorNumeric() at /usr/local/petsc-3.18.0/src/mat/interface/matrix.c:3200 [1]PETSC ERROR: #3 PCSetUp_LU() at /usr/local/petsc-3.18.0/src/ksp/pc/impls/factor/lu/lu.c:120 [1]PETSC ERROR: #4 PCSetUp() at /usr/local/petsc-3.18.0/src/ksp/pc/interface/precon.c:994 [1]PETSC ERROR: #5 KSPSetUp() at /usr/local/petsc-3.18.0/src/ksp/ksp/interface/itfunc.c:406 [1]PETSC ERROR: #6 PCSetUpOnBlocks_ASM() at /usr/local/petsc-3.18.0/src/ksp/pc/impls/asm/asm.c:417 [1]PETSC ERROR: #7 PCSetUpOnBlocks() at /usr/local/petsc-3.18.0/src/ksp/pc/interface/precon.c:1027 [1]PETSC ERROR: #8 KSPSetUpOnBlocks() at /usr/local/petsc-3.18.0/src/ksp/ksp/interface/itfunc.c:219 [1]PETSC ERROR: #9 KSPSolve_Private() at /usr/local/petsc-3.18.0/src/ksp/ksp/interface/itfunc.c:826 [1]PETSC ERROR: #10 KSPSolve() at /usr/local/petsc-3.18.0/src/ksp/ksp/interface/itfunc.c:1071 [1]PETSC ERROR: #11 SNESSolve_NEWTONLS() at /usr/local/petsc-3.18.0/src/snes/impls/ls/ls.c:210 [1]PETSC ERROR: #12 SNESSolve() at
/usr/local/petsc-3.18.0/src/snes/interface/snes.c:4689 PETSc is configured by export PETSC_ARCH=arch-linux2-float128-gcc export MPICH_CC=gcc export MPICH_CXX=g++ export MPICH_F77=gfortran export MPICH_F90=gfortran python3 configure --with-precision=__float128 --download-f2cblaslapack=1 --with-debugging=yes --with-x=0 --with-pic=1 --with-mpi-dir=/usr/local/mpich-3.4.2/ COPTFLAGS="-O3 -mavx2" CXXOPTFLAGS="-O3 -mavx2" FOPTFLAGS="-O3 -mavx2" --force make and run with 8 MPI processes. The linear solver is set as ierr = KSPSetType (ksp, (char*) KSPBCGSL); assert(!ierr); ierr = PCSetType (pc, (char*) PCASM); assert(!ierr); ierr = set_petsc_option("-sub_ksp_type","preonly"); assert(!ierr); ierr = set_petsc_option("-sub_pc_type","lu"); assert(!ierr); ierr = set_petsc_option("-sub_pc_factor_reuse_fill","1"); assert(!ierr); ierr = set_petsc_option("-sub_pc_factor_reuse_ordering","1"); assert(!ierr); ierr = set_petsc_option("-sub_pc_factor_shift_type","NONZERO"); assert(!ierr); If set_petsc_option("-sub_pc_factor_reuse_ordering","1"); is removed, it seems the crash does not happen. (Not fully tested) Hope this bug can be fixed. Gong Ding From knepley at gmail.com Mon Nov 7 08:35:46 2022 From: knepley at gmail.com (Matthew Knepley) Date: Mon, 7 Nov 2022 09:35:46 -0500 Subject: [petsc-users] petsc crash with float128 In-Reply-To: <36ca926b-705c-3635-ee1a-d377456d4b3d@cn.cogenda.com> References: <36ca926b-705c-3635-ee1a-d377456d4b3d@cn.cogenda.com> Message-ID: On Mon, Nov 7, 2022 at 9:27 AM Gong Ding via petsc-users < petsc-users at mcs.anl.gov> wrote: > Dear petsc developer, > > The PETSc linear solver crashes with the following report > It will make it much easier to debug if you can send your matrix. The easiest way to do this is to give the options -ksp_view_mat binary -ksp_view_rhs binary and send that binary file.
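
For reference, the file those options produce is in PETSc's native binary viewer format (big-endian; for an AIJ matrix: a class id of 1211216, then row/column/nonzero counts, per-row nonzero counts, column indices, and finally the values). A rough Python reader sketch for a standard double-precision dump — note that a --with-precision=__float128 build like the one above writes 16-byte reals instead, which this sketch cannot decode:

```python
import struct

MAT_FILE_CLASSID = 1211216  # class id PETSc writes at the start of a binary Mat dump

def read_petsc_aij(path):
    """Read a PETSc binary AIJ matrix dump (big-endian, double-precision values).

    Returns (nrows, ncols, per-row nonzero counts, column indices, values).
    """
    with open(path, "rb") as f:
        classid, nrows, ncols, nnz = struct.unpack(">4i", f.read(16))
        if classid != MAT_FILE_CLASSID:
            raise ValueError(f"not a PETSc Mat file (class id {classid})")
        row_nnz = struct.unpack(f">{nrows}i", f.read(4 * nrows))  # nonzeros per row
        cols = struct.unpack(f">{nnz}i", f.read(4 * nnz))         # column indices
        vals = struct.unpack(f">{nnz}d", f.read(8 * nnz))         # double values
    return nrows, ncols, row_nnz, cols, vals
```

This is only a sketch for inspecting a dump by hand; MatLoad (or petsc4py / PetscBinaryIO) is the supported way to read these files back.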
Thanks, Matt > [1]PETSC ERROR: > ------------------------------------------------------------------------ > [1]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, > probably memory access out of range > [1]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger > [1]PETSC ERROR: or see https://petsc.org/release/faq/#valgrind and > https://petsc.org/release/faq/ > [1]PETSC ERROR: --------------------- Stack Frames > ------------------------------------ > [1]PETSC ERROR: The line numbers in the error traceback are not always > exact. > [1]PETSC ERROR: #1 MatLUFactorNumeric_SeqAIJ_Inode() at > /usr/local/petsc-3.18.0/src/mat/impls/aij/seq/inode.c:1259 > [1]PETSC ERROR: #2 MatLUFactorNumeric() at > /usr/local/petsc-3.18.0/src/mat/interface/matrix.c:3200 > [1]PETSC ERROR: #3 PCSetUp_LU() at > /usr/local/petsc-3.18.0/src/ksp/pc/impls/factor/lu/lu.c:120 > [1]PETSC ERROR: #4 PCSetUp() at > /usr/local/petsc-3.18.0/src/ksp/pc/interface/precon.c:994 > [1]PETSC ERROR: #5 KSPSetUp() at > /usr/local/petsc-3.18.0/src/ksp/ksp/interface/itfunc.c:406 > [1]PETSC ERROR: #6 PCSetUpOnBlocks_ASM() at > /usr/local/petsc-3.18.0/src/ksp/pc/impls/asm/asm.c:417 > [1]PETSC ERROR: #7 PCSetUpOnBlocks() at > /usr/local/petsc-3.18.0/src/ksp/pc/interface/precon.c:1027 > [1]PETSC ERROR: #8 KSPSetUpOnBlocks() at > /usr/local/petsc-3.18.0/src/ksp/ksp/interface/itfunc.c:219 > [1]PETSC ERROR: #9 KSPSolve_Private() at > /usr/local/petsc-3.18.0/src/ksp/ksp/interface/itfunc.c:826 > [1]PETSC ERROR: #10 KSPSolve() at > /usr/local/petsc-3.18.0/src/ksp/ksp/interface/itfunc.c:1071 > [1]PETSC ERROR: #11 SNESSolve_NEWTONLS() at > /usr/local/petsc-3.18.0/src/snes/impls/ls/ls.c:210 > [1]PETSC ERROR: #12 SNESSolve() at > /usr/local/petsc-3.18.0/src/snes/interface/snes.c:4689 > > The petsc is configured by > > export PETSC_ARCH=arch-linux2-float128-gcc > export MPICH_CC=gcc > export MPICH_CXX=g++ > export MPICH_F77=gfortran > export MPICH_F90=gfortran > python3 configure 
--with-precision=__float128 --download-f2cblaslapack=1 > --with-debugging=yes --with-x=0 --with-pic=1 > --with-mpi-dir=/usr/local/mpich-3.4.2/ COPTFLAGS="-O3 -mavx2" > CXXOPTFLAGS="-O3 -mavx2" FOPTFLAGS="-O3 -mavx2" --force > make > > and run with 8 mpi process. > > The linear solver is set as > > ierr = KSPSetType (ksp, (char*) KSPBCGSL); assert(!ierr); > > ierr = PCSetType (pc, (char*) PCASM); assert(!ierr); > ierr = set_petsc_option("-sub_ksp_type","preonly"); > assert(!ierr); > ierr = set_petsc_option("-sub_pc_type","lu"); assert(!ierr); > ierr = set_petsc_option("-sub_pc_factor_reuse_fill","1"); > assert(!ierr); > ierr = set_petsc_option("-sub_pc_factor_reuse_ordering","1"); > assert(!ierr); > ierr = > set_petsc_option("-sub_pc_factor_shift_type","NONZERO"); assert(!ierr); > > > if remove set_petsc_option("-sub_pc_factor_reuse_ordering","1"); > > it seems the crash will not happen. (Not fully tested) > > > Hope this bug can be fixed. > > > Gong Ding > > > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From bourdin at mcmaster.ca Mon Nov 7 09:51:43 2022 From: bourdin at mcmaster.ca (Blaise Bourdin) Date: Mon, 7 Nov 2022 15:51:43 +0000 Subject: [petsc-users] Dof ordering in DMPlexVecGet/Set/RestoreClosure Message-ID: <511602F6-5FC4-4B67-91FB-044A1FD2B561@mcmaster.ca> Hi, How are degree of freedom ordered in calls to DMPlexVec[Set/Get/Restore]Closure? Given a FE mesh and a section for P2-Lagrange elements (i.e. 1 dof per vertex and edge), I was naively assuming that the ?local? vector would contain dof at vertices then edges (in this order), since it matches the section ordering, but it looks like I get edges then vertices? 
Here is my section (my mesh has 8 cells, 16 edges, and 9 vertices) PetscSection Object: U 1 MPI process type not yet set 1 fields field 0 with 1 components Process 0: ( 0) dim 0 offset 0 ( 1) dim 0 offset 0 ( 2) dim 0 offset 0 ( 3) dim 0 offset 0 ( 4) dim 0 offset 0 ( 5) dim 0 offset 0 ( 6) dim 0 offset 0 ( 7) dim 0 offset 0 ( 8) dim 1 offset 0 ( 9) dim 1 offset 1 ( 10) dim 1 offset 2 ( 11) dim 1 offset 3 ( 12) dim 1 offset 4 ( 13) dim 1 offset 5 ( 14) dim 1 offset 6 ( 15) dim 1 offset 7 ( 16) dim 1 offset 8 ( 17) dim 1 offset 9 ( 18) dim 1 offset 10 ( 19) dim 1 offset 11 ( 20) dim 1 offset 12 ( 21) dim 1 offset 13 ( 22) dim 1 offset 14 ( 23) dim 1 offset 15 ( 24) dim 1 offset 16 ( 25) dim 1 offset 17 ( 26) dim 1 offset 18 ( 27) dim 1 offset 19 ( 28) dim 1 offset 20 ( 29) dim 1 offset 21 ( 30) dim 1 offset 22 ( 31) dim 1 offset 23 ( 32) dim 1 offset 24 Start from the following local vector: Vec Object: U 1 MPI process type: seq 1. 2. 3. 4. 5. 6. 7. 8. 9. 10. 11. 12. 13. 14. 15. 16. 17. 18. 19. 20. 21. 22. 23. 24. 25. Call PetscCallA(DMPlexVecGetClosure(dmU,sectionU,U,0_Ki,UArray,ierr)) and get the following for Array: 10.000000000000000 11.000000000000000 12.000000000000000 1.0000000000000000 2.0000000000000000 3.0000000000000000 Is this ordering predictable and documented somewhere? Is it ordered by stratum? Regards, Blaise ? Canada Research Chair in Mathematical and Computational Aspects of Solid Mechanics (Tier 1) Professor, Department of Mathematics & Statistics Hamilton Hall room 409A, McMaster University 1280 Main Street West, Hamilton, Ontario L8S 4K1, Canada https://www.math.mcmaster.ca/bourdin | +1 (905) 525 9140 ext. 27243 From Bruce.Palmer at pnnl.gov Mon Nov 7 10:18:16 2022 From: Bruce.Palmer at pnnl.gov (Palmer, Bruce J) Date: Mon, 7 Nov 2022 16:18:16 +0000 Subject: [petsc-users] Petsc Documentation Message-ID: Hi, The new Petsc documentation pages don't seem to have a search function. Would it be possible to add one? 
I was looking around for the documentation on PetscPrintf and couldn't find it, even on the single index of all petsc man pages. Bruce Palmer Computer Scientist Pacific Northwest National Laboratory (509) 375-3899 -------------- next part -------------- An HTML attachment was scrubbed... URL: From balay at mcs.anl.gov Mon Nov 7 10:30:47 2022 From: balay at mcs.anl.gov (Satish Balay) Date: Mon, 7 Nov 2022 10:30:47 -0600 (CST) Subject: [petsc-users] Petsc Documentation In-Reply-To: References: Message-ID: <82a97890-1fd2-2c1f-8e5a-b86210fd39cc@mcs.anl.gov> I see the search option on the top right of https://petsc.org/release/ [and says ctrl-k is the key binding] Also google search usually gets to the man page. And on the 'single index' page - looks like the visible width is small [compared to whats there] - Ctrl-F is able to find PetscPrintf Satish On Mon, 7 Nov 2022, Palmer, Bruce J via petsc-users wrote: > Hi, > > The new Petsc documentation pages don't seem to have a search function. Would it be possible to add one? I was looking around for the documentation on PetscPrintf and couldn't find it, even on the single index of all petsc man pages. > > Bruce Palmer > Computer Scientist > Pacific Northwest National Laboratory > (509) 375-3899 > > From Bruce.Palmer at pnnl.gov Mon Nov 7 10:43:57 2022 From: Bruce.Palmer at pnnl.gov (Palmer, Bruce J) Date: Mon, 7 Nov 2022 16:43:57 +0000 Subject: [petsc-users] Petsc Documentation In-Reply-To: <82a97890-1fd2-2c1f-8e5a-b86210fd39cc@mcs.anl.gov> References: <82a97890-1fd2-2c1f-8e5a-b86210fd39cc@mcs.anl.gov> Message-ID: My bad for not noticing the magnifying glass in the corner. If you just click on the link single index of all PETSc man pages though and then scroll to the Ps, there is no indication of any extra columns and no horizontal slider bar either. Might be good to put one in. 
Bruce From: Satish Balay Date: Monday, November 7, 2022 at 8:30 AM To: Palmer, Bruce J Cc: 'petsc-users at mcs.anl.gov' Subject: Re: [petsc-users] Petsc Documentation I see the search option on the top right of https://petsc.org/release/ [and says ctrl-k is the key binding] Also google search usually gets to the man page. And on the 'single index' page - looks like the visible width is small [compared to whats there] - Ctrl-F is able to find PetscPrintf Satish On Mon, 7 Nov 2022, Palmer, Bruce J via petsc-users wrote: > Hi, > > The new Petsc documentation pages don't seem to have a search function. Would it be possible to add one? I was looking around for the documentation on PetscPrintf and couldn't find it, even on the single index of all petsc man pages. > > Bruce Palmer > Computer Scientist > Pacific Northwest National Laboratory > (509) 375-3899 > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From Bruce.Palmer at pnnl.gov Mon Nov 7 10:47:57 2022 From: Bruce.Palmer at pnnl.gov (Palmer, Bruce J) Date: Mon, 7 Nov 2022 16:47:57 +0000 Subject: [petsc-users] Petsc Documentation In-Reply-To: <82a97890-1fd2-2c1f-8e5a-b86210fd39cc@mcs.anl.gov> References: <82a97890-1fd2-2c1f-8e5a-b86210fd39cc@mcs.anl.gov> Message-ID: Hmm. Maybe it's an Explorer thing. You can slide the pages left and right to pick up extra columns on Chrome.
From: Satish Balay Date: Monday, November 7, 2022 at 8:30 AM To: Palmer, Bruce J Cc: 'petsc-users at mcs.anl.gov' Subject: Re: [petsc-users] Petsc Documentation I see the search option on the top right of https://petsc.org/release/ [and says ctrl-k is the key binding] Also google search usually gets to the man page. And on the 'single index' page - looks like the visible width is small [compared to whats there] - Ctrl-F is able to find PetscPrintf Satish On Mon, 7 Nov 2022, Palmer, Bruce J via petsc-users wrote: > Hi, > > The new Petsc documentation pages don't seem to have a search function. Would it be possible to add one? I was looking around for the documentation on PetscPrintf and couldn't find it, even on the single index of all petsc man pages. > > Bruce Palmer > Computer Scientist > Pacific Northwest National Laboratory > (509) 375-3899 > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From longtuteng249 at gmail.com Mon Nov 7 10:49:06 2022 From: longtuteng249 at gmail.com (Jianbo Long) Date: Mon, 7 Nov 2022 17:49:06 +0100 Subject: [petsc-users] [petsc-maint] Issues linking petsc header files and lib from FORTRAN codes In-Reply-To: <38802c8b-fca9-a502-57d1-7d52062662ea@mcs.anl.gov> References: <38802c8b-fca9-a502-57d1-7d52062662ea@mcs.anl.gov> Message-ID: Hi Satish and Barry, Thanks very much for the feedback! It looks like my include file path was not correct!
Bests, Jianbo On Fri, Nov 4, 2022 at 6:08 AM Satish Balay wrote: > For ex83f.F90: > > >>>>> > balay at p1 /home/balay/test > $ ls > ex83f.F90 > balay at p1 /home/balay/test > $ ls > ex83f.F90 > balay at p1 /home/balay/test > $ export PETSC_DIR=$HOME/petsc > balay at p1 /home/balay/test > $ cp $PETSC_DIR/src/ksp/ksp/tests/makefile . > balay at p1 /home/balay/test > $ make ex83f > mpif90 -fPIC -Wall -ffree-line-length-none -ffree-line-length-0 > -Wno-lto-type-mismatch -Wno-unused-dummy-argument -g -O0 > -I/home/balay/petsc/include > -I/home/balay/petsc/arch-linux-c-debug/include ex83f.F90 > -Wl,-rpath,/home/balay/petsc/arch-linux-c-debug/lib > -L/home/balay/petsc/arch-linux-c-debug/lib > -Wl,-rpath,/home/balay/soft/mpich-4.0.1/lib > -L/home/balay/soft/mpich-4.0.1/lib > -Wl,-rpath,/usr/lib/gcc/x86_64-redhat-linux/12 > -L/usr/lib/gcc/x86_64-redhat-linux/12 -lpetsc -llapack -lblas -lm -lX11 > -lstdc++ -ldl -lmpifort -lmpi -lgfortran -lm -lgfortran -lm -lgcc_s > -lquadmath -lstdc++ -ldl -o ex83f > balay at p1 /home/balay/test > $ > <<<<<< > > Also when you are adding PETSc to your current project - are you using > source files with .f or .f90 suffix? If so rename them to .F or .F90 suffix. > > If you still have issues send more details - As Barry indicated - the > makefile [with the sources compiled by this makefile] - and the compile log > when you attempt to build these sources with this makefile. > > Satish > > On Thu, 3 Nov 2022, Barry Smith wrote: > > > > > Please send your attempted makefile and we'll see if we can get it > working. > > > > I am not sure if we can organize the include files as Fortran compiler > include files easily. We've always used the preprocessor approach. 
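
The makefile Satish copies from src/ksp/ksp/tests reduces to a few lines for a standalone project; a rough sketch (untested, and assuming PETSC_DIR and PETSC_ARCH are exported as in the transcript above):

```makefile
# Minimal PETSc user makefile sketch: pull in PETSc's compiler/linker
# variables and pattern rules, then link against ${PETSC_LIB}.
# Note: the recipe line must start with a literal tab.
include ${PETSC_DIR}/lib/petsc/conf/variables
include ${PETSC_DIR}/lib/petsc/conf/rules

ex83f: ex83f.F90
	-${FLINKER} -o $@ $< ${PETSC_LIB}
```

With these includes, `make ex83f` picks up the same compiler wrappers, include paths, and library flags shown in Satish's mpif90 link line.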
The > Intel compiler docs indicate the procedure for finding the Fortran compiler > include files > https://www.intel.com/content/www/us/en/develop/documentation/fortran-compiler-oneapi-dev-guide-and-reference/top/program-structure/use-include-files.html > is the same as for the preprocessor include files so I don't understand how > the using the Fortran compiler include file approach would make the > makefiles any simpler for users? > > > > > > Barry > > > > > > > On Nov 3, 2022, at 8:58 PM, Jianbo Long > wrote: > > > > > > Hello, > > > > > > I'm struggling to make my FORTRAN code work with petsc as I cannot > link the required header files (e.g., petscksp.h) and compiled library > files to my FORTRAN code. > > > > > > Compiling petsc was not a problem. However, even with the fortran > examples (see those on https://petsc.org/main/docs/manual/fortran/) and > the guide on using petsc in c++ and fortran codes (see Section "Writing > C/C++ or Fortran Applications" at > https://petsc.org/main/docs/manual/getting_started/), I still cannot make > my FORTRAN code work. > > > > > > The Fortran test code is exactly the example code ex83f.F90 (see > attached files). Aftering following the 2nd method in the Guide (see the > picture below), I still get errors: > > > > > > petsc/finclude/petscksp.h: No such file or directory > > > > > > Even if I set up the path of the header file correctly in my own > makefile without using environment variables, I still can only find the > file "petscksp.h" for my code. Of course, the trouble is that all other > headers files required by KSP are recursively included in this petscksp.h > file, and I have no way to link them together for my Fortran code. > > > > > > So, here are my questions: > > > 1) in the Guide, how exactly are we supposed to set up the environment > variables PETSC_DIR and PETSC_ARCH ? More details and examples would be > extremely helpful ! 
> > > 2) Is there a way to get rid of the preprocessor statement > > > #include > > > when using c++/Fortran codes ? > > > > > > For example, when using MUMPS package in a Fortran code, we can simply > use compiler 'include', rather than a preprocessor, to extract all required > variables for the user's codes : > > > INCLUDE 'zmumps_struc.h' > > > where the header file zmumps_struc.h is already provided in the > package. Similarly, I think it's much more portable and easier when using > petsc in other codes if we can make it work to use petsc. > > > > > > (Note: similar issues were discussed before, see > https://lists.mcs.anl.gov/mailman/htdig/petsc-users/2019-January/037499.html. > Unfortunately, I have no clue about the solution archived there ...) > > > > > > Any thoughts and solutions would be much appreciated ! > > > > > > Thanks, > > > Jianbo Long > > > > > > > > > > > > > > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From balay at mcs.anl.gov Mon Nov 7 10:51:45 2022 From: balay at mcs.anl.gov (Satish Balay) Date: Mon, 7 Nov 2022 10:51:45 -0600 (CST) Subject: [petsc-users] Petsc Documentation In-Reply-To: References: <82a97890-1fd2-2c1f-8e5a-b86210fd39cc@mcs.anl.gov> Message-ID: <765cf31f-a5c5-2bf6-6163-b8f51dbd3840@mcs.anl.gov> The horizontal scrollbar is at the very bottom of the page [for both firefox and chrome] - so its not really helping.. Perhaps limiting this page to 2 columns [instead of 3] will help.. Satish On Mon, 7 Nov 2022, Palmer, Bruce J via petsc-users wrote: > Hmm. Maybe it?s a Explorer thing. You can slide the pages left and right to pick up extra columns on Chrome. 
> > From: Satish Balay > Date: Monday, November 7, 2022 at 8:30 AM > To: Palmer, Bruce J > Cc: 'petsc-users at mcs.anl.gov' > Subject: Re: [petsc-users] Petsc Documentation > I see the search option on the top right of https://gcc02.safelinks.protection.outlook.com/?url=https%3A%2F%2Fpetsc.org%2Frelease%2F&data=05%7C01%7CBruce.Palmer%40pnnl.gov%7Ca687e10cb96442eabdd608dac0dd725e%7Cd6faa5f90ae240338c0130048a38deeb%7C0%7C0%7C638034354592315538%7CUnknown%7CTWFpbGZsb3d8eyJWIjoiMC4wLjAwMDAiLCJQIjoiV2luMzIiLCJBTiI6Ik1haWwiLCJXVCI6Mn0%3D%7C3000%7C%7C%7C&sdata=wbT8SGAWgY0HLAF58Tn94D9hLT5%2FPgcMCfTX5VEKdjA%3D&reserved=0 [and says ctrl-k is the key binding] > > Also google search usually gets to the man page. > > And on the 'single index' page - looks like the visible width is small [compared to whats there] - Ctrl-F is able to find PetscPrintf > > Satish > > On Mon, 7 Nov 2022, Palmer, Bruce J via petsc-users wrote: > > > Hi, > > > > The new Petsc documentation pages don't seem to have a search function. Would it be possible to add one? I was looking around for the documentation on PetscPrintf and couldn't find it, even on the single index of all petsc man pages. > > > > Bruce Palmer > > Computer Scientist > > Pacific Northwest National Laboratory > > (509) 375-3899 > > > > > From Bruce.Palmer at pnnl.gov Mon Nov 7 10:55:00 2022 From: Bruce.Palmer at pnnl.gov (Palmer, Bruce J) Date: Mon, 7 Nov 2022 16:55:00 +0000 Subject: [petsc-users] Petsc Documentation In-Reply-To: <765cf31f-a5c5-2bf6-6163-b8f51dbd3840@mcs.anl.gov> References: <82a97890-1fd2-2c1f-8e5a-b86210fd39cc@mcs.anl.gov> <765cf31f-a5c5-2bf6-6163-b8f51dbd3840@mcs.anl.gov> Message-ID: Okay, I see it on explorer. For explorer there is an awful lot of white space on the right. Maybe getting rid of that would help. 
From: Satish Balay Date: Monday, November 7, 2022 at 8:51 AM To: Palmer, Bruce J Cc: petsc-users Subject: Re: [petsc-users] Petsc Documentation The horizontal scrollbar is at the very bottom of the page [for both firefox and chrome] - so its not really helping.. Perhaps limiting this page to 2 columns [instead of 3] will help.. Satish On Mon, 7 Nov 2022, Palmer, Bruce J via petsc-users wrote: > Hmm. Maybe it?s a Explorer thing. You can slide the pages left and right to pick up extra columns on Chrome. > > From: Satish Balay > Date: Monday, November 7, 2022 at 8:30 AM > To: Palmer, Bruce J > Cc: 'petsc-users at mcs.anl.gov' > Subject: Re: [petsc-users] Petsc Documentation > I see the search option on the top right of https://gcc02.safelinks.protection.outlook.com/?url=https%3A%2F%2Fpetsc.org%2Frelease%2F&data=05%7C01%7CBruce.Palmer%40pnnl.gov%7C21cb8d9773884221404908dac0e05c7f%7Cd6faa5f90ae240338c0130048a38deeb%7C0%7C0%7C638034367111910491%7CUnknown%7CTWFpbGZsb3d8eyJWIjoiMC4wLjAwMDAiLCJQIjoiV2luMzIiLCJBTiI6Ik1haWwiLCJXVCI6Mn0%3D%7C3000%7C%7C%7C&sdata=Y0kOWdbY12sl%2FZvvtyr3KObZDWYWg2MHaPdBgabWftE%3D&reserved=0 [and says ctrl-k is the key binding] > > Also google search usually gets to the man page. > > And on the 'single index' page - looks like the visible width is small [compared to whats there] - Ctrl-F is able to find PetscPrintf > > Satish > > On Mon, 7 Nov 2022, Palmer, Bruce J via petsc-users wrote: > > > Hi, > > > > The new Petsc documentation pages don't seem to have a search function. Would it be possible to add one? I was looking around for the documentation on PetscPrintf and couldn't find it, even on the single index of all petsc man pages. > > > > Bruce Palmer > > Computer Scientist > > Pacific Northwest National Laboratory > > (509) 375-3899 > > > > > -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From balay at mcs.anl.gov Mon Nov 7 11:01:03 2022 From: balay at mcs.anl.gov (Satish Balay) Date: Mon, 7 Nov 2022 11:01:03 -0600 (CST) Subject: [petsc-users] [petsc-maint] Issues linking petsc header files and lib from FORTRAN codes In-Reply-To: References: <38802c8b-fca9-a502-57d1-7d52062662ea@mcs.anl.gov> Message-ID: Glad you have it working. Thanks for the update. Satish On Mon, 7 Nov 2022, Jianbo Long wrote: > Hi Satish and Barry, > > Thanks very much for the feedback ! > > It looks like my include file path was not correct ! > > Bests, > Jianbo > > > On Fri, Nov 4, 2022 at 6:08 AM Satish Balay wrote: > > > For ex83f.F90: > > > > >>>>> > > balay at p1 /home/balay/test > > $ ls > > ex83f.F90 > > balay at p1 /home/balay/test > > $ ls > > ex83f.F90 > > balay at p1 /home/balay/test > > $ export PETSC_DIR=$HOME/petsc > > balay at p1 /home/balay/test > > $ cp $PETSC_DIR/src/ksp/ksp/tests/makefile . > > balay at p1 /home/balay/test > > $ make ex83f > > mpif90 -fPIC -Wall -ffree-line-length-none -ffree-line-length-0 > > -Wno-lto-type-mismatch -Wno-unused-dummy-argument -g -O0 > > -I/home/balay/petsc/include > > -I/home/balay/petsc/arch-linux-c-debug/include ex83f.F90 > > -Wl,-rpath,/home/balay/petsc/arch-linux-c-debug/lib > > -L/home/balay/petsc/arch-linux-c-debug/lib > > -Wl,-rpath,/home/balay/soft/mpich-4.0.1/lib > > -L/home/balay/soft/mpich-4.0.1/lib > > -Wl,-rpath,/usr/lib/gcc/x86_64-redhat-linux/12 > > -L/usr/lib/gcc/x86_64-redhat-linux/12 -lpetsc -llapack -lblas -lm -lX11 > > -lstdc++ -ldl -lmpifort -lmpi -lgfortran -lm -lgfortran -lm -lgcc_s > > -lquadmath -lstdc++ -ldl -o ex83f > > balay at p1 /home/balay/test > > $ > > <<<<<< > > > > Also when you are adding PETSc to your current project - are you using > > source files with .f or .f90 suffix? If so rename them to .F or .F90 suffix. 
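The copied test makefile in the session above works because PETSc's build system supplies the compiler name, include paths, and the long link line shown in the `mpif90` command. A user project can reuse the same machinery instead of hand-maintaining those flags — a minimal sketch, assuming `PETSC_DIR` (and `PETSC_ARCH`, for an in-tree build) are exported as in the session, and following the long-standing PETSc makefile pattern; the target name `ex83f` is just the example from this thread:

```makefile
# Sketch of a minimal user makefile: pull in PETSc's variables (FLINKER,
# PETSC_LIB, ...) and its implicit .F90 -> .o compile rule, rather than
# hard-coding the long mpif90 command line quoted above.
include ${PETSC_DIR}/lib/petsc/conf/variables
include ${PETSC_DIR}/lib/petsc/conf/rules

ex83f: ex83f.o
	-${FLINKER} -o ex83f ex83f.o ${PETSC_LIB}
	${RM} ex83f.o
```

With this in place, `make ex83f` should reproduce a compile/link line equivalent to the one quoted above.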
> > > > If you still have issues send more details - As Barry indicated - the > > makefile [with the sources compiled by this makefile] - and the compile log > > when you attempt to build these sources with this makefile. > > > > Satish > > > > On Thu, 3 Nov 2022, Barry Smith wrote: > > > > > > > > Please send your attempted makefile and we'll see if we can get it > > working. > > > > > > I am not sure if we can organize the include files as Fortran compiler > > include files easily. We've always used the preprocessor approach. The > > Intel compiler docs indicate the procedure for finding the Fortran compiler > > include files > > https://www.intel.com/content/www/us/en/develop/documentation/fortran-compiler-oneapi-dev-guide-and-reference/top/program-structure/use-include-files.html > > is the same as for the preprocessor include files so I don't understand how > > the using the Fortran compiler include file approach would make the > > makefiles any simpler for users? > > > > > > > > > Barry > > > > > > > > > > On Nov 3, 2022, at 8:58 PM, Jianbo Long > > wrote: > > > > > > > > Hello, > > > > > > > > I'm struggling to make my FORTRAN code work with petsc as I cannot > > link the required header files (e.g., petscksp.h) and compiled library > > files to my FORTRAN code. > > > > > > > > Compiling petsc was not a problem. However, even with the fortran > > examples (see those on https://petsc.org/main/docs/manual/fortran/) and > > the guide on using petsc in c++ and fortran codes (see Section "Writing > > C/C++ or Fortran Applications" at > > https://petsc.org/main/docs/manual/getting_started/), I still cannot make > > my FORTRAN code work. > > > > > > > > The Fortran test code is exactly the example code ex83f.F90 (see > > attached files). 
Aftering following the 2nd method in the Guide (see the > > picture below), I still get errors: > > > > > > > > petsc/finclude/petscksp.h: No such file or directory > > > > > > > > Even if I set up the path of the header file correctly in my own > > makefile without using environment variables, I still can only find the > > file "petscksp.h" for my code. Of course, the trouble is that all other > > headers files required by KSP are recursively included in this petscksp.h > > file, and I have no way to link them together for my Fortran code. > > > > > > > > So, here are my questions: > > > > 1) in the Guide, how exactly are we supposed to set up the environment > > variables PETSC_DIR and PETSC_ARCH ? More details and examples would be > > extremely helpful ! > > > > 2) Is there a way to get rid of the preprocessor statement > > > > #include > > > > when using c++/Fortran codes ? > > > > > > > > For example, when using MUMPS package in a Fortran code, we can simply > > use compiler 'include', rather than a preprocessor, to extract all required > > variables for the user's codes : > > > > INCLUDE 'zmumps_struc.h' > > > > where the header file zmumps_struc.h is already provided in the > > package. Similarly, I think it's much more portable and easier when using > > petsc in other codes if we can make it work to use petsc. > > > > > > > > (Note: similar issues were discussed before, see > > https://lists.mcs.anl.gov/mailman/htdig/petsc-users/2019-January/037499.html. > > Unfortunately, I have no clue about the solution archived there ...) > > > > > > > > Any thoughts and solutions would be much appreciated ! 
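On question 1 above: `PETSC_DIR` points at the top of the PETSc source tree and `PETSC_ARCH` names the build subdirectory inside it; together they are what let `#include <petsc/finclude/petscksp.h>` resolve, along with every header it pulls in recursively. A hedged sketch — the paths below are hypothetical, substitute wherever PETSc was actually cloned and configured:

```shell
# Hypothetical locations -- adjust to your own checkout and configure arch.
export PETSC_DIR="$HOME/petsc-3.18.1"
export PETSC_ARCH="arch-linux-c-debug"

# The two -I paths a PETSc Fortran compile needs; petsc/finclude/petscksp.h
# and the headers it includes recursively all live under the first one.
echo "${PETSC_DIR}/include"
echo "${PETSC_DIR}/${PETSC_ARCH}/include"
```

Exporting these in the shell (or setting them in the makefile) before running `make` is all the "setup" the Guide assumes; PETSc's makefile fragments expand them into the full include and link lines.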
> > > > > > > > Thanks, > > > > Jianbo Long > > > > > > > > > > > > > > > > > > > > > > > From knepley at gmail.com Mon Nov 7 11:14:33 2022 From: knepley at gmail.com (Matthew Knepley) Date: Mon, 7 Nov 2022 12:14:33 -0500 Subject: [petsc-users] Dof ordering in DMPlexVecGet/Set/RestoreClosure In-Reply-To: <511602F6-5FC4-4B67-91FB-044A1FD2B561@mcmaster.ca> References: <511602F6-5FC4-4B67-91FB-044A1FD2B561@mcmaster.ca> Message-ID: On Mon, Nov 7, 2022 at 10:51 AM Blaise Bourdin wrote: > Hi, > > How are degree of freedom ordered in calls to > DMPlexVec[Set/Get/Restore]Closure? > Given a FE mesh and a section for P2-Lagrange elements (i.e. 1 dof per > vertex and edge), I was naively assuming that the ?local? vector would > contain dof at vertices then edges (in this order), since it matches the > section ordering, but it looks like I get edges then vertices? > > Here is my section (my mesh has 8 cells, 16 edges, and 9 vertices) > PetscSection Object: U 1 MPI process > type not yet set > 1 fields > field 0 with 1 components > Process 0: > ( 0) dim 0 offset 0 > ( 1) dim 0 offset 0 > ( 2) dim 0 offset 0 > ( 3) dim 0 offset 0 > ( 4) dim 0 offset 0 > ( 5) dim 0 offset 0 > ( 6) dim 0 offset 0 > ( 7) dim 0 offset 0 > ( 8) dim 1 offset 0 > ( 9) dim 1 offset 1 > ( 10) dim 1 offset 2 > ( 11) dim 1 offset 3 > ( 12) dim 1 offset 4 > ( 13) dim 1 offset 5 > ( 14) dim 1 offset 6 > ( 15) dim 1 offset 7 > ( 16) dim 1 offset 8 > ( 17) dim 1 offset 9 > ( 18) dim 1 offset 10 > ( 19) dim 1 offset 11 > ( 20) dim 1 offset 12 > ( 21) dim 1 offset 13 > ( 22) dim 1 offset 14 > ( 23) dim 1 offset 15 > ( 24) dim 1 offset 16 > ( 25) dim 1 offset 17 > ( 26) dim 1 offset 18 > ( 27) dim 1 offset 19 > ( 28) dim 1 offset 20 > ( 29) dim 1 offset 21 > ( 30) dim 1 offset 22 > ( 31) dim 1 offset 23 > ( 32) dim 1 offset 24 > > Start from the following local vector: > Vec Object: U 1 MPI process > type: seq > 1. > 2. > 3. > 4. > 5. > 6. > 7. > 8. > 9. > 10. > 11. > 12. > 13. > 14. > 15. > 16. > 17. > 18. > 19. 
> 20. > 21. > 22. > 23. > 24. > 25. > > Call PetscCallA(DMPlexVecGetClosure(dmU,sectionU,U,0_Ki,UArray,ierr)) and > get the following for Array: > 10.000000000000000 11.000000000000000 12.000000000000000 > 1.0000000000000000 2.0000000000000000 3.0000000000000000 > > Is this ordering predictable and documented somewhere? Is it ordered by > stratum? > The ordering is determined. It follows the same order that DMPlexGetTransitiveClosure() gives for the points. Transitive closure orders points by stratum. It is a BFS of the Hasse Diagram, starting from the initial point. For your triangle, it would be tri0, e0, e1, e2, v0, v1, v2 For each cell type, the order of faces is specified in Table 1.2 of the attached. This gives an order to each level of the BFS. Thanks, Matt > Regards, > Blaise > > ? > Canada Research Chair in Mathematical and Computational Aspects of Solid > Mechanics (Tier 1) > Professor, Department of Mathematics & Statistics > Hamilton Hall room 409A, McMaster University > 1280 Main Street West, Hamilton, Ontario L8S 4K1, Canada > https://www.math.mcmaster.ca/bourdin | +1 (905) 525 9140 ext. 27243 > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: PlexBook.pdf Type: application/pdf Size: 2737694 bytes Desc: not available URL: From bourdin at mcmaster.ca Mon Nov 7 11:44:49 2022 From: bourdin at mcmaster.ca (Blaise Bourdin) Date: Mon, 7 Nov 2022 17:44:49 +0000 Subject: [petsc-users] Dof ordering in DMPlexVecGet/Set/RestoreClosure In-Reply-To: References: <511602F6-5FC4-4B67-91FB-044A1FD2B561@mcmaster.ca> Message-ID: <0AB629D2-7C50-4BCB-8A23-45FB2756238D@mcmaster.ca> An HTML attachment was scrubbed... 
URL: From hng.email at gmail.com Mon Nov 7 11:48:16 2022 From: hng.email at gmail.com (Hom Nath Gharti) Date: Mon, 7 Nov 2022 12:48:16 -0500 Subject: [petsc-users] Error configuring external packages Message-ID: Dear all, I am trying to compile the latest version, but I am getting the following error: ******************************************************************************* UNABLE to CONFIGURE with GIVEN OPTIONS (see configure.log for details): ------------------------------------------------------------------------------- Error configuring METIS with CMake ******************************************************************************* When I remove this package from the configure option, I get the same error for other packages. Is there something wrong with my configure command or compilers? Attached is the configure log file. Best, Hom Nath -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... 
Name: configure.log Type: text/x-log Size: 793379 bytes Desc: not available URL: From longtuteng249 at gmail.com Mon Nov 7 11:59:30 2022 From: longtuteng249 at gmail.com (Jianbo Long) Date: Mon, 7 Nov 2022 18:59:30 +0100 Subject: [petsc-users] [petsc-maint] Issues linking petsc header files and lib from FORTRAN codes In-Reply-To: References: <38802c8b-fca9-a502-57d1-7d52062662ea@mcs.anl.gov> Message-ID: Hi Satish, I wonder if you know anything about another issue: after compiling petsc on a cluster, when I tried to link my Fortran code with compiled libpetsc.so, the shared library, I got the following errors: /cluster/software/binutils/2.37-GCCcore-11.2.0/bin/ld.gold: /lib64/libstdc++.so.6: version `CXXABI_1.3.9' not found (required by /cluster/software/binutils/2.37-GCCcore-11.2.0/bin/ld.gold) /cluster/software/binutils/2.37-GCCcore-11.2.0/bin/ld.gold: /lib64/libstdc++.so.6: version `GLIBCXX_3.4.21' not found (required by /cluster/software/binutils/2.37-GCCcore-11.2.0/bin/ld.gold) /cluster/software/binutils/2.37-GCCcore-11.2.0/bin/ld.gold: /lib64/libstdc++.so.6: version `GLIBCXX_3.4.29' not found (required by /cluster/software/binutils/2.37-GCCcore-11.2.0/bin/ld.gold) /cluster/software/binutils/2.37-GCCcore-11.2.0/bin/ld.gold: /lib64/libstdc++.so.6: version `GLIBCXX_3.4.20' not found (required by /cluster/software/binutils/2.37-GCCcore-11.2.0/bin/ld.gold) /cluster/software/binutils/2.37-GCCcore-11.2.0/bin/ld.gold: /lib64/libstdc++.so.6: version `CXXABI_1.3.8' not found (required by /cluster/software/binutils/2.37-GCCcore-11.2.0/bin/ld.gold) Not sure if it is related to discussion in this post ( https://gitlab.com/petsc/petsc/-/issues/997), but after I tried the configure option --with-cxx=0, I still got the same errors. My make.log file for compiling petsc is attached here. 
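The `CXXABI_1.3.9' not found` / `GLIBCXX_3.4.21' not found` messages indicate that the linker binary itself (built against the cluster's GCC 11.2.0 module) is being resolved against the older system `/lib64/libstdc++.so.6`, which is why `--with-cxx=0` changes nothing — PETSc is not the component requesting those symbols. A hedged diagnostic sketch; the paths are the hypothetical ones taken from the error messages above, so adjust them to your cluster:

```shell
# Show which GLIBCXX/CXXABI version tags the system libstdc++ exports;
# if CXXABI_1.3.9 / GLIBCXX_3.4.29 are absent, it is too old for this ld.gold.
strings /lib64/libstdc++.so.6 2>/dev/null \
  | grep -oE '(GLIBCXX|CXXABI)_[0-9.]+' | sort -u | tail -n 5 || true

# Putting the GCC module's newer runtime ahead of the system one usually
# resolves this (hypothetical path, taken from the ldd output below):
export LD_LIBRARY_PATH="/cluster/software/GCCcore/11.2.0/lib64:${LD_LIBRARY_PATH}"
```

On module-based clusters this is often done for you by `module load GCCcore/11.2.0`; the manual export is only a sketch of what that load sets up.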
Also, the dependencies of the compiled petsc is: >>: ldd arch-linux-c-debug/lib/libpetsc.so linux-vdso.so.1 => (0x00007ffd80348000) libflexiblas.so.3 => /cluster/software/FlexiBLAS/3.0.4-GCC-11.2.0/lib/libflexiblas.so.3 (0x00007f6e8b93f000) libpthread.so.0 => /usr/lib64/libpthread.so.0 (0x00007f6e8b723000) libm.so.6 => /usr/lib64/libm.so.6 (0x00007f6e8b421000) libdl.so.2 => /usr/lib64/libdl.so.2 (0x00007f6e8b21d000) libmpi_usempif08.so.40 => /cluster/software/OpenMPI/4.1.1-GCC-11.2.0/lib/libmpi_usempif08.so.40 (0x00007f6e8fd92000) libmpi_usempi_ignore_tkr.so.40 => /cluster/software/OpenMPI/4.1.1-GCC-11.2.0/lib/libmpi_usempi_ignore_tkr.so.40 (0x00007f6e8fd84000) libmpi_mpifh.so.40 => /cluster/software/OpenMPI/4.1.1-GCC-11.2.0/lib/libmpi_mpifh.so.40 (0x00007f6e8fd0c000) libmpi.so.40 => /cluster/software/OpenMPI/4.1.1-GCC-11.2.0/lib/libmpi.so.40 (0x00007f6e8fbfa000) libgfortran.so.5 => /cluster/software/GCCcore/11.2.0/lib64/libgfortran.so.5 (0x00007f6e8af70000) libgcc_s.so.1 => /cluster/software/GCCcore/11.2.0/lib64/libgcc_s.so.1 (0x00007f6e8fbe0000) libquadmath.so.0 => /cluster/software/GCCcore/11.2.0/lib64/libquadmath.so.0 (0x00007f6e8af28000) libc.so.6 => /usr/lib64/libc.so.6 (0x00007f6e8ab5a000) /lib64/ld-linux-x86-64.so.2 (0x00007f6e8fbb3000) libopen-rte.so.40 => /cluster/software/OpenMPI/4.1.1-GCC-11.2.0/lib/libopen-rte.so.40 (0x00007f6e8aa9e000) libopen-orted-mpir.so => /cluster/software/OpenMPI/4.1.1-GCC-11.2.0/lib/libopen-orted-mpir.so (0x00007f6e8fbdb000) libopen-pal.so.40 => /cluster/software/OpenMPI/4.1.1-GCC-11.2.0/lib/libopen-pal.so.40 (0x00007f6e8a9ea000) librt.so.1 => /lib64/librt.so.1 (0x00007f6e8a7d5000) libutil.so.1 => /lib64/libutil.so.1 (0x00007f6e8a5d2000) libhwloc.so.15 => /cluster/software/hwloc/2.5.0-GCCcore-11.2.0/lib/libhwloc.so.15 (0x00007f6e8a575000) libpciaccess.so.0 => /cluster/software/libpciaccess/0.16-GCCcore-11.2.0/lib/libpciaccess.so.0 (0x00007f6e8a56a000) libxml2.so.2 => 
/cluster/software/libxml2/2.9.10-GCCcore-11.2.0/lib/libxml2.so.2 (0x00007f6e8a3f6000) libz.so.1 => /cluster/software/zlib/1.2.11-GCCcore-11.2.0/lib/libz.so.1 (0x00007f6e8a3dd000) liblzma.so.5 => /cluster/software/XZ/5.2.5-GCCcore-11.2.0/lib/liblzma.so.5 (0x00007f6e8a3b5000) libevent_core-2.0.so.5 => /lib64/libevent_core-2.0.so.5 (0x00007f6e8a18a000) libevent_pthreads-2.0.so.5 => /lib64/libevent_pthreads-2.0.so.5 (0x00007f6e89f87000) Thanks very much, Jianbo On Mon, Nov 7, 2022 at 6:01 PM Satish Balay wrote: > Glad you have it working. Thanks for the update. > > Satish > > On Mon, 7 Nov 2022, Jianbo Long wrote: > > > Hi Satish and Barry, > > > > Thanks very much for the feedback ! > > > > It looks like my include file path was not correct ! > > > > Bests, > > Jianbo > > > > > > On Fri, Nov 4, 2022 at 6:08 AM Satish Balay wrote: > > > > > For ex83f.F90: > > > > > > >>>>> > > > balay at p1 /home/balay/test > > > $ ls > > > ex83f.F90 > > > balay at p1 /home/balay/test > > > $ ls > > > ex83f.F90 > > > balay at p1 /home/balay/test > > > $ export PETSC_DIR=$HOME/petsc > > > balay at p1 /home/balay/test > > > $ cp $PETSC_DIR/src/ksp/ksp/tests/makefile . 
> > > balay at p1 /home/balay/test > > > $ make ex83f > > > mpif90 -fPIC -Wall -ffree-line-length-none -ffree-line-length-0 > > > -Wno-lto-type-mismatch -Wno-unused-dummy-argument -g -O0 > > > -I/home/balay/petsc/include > > > -I/home/balay/petsc/arch-linux-c-debug/include ex83f.F90 > > > -Wl,-rpath,/home/balay/petsc/arch-linux-c-debug/lib > > > -L/home/balay/petsc/arch-linux-c-debug/lib > > > -Wl,-rpath,/home/balay/soft/mpich-4.0.1/lib > > > -L/home/balay/soft/mpich-4.0.1/lib > > > -Wl,-rpath,/usr/lib/gcc/x86_64-redhat-linux/12 > > > -L/usr/lib/gcc/x86_64-redhat-linux/12 -lpetsc -llapack -lblas -lm -lX11 > > > -lstdc++ -ldl -lmpifort -lmpi -lgfortran -lm -lgfortran -lm -lgcc_s > > > -lquadmath -lstdc++ -ldl -o ex83f > > > balay at p1 /home/balay/test > > > $ > > > <<<<<< > > > > > > Also when you are adding PETSc to your current project - are you using > > > source files with .f or .f90 suffix? If so rename them to .F or .F90 > suffix. > > > > > > If you still have issues send more details - As Barry indicated - the > > > makefile [with the sources compiled by this makefile] - and the > compile log > > > when you attempt to build these sources with this makefile. > > > > > > Satish > > > > > > On Thu, 3 Nov 2022, Barry Smith wrote: > > > > > > > > > > > Please send your attempted makefile and we'll see if we can get it > > > working. > > > > > > > > I am not sure if we can organize the include files as Fortran > compiler > > > include files easily. We've always used the preprocessor approach. The > > > Intel compiler docs indicate the procedure for finding the Fortran > compiler > > > include files > > > > https://www.intel.com/content/www/us/en/develop/documentation/fortran-compiler-oneapi-dev-guide-and-reference/top/program-structure/use-include-files.html > > > is the same as for the preprocessor include files so I don't > understand how > > > the using the Fortran compiler include file approach would make the > > > makefiles any simpler for users? 
> > > > > > > > > > > > Barry > > > > > > > > > > > > > On Nov 3, 2022, at 8:58 PM, Jianbo Long > > > wrote: > > > > > > > > > > Hello, > > > > > > > > > > I'm struggling to make my FORTRAN code work with petsc as I cannot > > > link the required header files (e.g., petscksp.h) and compiled library > > > files to my FORTRAN code. > > > > > > > > > > Compiling petsc was not a problem. However, even with the fortran > > > examples (see those on https://petsc.org/main/docs/manual/fortran/) > and > > > the guide on using petsc in c++ and fortran codes (see Section "Writing > > > C/C++ or Fortran Applications" at > > > https://petsc.org/main/docs/manual/getting_started/), I still cannot > make > > > my FORTRAN code work. > > > > > > > > > > The Fortran test code is exactly the example code ex83f.F90 (see > > > attached files). Aftering following the 2nd method in the Guide (see > the > > > picture below), I still get errors: > > > > > > > > > > petsc/finclude/petscksp.h: No such file or directory > > > > > > > > > > Even if I set up the path of the header file correctly in my own > > > makefile without using environment variables, I still can only find the > > > file "petscksp.h" for my code. Of course, the trouble is that all other > > > headers files required by KSP are recursively included in this > petscksp.h > > > file, and I have no way to link them together for my Fortran code. > > > > > > > > > > So, here are my questions: > > > > > 1) in the Guide, how exactly are we supposed to set up the > environment > > > variables PETSC_DIR and PETSC_ARCH ? More details and examples would > be > > > extremely helpful ! > > > > > 2) Is there a way to get rid of the preprocessor statement > > > > > #include > > > > > when using c++/Fortran codes ? 
> > > > > > > > > > For example, when using MUMPS package in a Fortran code, we can > simply > > > use compiler 'include', rather than a preprocessor, to extract all > required > > > variables for the user's codes : > > > > > INCLUDE 'zmumps_struc.h' > > > > > where the header file zmumps_struc.h is already provided in the > > > package. Similarly, I think it's much more portable and easier when > using > > > petsc in other codes if we can make it work to use petsc. > > > > > > > > > > (Note: similar issues were discussed before, see > > > > https://lists.mcs.anl.gov/mailman/htdig/petsc-users/2019-January/037499.html > . > > > Unfortunately, I have no clue about the solution archived there ...) > > > > > > > > > > Any thoughts and solutions would be much appreciated ! > > > > > > > > > > Thanks, > > > > > Jianbo Long > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... 
Name: make.log Type: text/x-log Size: 116528 bytes Desc: not available URL: From balay at mcs.anl.gov Mon Nov 7 12:10:11 2022 From: balay at mcs.anl.gov (Satish Balay) Date: Mon, 7 Nov 2022 12:10:11 -0600 (CST) Subject: [petsc-users] [petsc-maint] Issues linking petsc header files and lib from FORTRAN codes In-Reply-To: References: <38802c8b-fca9-a502-57d1-7d52062662ea@mcs.anl.gov> Message-ID: <68428260-9036-a81b-8a79-b6daf14667c0@mcs.anl.gov> Likely due to mixing c++ codes compiled with /usr/bin/g++ and compilers in /cluster/software/GCCcore/11.2.0 if you still get this with --with-cxx=0 - then the issue with some other [non-petsc library] Satish On Mon, 7 Nov 2022, Jianbo Long wrote: > Hi Satish, > > I wonder if you know anything about another issue: after compiling petsc on > a cluster, when I tried to link my Fortran code with compiled libpetsc.so, > the shared library, I got the following errors: > /cluster/software/binutils/2.37-GCCcore-11.2.0/bin/ld.gold: > /lib64/libstdc++.so.6: version `CXXABI_1.3.9' not found (required by > /cluster/software/binutils/2.37-GCCcore-11.2.0/bin/ld.gold) > /cluster/software/binutils/2.37-GCCcore-11.2.0/bin/ld.gold: > /lib64/libstdc++.so.6: version `GLIBCXX_3.4.21' not found (required by > /cluster/software/binutils/2.37-GCCcore-11.2.0/bin/ld.gold) > /cluster/software/binutils/2.37-GCCcore-11.2.0/bin/ld.gold: > /lib64/libstdc++.so.6: version `GLIBCXX_3.4.29' not found (required by > /cluster/software/binutils/2.37-GCCcore-11.2.0/bin/ld.gold) > /cluster/software/binutils/2.37-GCCcore-11.2.0/bin/ld.gold: > /lib64/libstdc++.so.6: version `GLIBCXX_3.4.20' not found (required by > /cluster/software/binutils/2.37-GCCcore-11.2.0/bin/ld.gold) > /cluster/software/binutils/2.37-GCCcore-11.2.0/bin/ld.gold: > /lib64/libstdc++.so.6: version `CXXABI_1.3.8' not found (required by > /cluster/software/binutils/2.37-GCCcore-11.2.0/bin/ld.gold) > > Not sure if it is related to discussion in this post ( > 
https://gitlab.com/petsc/petsc/-/issues/997), but after I tried the > configure option --with-cxx=0, I still got the same errors. > My make.log file for compiling petsc is attached here. Also, the > dependencies of the compiled petsc is: > > >>: ldd arch-linux-c-debug/lib/libpetsc.so > linux-vdso.so.1 => (0x00007ffd80348000) > libflexiblas.so.3 => > /cluster/software/FlexiBLAS/3.0.4-GCC-11.2.0/lib/libflexiblas.so.3 > (0x00007f6e8b93f000) > libpthread.so.0 => /usr/lib64/libpthread.so.0 (0x00007f6e8b723000) > libm.so.6 => /usr/lib64/libm.so.6 (0x00007f6e8b421000) > libdl.so.2 => /usr/lib64/libdl.so.2 (0x00007f6e8b21d000) > libmpi_usempif08.so.40 => > /cluster/software/OpenMPI/4.1.1-GCC-11.2.0/lib/libmpi_usempif08.so.40 > (0x00007f6e8fd92000) > libmpi_usempi_ignore_tkr.so.40 => > /cluster/software/OpenMPI/4.1.1-GCC-11.2.0/lib/libmpi_usempi_ignore_tkr.so.40 > (0x00007f6e8fd84000) > libmpi_mpifh.so.40 => > /cluster/software/OpenMPI/4.1.1-GCC-11.2.0/lib/libmpi_mpifh.so.40 > (0x00007f6e8fd0c000) > libmpi.so.40 => /cluster/software/OpenMPI/4.1.1-GCC-11.2.0/lib/libmpi.so.40 > (0x00007f6e8fbfa000) > libgfortran.so.5 => /cluster/software/GCCcore/11.2.0/lib64/libgfortran.so.5 > (0x00007f6e8af70000) > libgcc_s.so.1 => /cluster/software/GCCcore/11.2.0/lib64/libgcc_s.so.1 > (0x00007f6e8fbe0000) > libquadmath.so.0 => /cluster/software/GCCcore/11.2.0/lib64/libquadmath.so.0 > (0x00007f6e8af28000) > libc.so.6 => /usr/lib64/libc.so.6 (0x00007f6e8ab5a000) > /lib64/ld-linux-x86-64.so.2 (0x00007f6e8fbb3000) > libopen-rte.so.40 => > /cluster/software/OpenMPI/4.1.1-GCC-11.2.0/lib/libopen-rte.so.40 > (0x00007f6e8aa9e000) > libopen-orted-mpir.so => > /cluster/software/OpenMPI/4.1.1-GCC-11.2.0/lib/libopen-orted-mpir.so > (0x00007f6e8fbdb000) > libopen-pal.so.40 => > /cluster/software/OpenMPI/4.1.1-GCC-11.2.0/lib/libopen-pal.so.40 > (0x00007f6e8a9ea000) > librt.so.1 => /lib64/librt.so.1 (0x00007f6e8a7d5000) > libutil.so.1 => /lib64/libutil.so.1 (0x00007f6e8a5d2000) > libhwloc.so.15 => > 
/cluster/software/hwloc/2.5.0-GCCcore-11.2.0/lib/libhwloc.so.15 > (0x00007f6e8a575000) > libpciaccess.so.0 => > /cluster/software/libpciaccess/0.16-GCCcore-11.2.0/lib/libpciaccess.so.0 > (0x00007f6e8a56a000) > libxml2.so.2 => > /cluster/software/libxml2/2.9.10-GCCcore-11.2.0/lib/libxml2.so.2 > (0x00007f6e8a3f6000) > libz.so.1 => /cluster/software/zlib/1.2.11-GCCcore-11.2.0/lib/libz.so.1 > (0x00007f6e8a3dd000) > liblzma.so.5 => /cluster/software/XZ/5.2.5-GCCcore-11.2.0/lib/liblzma.so.5 > (0x00007f6e8a3b5000) > libevent_core-2.0.so.5 => /lib64/libevent_core-2.0.so.5 (0x00007f6e8a18a000) > libevent_pthreads-2.0.so.5 => /lib64/libevent_pthreads-2.0.so.5 > (0x00007f6e89f87000) > > Thanks very much, > Jianbo > > On Mon, Nov 7, 2022 at 6:01 PM Satish Balay wrote: > > > Glad you have it working. Thanks for the update. > > > > Satish > > > > On Mon, 7 Nov 2022, Jianbo Long wrote: > > > > > Hi Satish and Barry, > > > > > > Thanks very much for the feedback ! > > > > > > It looks like my include file path was not correct ! > > > > > > Bests, > > > Jianbo > > > > > > > > > On Fri, Nov 4, 2022 at 6:08 AM Satish Balay wrote: > > > > > > > For ex83f.F90: > > > > > > > > >>>>> > > > > balay at p1 /home/balay/test > > > > $ ls > > > > ex83f.F90 > > > > balay at p1 /home/balay/test > > > > $ ls > > > > ex83f.F90 > > > > balay at p1 /home/balay/test > > > > $ export PETSC_DIR=$HOME/petsc > > > > balay at p1 /home/balay/test > > > > $ cp $PETSC_DIR/src/ksp/ksp/tests/makefile . 
> > > > balay at p1 /home/balay/test > > > > $ make ex83f > > > > mpif90 -fPIC -Wall -ffree-line-length-none -ffree-line-length-0 > > > > -Wno-lto-type-mismatch -Wno-unused-dummy-argument -g -O0 > > > > -I/home/balay/petsc/include > > > > -I/home/balay/petsc/arch-linux-c-debug/include ex83f.F90 > > > > -Wl,-rpath,/home/balay/petsc/arch-linux-c-debug/lib > > > > -L/home/balay/petsc/arch-linux-c-debug/lib > > > > -Wl,-rpath,/home/balay/soft/mpich-4.0.1/lib > > > > -L/home/balay/soft/mpich-4.0.1/lib > > > > -Wl,-rpath,/usr/lib/gcc/x86_64-redhat-linux/12 > > > > -L/usr/lib/gcc/x86_64-redhat-linux/12 -lpetsc -llapack -lblas -lm -lX11 > > > > -lstdc++ -ldl -lmpifort -lmpi -lgfortran -lm -lgfortran -lm -lgcc_s > > > > -lquadmath -lstdc++ -ldl -o ex83f > > > > balay at p1 /home/balay/test > > > > $ > > > > <<<<<< > > > > > > > > Also when you are adding PETSc to your current project - are you using > > > > source files with .f or .f90 suffix? If so rename them to .F or .F90 > > suffix. > > > > > > > > If you still have issues send more details - As Barry indicated - the > > > > makefile [with the sources compiled by this makefile] - and the > > compile log > > > > when you attempt to build these sources with this makefile. > > > > > > > > Satish > > > > > > > > On Thu, 3 Nov 2022, Barry Smith wrote: > > > > > > > > > > > > > > Please send your attempted makefile and we'll see if we can get it > > > > working. > > > > > > > > > > I am not sure if we can organize the include files as Fortran > > compiler > > > > include files easily. We've always used the preprocessor approach. 
The > > > > Intel compiler docs indicate the procedure for finding the Fortran > > compiler > > > > include files > > > > > > https://www.intel.com/content/www/us/en/develop/documentation/fortran-compiler-oneapi-dev-guide-and-reference/top/program-structure/use-include-files.html > > > > is the same as for the preprocessor include files so I don't > > understand how > > > > the using the Fortran compiler include file approach would make the > > > > makefiles any simpler for users? > > > > > > > > > > > > > > > Barry > > > > > > > > > > > > > > > > On Nov 3, 2022, at 8:58 PM, Jianbo Long > > > > wrote: > > > > > > > > > > > > Hello, > > > > > > > > > > > > I'm struggling to make my FORTRAN code work with petsc as I cannot > > > > link the required header files (e.g., petscksp.h) and compiled library > > > > files to my FORTRAN code. > > > > > > > > > > > > Compiling petsc was not a problem. However, even with the fortran > > > > examples (see those on https://petsc.org/main/docs/manual/fortran/) > > and > > > > the guide on using petsc in c++ and fortran codes (see Section "Writing > > > > C/C++ or Fortran Applications" at > > > > https://petsc.org/main/docs/manual/getting_started/), I still cannot > > make > > > > my FORTRAN code work. > > > > > > > > > > > > The Fortran test code is exactly the example code ex83f.F90 (see > > > > attached files). Aftering following the 2nd method in the Guide (see > > the > > > > picture below), I still get errors: > > > > > > > > > > > > petsc/finclude/petscksp.h: No such file or directory > > > > > > > > > > > > Even if I set up the path of the header file correctly in my own > > > > makefile without using environment variables, I still can only find the > > > > file "petscksp.h" for my code. Of course, the trouble is that all other > > > > headers files required by KSP are recursively included in this > > petscksp.h > > > > file, and I have no way to link them together for my Fortran code. 
> > > > > > > > > > > > So, here are my questions: > > > > > > 1) in the Guide, how exactly are we supposed to set up the > > environment > > > > variables PETSC_DIR and PETSC_ARCH ? More details and examples would > > be > > > > extremely helpful ! > > > > > > 2) Is there a way to get rid of the preprocessor statement > > > > > > #include > > > > > > when using c++/Fortran codes ? > > > > > > > > > > > > For example, when using MUMPS package in a Fortran code, we can > > simply > > > > use compiler 'include', rather than a preprocessor, to extract all > > required > > > > variables for the user's codes : > > > > > > INCLUDE 'zmumps_struc.h' > > > > > > where the header file zmumps_struc.h is already provided in the > > > > package. Similarly, I think it's much more portable and easier when > > using > > > > petsc in other codes if we can make it work to use petsc. > > > > > > > > > > > > (Note: similar issues were discussed before, see > > > > > > https://lists.mcs.anl.gov/mailman/htdig/petsc-users/2019-January/037499.html > > . > > > > Unfortunately, I have no clue about the solution archived there ...) > > > > > > > > > > > > Any thoughts and solutions would be much appreciated ! > > > > > > > > > > > > Thanks, > > > > > > Jianbo Long > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > From narnoldm at umich.edu Mon Nov 7 12:15:21 2022 From: narnoldm at umich.edu (Nicholas Arnold-Medabalimi) Date: Mon, 7 Nov 2022 13:15:21 -0500 Subject: [petsc-users] Manually setting PetscSF Message-ID: Hi Petsc Users, I am working on setting up the star forest to connect my meshes on different processors. For a 2-processor example, I have set up a dmplex object and read in the coordinates nodes and cell2node charts on each one. But right now they are independent effectively. For simplicity lets just say each process has 4 cells. [image: image.png] I'm trying to build the star forest that sets the edge between the two partitions. 
(see fig) To my understanding, I do this using PetscSFSetGraph. I choose either 9-10 on rank 0 or 1-2 on rank 1 to be the "roots". For simplicity lets say the 1-2 on rank 1 are roots. (also for the rest of the discussion I'm switching 1 and 2 to be 0 and 1 and 9 and 10 to be 8 9 ) (the figure is 1 indexed but for the code below it will be 0) So on rank 0 I would set nleaves=2 nroots= 8+c1 (graph is only cells and verts so its 8 verts plus the cell count(c1) remote[0].rank=1 remote[0].index=0+(cell count on rank1) leaf[0]=8+c1 remote[1].rank=1 remote[1].index=0+(cell count on rank1) leaf[1]=9+c1 and on rank 1 we would set nroots= ncells+nverts nleaves=0 Since its all roots I don't think I need to set anything else? PetscSFSetGraph(sf,nroots,nleaves,leaves,PETSC_COPY_VALUES,remote,PETSC_COPY_VALUES) I am certain I am making a mistake somewhere since I get an error when I then call PetscSFSetup. I am working in C++ right now but this is just testing out before implementation in fortran which is why I am using PETSC_COPY. Any help and clarification would be appreciated. Sincerely Nicholas -- Nicholas Arnold-Medabalimi Ph.D. Candidate Computational Aeroscience Lab University of Michigan -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... 
Name: image.png Type: image/png Size: 18794 bytes Desc: not available URL: From bsmith at petsc.dev Mon Nov 7 12:19:27 2022 From: bsmith at petsc.dev (Barry Smith) Date: Mon, 7 Nov 2022 13:19:27 -0500 Subject: [petsc-users] Error configuring external packages In-Reply-To: References: Message-ID: <792FF04D-3C01-42D5-8C7E-CE88A1869211@petsc.dev> The cmake in your path is broken TESTING: locateCMake from config.packages.cmake(/gpfs/fs1/home/h/hngharti/hngharti/lsoft/petsc-gnu/config/BuildSystem/config/packages/cmake.py:53) Looking for default CMake executable Checking for program /scinet/niagara/software/2019b/opt/gcc-9.4.0/openmpi/4.1.1/bin/cmake...not found Checking for program /scinet/niagara/software/2019b/opt/base/gcc/9.4.0/bin/cmake...not found Checking for program /home/h/hngharti/hngharti/lsoft/cmake-3.23.1/bin/cmake...found Defined make macro "CMAKE" to "/home/h/hngharti/hngharti/lsoft/cmake-3.23.1/bin/cmake" Looking for default CTest executable Checking for program /scinet/niagara/software/2019b/opt/gcc-9.4.0/openmpi/4.1.1/bin/ctest...not found Checking for program /scinet/niagara/software/2019b/opt/base/gcc/9.4.0/bin/ctest...not found Checking for program /home/h/hngharti/hngharti/lsoft/cmake-3.23.1/bin/ctest...found Defined make macro "CTEST" to "/home/h/hngharti/hngharti/lsoft/cmake-3.23.1/bin/ctest" Executing: /home/h/hngharti/hngharti/lsoft/cmake-3.23.1/bin/cmake --version cmake --version failed: Could not execute "['/home/h/hngharti/hngharti/lsoft/cmake-3.23.1/bin/cmake --version']": /home/h/hngharti/hngharti/lsoft/cmake-3.23.1/bin/cmake: error while loading shared libraries: libfabric.so.1: cannot open shared object file: No such file or directory you need to either install a better cmake or have PETSc install one using --download-cmake > On Nov 7, 2022, at 12:48 PM, Hom Nath Gharti wrote: > > Dear all, > > I am trying to compile the latest version, but I am getting the following error: > 
******************************************************************************* > UNABLE to CONFIGURE with GIVEN OPTIONS (see configure.log for details): > ------------------------------------------------------------------------------- > Error configuring METIS with CMake > ******************************************************************************* > > When I remove this package from the configure option, I get the same error for other packages. Is there something wrong with my configure command or compilers? > > Attached is the configure log file. > > Best, > Hom Nath > From alexlindsay239 at gmail.com Mon Nov 7 12:55:10 2022 From: alexlindsay239 at gmail.com (Alexander Lindsay) Date: Mon, 7 Nov 2022 10:55:10 -0800 Subject: [petsc-users] Local columns of A10 do not equal local rows of A00 In-Reply-To: References: Message-ID: I'm not sure exactly what you mean, but I'll try to give more details. We have our own DM class (DM_Moose) and we set our own field and domain decomposition routines: dm->ops->createfielddecomposition = DMCreateFieldDecomposition_Moose; dm->ops->createdomaindecomposition = DMCreateDomainDecomposition_Moose; The field and domain decomposition routines are as follows (can see also at https://github.com/idaholab/moose/blob/next/framework/src/utils/PetscDMMoose.C ): static PetscErrorCode DMCreateFieldDecomposition_Moose( DM dm, PetscInt * len, char *** namelist, IS ** islist, DM ** dmlist) { PetscErrorCode ierr; DM_Moose * dmm = (DM_Moose *)(dm->data); PetscFunctionBegin; /* Only called after DMSetUp(). 
*/ if (!dmm->_splitlocs) PetscFunctionReturn(0); *len = dmm->_splitlocs->size(); if (namelist) { ierr = PetscMalloc(*len * sizeof(char *), namelist); CHKERRQ(ierr); } if (islist) { ierr = PetscMalloc(*len * sizeof(IS), islist); CHKERRQ(ierr); } if (dmlist) { ierr = PetscMalloc(*len * sizeof(DM), dmlist); CHKERRQ(ierr); } for (const auto & dit : *(dmm->_splitlocs)) { unsigned int d = dit.second; std::string dname = dit.first; DM_Moose::SplitInfo & dinfo = (*dmm->_splits)[dname]; if (!dinfo._dm) { ierr = DMCreateMoose(((PetscObject)dm)->comm, *dmm->_nl, &dinfo._dm); CHKERRQ(ierr); ierr = PetscObjectSetOptionsPrefix((PetscObject)dinfo._dm, ((PetscObject)dm)->prefix); CHKERRQ(ierr); std::string suffix = std::string("fieldsplit_") + dname + "_"; ierr = PetscObjectAppendOptionsPrefix((PetscObject)dinfo._dm, suffix.c_str()); CHKERRQ(ierr); } ierr = DMSetFromOptions(dinfo._dm); CHKERRQ(ierr); ierr = DMSetUp(dinfo._dm); CHKERRQ(ierr); if (namelist) { ierr = PetscStrallocpy(dname.c_str(), (*namelist) + d); CHKERRQ(ierr); } if (islist) { if (!dinfo._rembedding) { IS dembedding, lembedding; ierr = DMMooseGetEmbedding_Private(dinfo._dm, &dembedding); CHKERRQ(ierr); if (dmm->_embedding) { // Create a relative embedding into the parent's index space. 
ierr = ISEmbed(dembedding, dmm->_embedding, PETSC_TRUE, &lembedding); CHKERRQ(ierr); const PetscInt * lindices; PetscInt len, dlen, llen, *rindices, off, i; ierr = ISGetLocalSize(dembedding, &dlen); CHKERRQ(ierr); ierr = ISGetLocalSize(lembedding, &llen); CHKERRQ(ierr); if (llen != dlen) SETERRQ1(((PetscObject)dm)->comm, PETSC_ERR_PLIB, "Failed to embed split %D", d); ierr = ISDestroy(&dembedding); CHKERRQ(ierr); // Convert local embedding to global (but still relative) embedding ierr = PetscMalloc(llen * sizeof(PetscInt), &rindices); CHKERRQ(ierr); ierr = ISGetIndices(lembedding, &lindices); CHKERRQ(ierr); ierr = PetscMemcpy(rindices, lindices, llen * sizeof(PetscInt)); CHKERRQ(ierr); ierr = ISDestroy(&lembedding); CHKERRQ(ierr); // We could get the index offset from a corresponding global vector, but subDMs don't yet // have global vectors ierr = ISGetLocalSize(dmm->_embedding, &len); CHKERRQ(ierr); ierr = MPI_Scan(&len, &off, 1, #ifdef PETSC_USE_64BIT_INDICES MPI_LONG_LONG_INT, #else MPI_INT, #endif MPI_SUM, ((PetscObject)dm)->comm); CHKERRQ(ierr); off -= len; for (i = 0; i < llen; ++i) rindices[i] += off; ierr = ISCreateGeneral( ((PetscObject)dm)->comm, llen, rindices, PETSC_OWN_POINTER, &(dinfo._rembedding)); CHKERRQ(ierr); } else { dinfo._rembedding = dembedding; } } ierr = PetscObjectReference((PetscObject)(dinfo._rembedding)); CHKERRQ(ierr); (*islist)[d] = dinfo._rembedding; } if (dmlist) { ierr = PetscObjectReference((PetscObject)dinfo._dm); CHKERRQ(ierr); (*dmlist)[d] = dinfo._dm; } } PetscFunctionReturn(0); } static PetscErrorCode DMCreateDomainDecomposition_Moose( DM dm, PetscInt * len, char *** namelist, IS ** innerislist, IS ** outerislist, DM ** dmlist) { PetscErrorCode ierr; PetscFunctionBegin; /* Use DMCreateFieldDecomposition_Moose() to obtain everything but outerislist, which is currently * PETSC_NULL. */ if (outerislist) *outerislist = PETSC_NULL; /* FIX: allow mesh-based overlap. 
*/ ierr = DMCreateFieldDecomposition_Moose(dm, len, namelist, innerislist, dmlist); CHKERRQ(ierr); PetscFunctionReturn(0); } On Thu, Nov 3, 2022 at 5:19 PM Matthew Knepley wrote: > On Thu, Nov 3, 2022 at 7:52 PM Alexander Lindsay > wrote: > >> I have errors on quite a few (but not all) processes of the like >> >> [1]PETSC ERROR: --------------------- Error Message >> -------------------------------------------------------------- >> [1]PETSC ERROR: Nonconforming object sizes >> [1]PETSC ERROR: Local columns of A10 4137 do not equal local rows of A00 >> 4129 >> >> when performing field splits. We (MOOSE) have some code for identifying >> the index sets for each split. However, the code was written by some >> authors who are no longer with us. Normally I would chase this down in a >> debugger, but this error only seems to crop up for pretty complex and large >> meshes. If anyone has an idea for what we might be doing wrong, that might >> help me chase this down faster. I guess intuitively I'm pretty perplexed >> that we could get ourselves into this pickle as it almost appears that we >> have two different local dof index counts for a given block (0 in this >> case). More background, if helpful, can be found in >> https://github.com/idaholab/moose/issues/22359 as well as >> https://github.com/idaholab/moose/discussions/22468. >> > > How are you specifying the blocks? I would not have thought this was > possible. > > Thanks, > > Matt > > >> I should note that we are currently running with 3.16.6 as our PETSc >> submodule hash (we are talking about updating to 3.18 soon). >> > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > > https://www.cse.buffalo.edu/~knepley/ > > -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From alexlindsay239 at gmail.com Mon Nov 7 13:09:34 2022 From: alexlindsay239 at gmail.com (Alexander Lindsay) Date: Mon, 7 Nov 2022 11:09:34 -0800 Subject: [petsc-users] Local columns of A10 do not equal local rows of A00 In-Reply-To: References: Message-ID: The libMesh/MOOSE specific code that identifies dof indices for ISCreateGeneral is in DMooseGetEmbedding_Private. I can share that function (it's quite long) or more details if that could be helpful. On Mon, Nov 7, 2022 at 10:55 AM Alexander Lindsay wrote: > I'm not sure exactly what you mean, but I'll try to give more details. We > have our own DM class (DM_Moose) and we set our own field and domain > decomposition routines: > > dm->ops->createfielddecomposition = DMCreateFieldDecomposition_Moose; > > dm->ops->createdomaindecomposition = DMCreateDomainDecomposition_Moose; > > > The field and domain decomposition routines are as follows (can see also > at > https://github.com/idaholab/moose/blob/next/framework/src/utils/PetscDMMoose.C > ): > > static PetscErrorCode > DMCreateFieldDecomposition_Moose( > DM dm, PetscInt * len, char *** namelist, IS ** islist, DM ** dmlist) > { > PetscErrorCode ierr; > DM_Moose * dmm = (DM_Moose *)(dm->data); > > PetscFunctionBegin; > /* Only called after DMSetUp(). 
*/ > if (!dmm->_splitlocs) > PetscFunctionReturn(0); > *len = dmm->_splitlocs->size(); > if (namelist) > { > ierr = PetscMalloc(*len * sizeof(char *), namelist); > CHKERRQ(ierr); > } > if (islist) > { > ierr = PetscMalloc(*len * sizeof(IS), islist); > CHKERRQ(ierr); > } > if (dmlist) > { > ierr = PetscMalloc(*len * sizeof(DM), dmlist); > CHKERRQ(ierr); > } > for (const auto & dit : *(dmm->_splitlocs)) > { > unsigned int d = dit.second; > std::string dname = dit.first; > DM_Moose::SplitInfo & dinfo = (*dmm->_splits)[dname]; > if (!dinfo._dm) > { > ierr = DMCreateMoose(((PetscObject)dm)->comm, *dmm->_nl, &dinfo._dm); > CHKERRQ(ierr); > ierr = PetscObjectSetOptionsPrefix((PetscObject)dinfo._dm, > ((PetscObject)dm)->prefix); > CHKERRQ(ierr); > std::string suffix = std::string("fieldsplit_") + dname + "_"; > ierr = PetscObjectAppendOptionsPrefix((PetscObject)dinfo._dm, > suffix.c_str()); > CHKERRQ(ierr); > } > ierr = DMSetFromOptions(dinfo._dm); > CHKERRQ(ierr); > ierr = DMSetUp(dinfo._dm); > CHKERRQ(ierr); > if (namelist) > { > ierr = PetscStrallocpy(dname.c_str(), (*namelist) + d); > CHKERRQ(ierr); > } > if (islist) > { > if (!dinfo._rembedding) > { > IS dembedding, lembedding; > ierr = DMMooseGetEmbedding_Private(dinfo._dm, &dembedding); > CHKERRQ(ierr); > if (dmm->_embedding) > { > // Create a relative embedding into the parent's index space. 
> ierr = ISEmbed(dembedding, dmm->_embedding, PETSC_TRUE, > &lembedding); > CHKERRQ(ierr); > const PetscInt * lindices; > PetscInt len, dlen, llen, *rindices, off, i; > ierr = ISGetLocalSize(dembedding, &dlen); > CHKERRQ(ierr); > ierr = ISGetLocalSize(lembedding, &llen); > CHKERRQ(ierr); > if (llen != dlen) > SETERRQ1(((PetscObject)dm)->comm, PETSC_ERR_PLIB, "Failed to > embed split %D", d); > ierr = ISDestroy(&dembedding); > CHKERRQ(ierr); > // Convert local embedding to global (but still relative) > embedding > ierr = PetscMalloc(llen * sizeof(PetscInt), &rindices); > CHKERRQ(ierr); > ierr = ISGetIndices(lembedding, &lindices); > CHKERRQ(ierr); > ierr = PetscMemcpy(rindices, lindices, llen * sizeof(PetscInt)); > CHKERRQ(ierr); > ierr = ISDestroy(&lembedding); > CHKERRQ(ierr); > // We could get the index offset from a corresponding global > vector, but subDMs don't yet > // have global vectors > ierr = ISGetLocalSize(dmm->_embedding, &len); > CHKERRQ(ierr); > > ierr = MPI_Scan(&len, > &off, > 1, > #ifdef PETSC_USE_64BIT_INDICES > MPI_LONG_LONG_INT, > #else > MPI_INT, > #endif > MPI_SUM, > ((PetscObject)dm)->comm); > CHKERRQ(ierr); > > off -= len; > for (i = 0; i < llen; ++i) > rindices[i] += off; > ierr = ISCreateGeneral( > ((PetscObject)dm)->comm, llen, rindices, PETSC_OWN_POINTER, > &(dinfo._rembedding)); > CHKERRQ(ierr); > } > else > { > dinfo._rembedding = dembedding; > } > } > ierr = PetscObjectReference((PetscObject)(dinfo._rembedding)); > CHKERRQ(ierr); > (*islist)[d] = dinfo._rembedding; > } > if (dmlist) > { > ierr = PetscObjectReference((PetscObject)dinfo._dm); > CHKERRQ(ierr); > (*dmlist)[d] = dinfo._dm; > } > } > PetscFunctionReturn(0); > } > > static PetscErrorCode > DMCreateDomainDecomposition_Moose( > DM dm, PetscInt * len, char *** namelist, IS ** innerislist, IS ** > outerislist, DM ** dmlist) > { > PetscErrorCode ierr; > > PetscFunctionBegin; > /* Use DMCreateFieldDecomposition_Moose() to obtain everything but > outerislist, which is currently > 
* PETSC_NULL. */ > if (outerislist) > *outerislist = PETSC_NULL; /* FIX: allow mesh-based overlap. */ > ierr = DMCreateFieldDecomposition_Moose(dm, len, namelist, innerislist, > dmlist); > CHKERRQ(ierr); > PetscFunctionReturn(0); > } > > > > On Thu, Nov 3, 2022 at 5:19 PM Matthew Knepley wrote: > >> On Thu, Nov 3, 2022 at 7:52 PM Alexander Lindsay < >> alexlindsay239 at gmail.com> wrote: >> >>> I have errors on quite a few (but not all) processes of the like >>> >>> [1]PETSC ERROR: --------------------- Error Message >>> -------------------------------------------------------------- >>> [1]PETSC ERROR: Nonconforming object sizes >>> [1]PETSC ERROR: Local columns of A10 4137 do not equal local rows of A00 >>> 4129 >>> >>> when performing field splits. We (MOOSE) have some code for identifying >>> the index sets for each split. However, the code was written by some >>> authors who are no longer with us. Normally I would chase this down in a >>> debugger, but this error only seems to crop up for pretty complex and large >>> meshes. If anyone has an idea for what we might be doing wrong, that might >>> help me chase this down faster. I guess intuitively I'm pretty perplexed >>> that we could get ourselves into this pickle as it almost appears that we >>> have two different local dof index counts for a given block (0 in this >>> case). More background, if helpful, can be found in >>> https://github.com/idaholab/moose/issues/22359 as well as >>> https://github.com/idaholab/moose/discussions/22468. >>> >> >> How are you specifying the blocks? I would not have thought this was >> possible. >> >> Thanks, >> >> Matt >> >> >>> I should note that we are currently running with 3.16.6 as our PETSc >>> submodule hash (we are talking about updating to 3.18 soon). >>> >> >> >> -- >> What most experimenters take for granted before they begin their >> experiments is infinitely more interesting than any results to which their >> experiments lead. 
>> -- Norbert Wiener >> >> https://www.cse.buffalo.edu/~knepley/ >> >> > -------------- next part -------------- An HTML attachment was scrubbed... URL: From hng.email at gmail.com Mon Nov 7 13:38:00 2022 From: hng.email at gmail.com (Hom Nath Gharti) Date: Mon, 7 Nov 2022 14:38:00 -0500 Subject: [petsc-users] Error configuring external packages In-Reply-To: <792FF04D-3C01-42D5-8C7E-CE88A1869211@petsc.dev> References: <792FF04D-3C01-42D5-8C7E-CE88A1869211@petsc.dev> Message-ID: Indeed! Thank you so much, Barry! Sorry for my oversight. Best, Hom Nath On Mon, Nov 7, 2022 at 1:19 PM Barry Smith wrote: > > The cmake in your path is broken > > TESTING: locateCMake from > config.packages.cmake(/gpfs/fs1/home/h/hngharti/hngharti/lsoft/petsc-gnu/config/BuildSystem/config/packages/cmake.py:53) > Looking for default CMake executable > Checking for program > /scinet/niagara/software/2019b/opt/gcc-9.4.0/openmpi/4.1.1/bin/cmake...not > found > Checking for program > /scinet/niagara/software/2019b/opt/base/gcc/9.4.0/bin/cmake...not found > Checking for program > /home/h/hngharti/hngharti/lsoft/cmake-3.23.1/bin/cmake...found > Defined make macro "CMAKE" to > "/home/h/hngharti/hngharti/lsoft/cmake-3.23.1/bin/cmake" > Looking for default CTest executable > Checking for program > /scinet/niagara/software/2019b/opt/gcc-9.4.0/openmpi/4.1.1/bin/ctest...not > found > Checking for program > /scinet/niagara/software/2019b/opt/base/gcc/9.4.0/bin/ctest...not found > Checking for program > /home/h/hngharti/hngharti/lsoft/cmake-3.23.1/bin/ctest...found > Defined make macro "CTEST" to > "/home/h/hngharti/hngharti/lsoft/cmake-3.23.1/bin/ctest" > Executing: /home/h/hngharti/hngharti/lsoft/cmake-3.23.1/bin/cmake --version > cmake --version failed: Could not execute > "['/home/h/hngharti/hngharti/lsoft/cmake-3.23.1/bin/cmake --version']": > /home/h/hngharti/hngharti/lsoft/cmake-3.23.1/bin/cmake: error while > loading shared libraries: libfabric.so.1: cannot open shared object file: > No such file 
or directory > > you need to either install a better cmake or have PETSc install one using > --download-cmake > > > > > On Nov 7, 2022, at 12:48 PM, Hom Nath Gharti > wrote: > > > > Dear all, > > > > I am trying to compile the latest version, but I am getting the > following error: > > > ******************************************************************************* > > UNABLE to CONFIGURE with GIVEN OPTIONS (see configure.log > for details): > > > ------------------------------------------------------------------------------- > > Error configuring METIS with CMake > > > ******************************************************************************* > > > > When I remove this package from the configure option, I get the same > error for other packages. Is there something wrong with my configure > command or compilers? > > > > Attached is the configure log file. > > > > Best, > > Hom Nath > > > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Mon Nov 7 14:30:26 2022 From: knepley at gmail.com (Matthew Knepley) Date: Mon, 7 Nov 2022 15:30:26 -0500 Subject: [petsc-users] Manually setting PetscSF In-Reply-To: References: Message-ID: On Mon, Nov 7, 2022 at 1:15 PM Nicholas Arnold-Medabalimi < narnoldm at umich.edu> wrote: > Hi Petsc Users, > > > I am working on setting up the star forest to connect my meshes on > different processors. > > For a 2-processor example, I have set up a dmplex object and read in the > coordinates nodes and cell2node charts on each one. But right now they are > independent effectively. > > For simplicity lets just say each process has 4 cells. > > [image: image.png] > > I'm trying to build the star forest that sets the edge between the two > partitions. (see fig) > > To my understanding, I do this using PetscSFSetGraph. I choose either 9-10 > on rank 0 or 1-2 on rank 1 to be the "roots". For simplicity lets say the > 1-2 on rank 1 are roots. 
> > (also for the rest of the discussion I'm switching 1 and 2 to be 0 and 1 > and 9 and 10 to be 8 9 ) (the figure is 1 indexed but for the code below it > will be 0) > > So on rank 0 I would set > > nleaves=2 > nroots= 8+c1 (graph is only cells and verts so its 8 verts plus the cell > count(c1) > > remote[0].rank=1 > remote[0].index=0+(cell count on rank1) > leaf[0]=8+c1 > > remote[1].rank=1 > remote[1].index=0+(cell count on rank1) > I think this is wrong, instead you want remote[1].index=1+(cell count on rank1) > leaf[1]=9+c1 > > > and on rank 1 we would set > > nroots= ncells+nverts > nleaves=0 > > Since its all roots I don't think I need to set anything else? > > > > > PetscSFSetGraph(sf,nroots,nleaves,leaves,PETSC_COPY_VALUES,remote,PETSC_COPY_VALUES) > > I am certain I am making a mistake somewhere since I get an error when I > then call PetscSFSetup. I am working in C++ right now but this is just > testing out before implementation in fortran which is why I am using > PETSC_COPY. > You could also probably use https://petsc.org/main/docs/manualpages/DMPlex/DMPlexCreateFromCellListParallelPetsc/ Thanks, Matt > Any help and clarification would be appreciated. > > > Sincerely > Nicholas > > > > -- > Nicholas Arnold-Medabalimi > > Ph.D. Candidate > Computational Aeroscience Lab > University of Michigan > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... 
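Putting Matt's index correction together with Nicholas's description, a minimal standalone sketch of the two-rank star forest follows. The local cell/vertex counts (nc = 4, nv = 10), the cells-first DMPlex point numbering, and the PetscCall() error-checking macros (PETSc 3.18+) are assumptions for illustration, not taken from the actual code under discussion:

```c
#include <petscsf.h>

/* Sketch of the SF from the figure: each rank is assumed to have
   nc = 4 cells and nv = 10 vertices; DMPlex numbers cells first,
   so local vertex v is local point nc + v.  Vertices 8 and 9 on
   rank 0 are leaves whose roots are vertices 0 and 1 on rank 1. */
int main(int argc, char **argv)
{
  PetscSF        sf;
  PetscMPIInt    rank;
  const PetscInt nc = 4, nv = 10; /* assumed local cell/vertex counts */

  PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));
  PetscCallMPI(MPI_Comm_rank(PETSC_COMM_WORLD, &rank));
  PetscCall(PetscSFCreate(PETSC_COMM_WORLD, &sf));
  if (rank == 0) { /* vertices 8 and 9 are leaves owned by rank 1 */
    PetscInt    leaves[2];
    PetscSFNode remote[2];

    leaves[0] = nc + 8; remote[0].rank = 1; remote[0].index = nc + 0;
    leaves[1] = nc + 9; remote[1].rank = 1; remote[1].index = nc + 1; /* nc + 1, not nc + 0 */
    PetscCall(PetscSFSetGraph(sf, nc + nv, 2, leaves, PETSC_COPY_VALUES, remote, PETSC_COPY_VALUES));
  } else {         /* rank 1: all local points are roots, no leaves */
    PetscCall(PetscSFSetGraph(sf, nc + nv, 0, NULL, PETSC_COPY_VALUES, NULL, PETSC_COPY_VALUES));
  }
  PetscCall(PetscSFSetUp(sf));
  PetscCall(PetscSFView(sf, PETSC_VIEWER_STDOUT_WORLD));
  PetscCall(PetscSFDestroy(&sf));
  PetscCall(PetscFinalize());
  return 0;
}
```

Run with mpiexec -n 2; PetscSFView prints the graph so each leaf-to-root edge can be checked against the figure before moving the logic into Fortran.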
Name: image.png Type: image/png Size: 18794 bytes Desc: not available URL: From knepley at gmail.com Mon Nov 7 14:32:54 2022 From: knepley at gmail.com (Matthew Knepley) Date: Mon, 7 Nov 2022 15:32:54 -0500 Subject: [petsc-users] Local columns of A10 do not equal local rows of A00 In-Reply-To: References: Message-ID: On Mon, Nov 7, 2022 at 2:09 PM Alexander Lindsay wrote: > The libMesh/MOOSE specific code that identifies dof indices for > ISCreateGeneral is in DMooseGetEmbedding_Private. I can share that function > (it's quite long) or more details if that could be helpful. > Sorry, I should have written more. The puzzling thing for me is that somehow it looks like the row and column index sets are not the same. I did not think PCFIELDSPLIT could do that. The PCFieldSplitSetIS() interface does not allow it. I was wondering how you were setting the ISes. Thanks, Matt > On Mon, Nov 7, 2022 at 10:55 AM Alexander Lindsay < > alexlindsay239 at gmail.com> wrote: > >> I'm not sure exactly what you mean, but I'll try to give more details. We >> have our own DM class (DM_Moose) and we set our own field and domain >> decomposition routines: >> >> dm->ops->createfielddecomposition = DMCreateFieldDecomposition_Moose; >> >> dm->ops->createdomaindecomposition = DMCreateDomainDecomposition_Moose; >> >> >> The field and domain decomposition routines are as follows (can see also >> at >> https://github.com/idaholab/moose/blob/next/framework/src/utils/PetscDMMoose.C >> ): >> >> static PetscErrorCode >> DMCreateFieldDecomposition_Moose( >> DM dm, PetscInt * len, char *** namelist, IS ** islist, DM ** dmlist) >> { >> PetscErrorCode ierr; >> DM_Moose * dmm = (DM_Moose *)(dm->data); >> >> PetscFunctionBegin; >> /* Only called after DMSetUp(). 
*/ >> if (!dmm->_splitlocs) >> PetscFunctionReturn(0); >> *len = dmm->_splitlocs->size(); >> if (namelist) >> { >> ierr = PetscMalloc(*len * sizeof(char *), namelist); >> CHKERRQ(ierr); >> } >> if (islist) >> { >> ierr = PetscMalloc(*len * sizeof(IS), islist); >> CHKERRQ(ierr); >> } >> if (dmlist) >> { >> ierr = PetscMalloc(*len * sizeof(DM), dmlist); >> CHKERRQ(ierr); >> } >> for (const auto & dit : *(dmm->_splitlocs)) >> { >> unsigned int d = dit.second; >> std::string dname = dit.first; >> DM_Moose::SplitInfo & dinfo = (*dmm->_splits)[dname]; >> if (!dinfo._dm) >> { >> ierr = DMCreateMoose(((PetscObject)dm)->comm, *dmm->_nl, >> &dinfo._dm); >> CHKERRQ(ierr); >> ierr = PetscObjectSetOptionsPrefix((PetscObject)dinfo._dm, >> ((PetscObject)dm)->prefix); >> CHKERRQ(ierr); >> std::string suffix = std::string("fieldsplit_") + dname + "_"; >> ierr = PetscObjectAppendOptionsPrefix((PetscObject)dinfo._dm, >> suffix.c_str()); >> CHKERRQ(ierr); >> } >> ierr = DMSetFromOptions(dinfo._dm); >> CHKERRQ(ierr); >> ierr = DMSetUp(dinfo._dm); >> CHKERRQ(ierr); >> if (namelist) >> { >> ierr = PetscStrallocpy(dname.c_str(), (*namelist) + d); >> CHKERRQ(ierr); >> } >> if (islist) >> { >> if (!dinfo._rembedding) >> { >> IS dembedding, lembedding; >> ierr = DMMooseGetEmbedding_Private(dinfo._dm, &dembedding); >> CHKERRQ(ierr); >> if (dmm->_embedding) >> { >> // Create a relative embedding into the parent's index space. 
>> ierr = ISEmbed(dembedding, dmm->_embedding, PETSC_TRUE, >> &lembedding); >> CHKERRQ(ierr); >> const PetscInt * lindices; >> PetscInt len, dlen, llen, *rindices, off, i; >> ierr = ISGetLocalSize(dembedding, &dlen); >> CHKERRQ(ierr); >> ierr = ISGetLocalSize(lembedding, &llen); >> CHKERRQ(ierr); >> if (llen != dlen) >> SETERRQ1(((PetscObject)dm)->comm, PETSC_ERR_PLIB, "Failed to >> embed split %D", d); >> ierr = ISDestroy(&dembedding); >> CHKERRQ(ierr); >> // Convert local embedding to global (but still relative) >> embedding >> ierr = PetscMalloc(llen * sizeof(PetscInt), &rindices); >> CHKERRQ(ierr); >> ierr = ISGetIndices(lembedding, &lindices); >> CHKERRQ(ierr); >> ierr = PetscMemcpy(rindices, lindices, llen * sizeof(PetscInt)); >> CHKERRQ(ierr); >> ierr = ISDestroy(&lembedding); >> CHKERRQ(ierr); >> // We could get the index offset from a corresponding global >> vector, but subDMs don't yet >> // have global vectors >> ierr = ISGetLocalSize(dmm->_embedding, &len); >> CHKERRQ(ierr); >> >> ierr = MPI_Scan(&len, >> &off, >> 1, >> #ifdef PETSC_USE_64BIT_INDICES >> MPI_LONG_LONG_INT, >> #else >> MPI_INT, >> #endif >> MPI_SUM, >> ((PetscObject)dm)->comm); >> CHKERRQ(ierr); >> >> off -= len; >> for (i = 0; i < llen; ++i) >> rindices[i] += off; >> ierr = ISCreateGeneral( >> ((PetscObject)dm)->comm, llen, rindices, PETSC_OWN_POINTER, >> &(dinfo._rembedding)); >> CHKERRQ(ierr); >> } >> else >> { >> dinfo._rembedding = dembedding; >> } >> } >> ierr = PetscObjectReference((PetscObject)(dinfo._rembedding)); >> CHKERRQ(ierr); >> (*islist)[d] = dinfo._rembedding; >> } >> if (dmlist) >> { >> ierr = PetscObjectReference((PetscObject)dinfo._dm); >> CHKERRQ(ierr); >> (*dmlist)[d] = dinfo._dm; >> } >> } >> PetscFunctionReturn(0); >> } >> >> static PetscErrorCode >> DMCreateDomainDecomposition_Moose( >> DM dm, PetscInt * len, char *** namelist, IS ** innerislist, IS ** >> outerislist, DM ** dmlist) >> { >> PetscErrorCode ierr; >> >> PetscFunctionBegin; >> /* Use 
DMCreateFieldDecomposition_Moose() to obtain everything but >> outerislist, which is currently >> * PETSC_NULL. */ >> if (outerislist) >> *outerislist = PETSC_NULL; /* FIX: allow mesh-based overlap. */ >> ierr = DMCreateFieldDecomposition_Moose(dm, len, namelist, innerislist, >> dmlist); >> CHKERRQ(ierr); >> PetscFunctionReturn(0); >> } >> >> >> >> On Thu, Nov 3, 2022 at 5:19 PM Matthew Knepley wrote: >> >>> On Thu, Nov 3, 2022 at 7:52 PM Alexander Lindsay < >>> alexlindsay239 at gmail.com> wrote: >>> >>>> I have errors on quite a few (but not all) processes of the like >>>> >>>> [1]PETSC ERROR: --------------------- Error Message >>>> -------------------------------------------------------------- >>>> [1]PETSC ERROR: Nonconforming object sizes >>>> [1]PETSC ERROR: Local columns of A10 4137 do not equal local rows of >>>> A00 4129 >>>> >>>> when performing field splits. We (MOOSE) have some code for identifying >>>> the index sets for each split. However, the code was written by some >>>> authors who are no longer with us. Normally I would chase this down in a >>>> debugger, but this error only seems to crop up for pretty complex and large >>>> meshes. If anyone has an idea for what we might be doing wrong, that might >>>> help me chase this down faster. I guess intuitively I'm pretty perplexed >>>> that we could get ourselves into this pickle as it almost appears that we >>>> have two different local dof index counts for a given block (0 in this >>>> case). More background, if helpful, can be found in >>>> https://github.com/idaholab/moose/issues/22359 as well as >>>> https://github.com/idaholab/moose/discussions/22468. >>>> >>> >>> How are you specifying the blocks? I would not have thought this was >>> possible. >>> >>> Thanks, >>> >>> Matt >>> >>> >>>> I should note that we are currently running with 3.16.6 as our PETSc >>>> submodule hash (we are talking about updating to 3.18 soon). 
>>>> >>> >>> >>> -- >>> What most experimenters take for granted before they begin their >>> experiments is infinitely more interesting than any results to which their >>> experiments lead. >>> -- Norbert Wiener >>> >>> https://www.cse.buffalo.edu/~knepley/ >>> >>> >> -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From alexlindsay239 at gmail.com Mon Nov 7 16:47:59 2022 From: alexlindsay239 at gmail.com (Alexander Lindsay) Date: Mon, 7 Nov 2022 14:47:59 -0800 Subject: [petsc-users] Local columns of A10 do not equal local rows of A00 In-Reply-To: References: Message-ID: My understanding looking at PCFieldSplitSetDefaults is that our implementation of `createfielddecomposition` should get called, we'll set `fields` and then (ignoring possible user setting of -pc_fieldsplit_%D_fields flag) PCFieldSplitSetIS will get called with whatever we did to `fields`. So yea I guess that just looking over that I would assume we're not supplying two different index sets for rows and columns, or put more precisely we (MOOSE) are not really afforded the opportunity to. But my interpretation could very well be wrong. On Mon, Nov 7, 2022 at 12:33 PM Matthew Knepley wrote: > On Mon, Nov 7, 2022 at 2:09 PM Alexander Lindsay > wrote: > >> The libMesh/MOOSE specific code that identifies dof indices for >> ISCreateGeneral is in DMooseGetEmbedding_Private. I can share that function >> (it's quite long) or more details if that could be helpful. >> > > Sorry, I should have written more. The puzzling thing for me is that > somehow it looks like the row and column index sets are not the same. I did > not think > PCFIELDSPLIT could do that. The PCFieldSplitSetIS() interface does not > allow it. I was wondering how you were setting the ISes. 
> > Thanks, > > Matt > > >> On Mon, Nov 7, 2022 at 10:55 AM Alexander Lindsay < >> alexlindsay239 at gmail.com> wrote: >> >>> I'm not sure exactly what you mean, but I'll try to give more details. >>> We have our own DM class (DM_Moose) and we set our own field and domain >>> decomposition routines: >>> >>> dm->ops->createfielddecomposition = DMCreateFieldDecomposition_Moose; >>> >>> dm->ops->createdomaindecomposition = >>> DMCreateDomainDecomposition_Moose; >>> >>> >>> The field and domain decomposition routines are as follows (can see also >>> at >>> https://github.com/idaholab/moose/blob/next/framework/src/utils/PetscDMMoose.C >>> ): >>> >>> static PetscErrorCode >>> DMCreateFieldDecomposition_Moose( >>> DM dm, PetscInt * len, char *** namelist, IS ** islist, DM ** dmlist) >>> { >>> PetscErrorCode ierr; >>> DM_Moose * dmm = (DM_Moose *)(dm->data); >>> >>> PetscFunctionBegin; >>> /* Only called after DMSetUp(). */ >>> if (!dmm->_splitlocs) >>> PetscFunctionReturn(0); >>> *len = dmm->_splitlocs->size(); >>> if (namelist) >>> { >>> ierr = PetscMalloc(*len * sizeof(char *), namelist); >>> CHKERRQ(ierr); >>> } >>> if (islist) >>> { >>> ierr = PetscMalloc(*len * sizeof(IS), islist); >>> CHKERRQ(ierr); >>> } >>> if (dmlist) >>> { >>> ierr = PetscMalloc(*len * sizeof(DM), dmlist); >>> CHKERRQ(ierr); >>> } >>> for (const auto & dit : *(dmm->_splitlocs)) >>> { >>> unsigned int d = dit.second; >>> std::string dname = dit.first; >>> DM_Moose::SplitInfo & dinfo = (*dmm->_splits)[dname]; >>> if (!dinfo._dm) >>> { >>> ierr = DMCreateMoose(((PetscObject)dm)->comm, *dmm->_nl, >>> &dinfo._dm); >>> CHKERRQ(ierr); >>> ierr = PetscObjectSetOptionsPrefix((PetscObject)dinfo._dm, >>> ((PetscObject)dm)->prefix); >>> CHKERRQ(ierr); >>> std::string suffix = std::string("fieldsplit_") + dname + "_"; >>> ierr = PetscObjectAppendOptionsPrefix((PetscObject)dinfo._dm, >>> suffix.c_str()); >>> CHKERRQ(ierr); >>> } >>> ierr = DMSetFromOptions(dinfo._dm); >>> CHKERRQ(ierr); >>> ierr = 
DMSetUp(dinfo._dm); >>> CHKERRQ(ierr); >>> if (namelist) >>> { >>> ierr = PetscStrallocpy(dname.c_str(), (*namelist) + d); >>> CHKERRQ(ierr); >>> } >>> if (islist) >>> { >>> if (!dinfo._rembedding) >>> { >>> IS dembedding, lembedding; >>> ierr = DMMooseGetEmbedding_Private(dinfo._dm, &dembedding); >>> CHKERRQ(ierr); >>> if (dmm->_embedding) >>> { >>> // Create a relative embedding into the parent's index space. >>> ierr = ISEmbed(dembedding, dmm->_embedding, PETSC_TRUE, >>> &lembedding); >>> CHKERRQ(ierr); >>> const PetscInt * lindices; >>> PetscInt len, dlen, llen, *rindices, off, i; >>> ierr = ISGetLocalSize(dembedding, &dlen); >>> CHKERRQ(ierr); >>> ierr = ISGetLocalSize(lembedding, &llen); >>> CHKERRQ(ierr); >>> if (llen != dlen) >>> SETERRQ1(((PetscObject)dm)->comm, PETSC_ERR_PLIB, "Failed to >>> embed split %D", d); >>> ierr = ISDestroy(&dembedding); >>> CHKERRQ(ierr); >>> // Convert local embedding to global (but still relative) >>> embedding >>> ierr = PetscMalloc(llen * sizeof(PetscInt), &rindices); >>> CHKERRQ(ierr); >>> ierr = ISGetIndices(lembedding, &lindices); >>> CHKERRQ(ierr); >>> ierr = PetscMemcpy(rindices, lindices, llen * >>> sizeof(PetscInt)); >>> CHKERRQ(ierr); >>> ierr = ISDestroy(&lembedding); >>> CHKERRQ(ierr); >>> // We could get the index offset from a corresponding global >>> vector, but subDMs don't yet >>> // have global vectors >>> ierr = ISGetLocalSize(dmm->_embedding, &len); >>> CHKERRQ(ierr); >>> >>> ierr = MPI_Scan(&len, >>> &off, >>> 1, >>> #ifdef PETSC_USE_64BIT_INDICES >>> MPI_LONG_LONG_INT, >>> #else >>> MPI_INT, >>> #endif >>> MPI_SUM, >>> ((PetscObject)dm)->comm); >>> CHKERRQ(ierr); >>> >>> off -= len; >>> for (i = 0; i < llen; ++i) >>> rindices[i] += off; >>> ierr = ISCreateGeneral( >>> ((PetscObject)dm)->comm, llen, rindices, >>> PETSC_OWN_POINTER, &(dinfo._rembedding)); >>> CHKERRQ(ierr); >>> } >>> else >>> { >>> dinfo._rembedding = dembedding; >>> } >>> } >>> ierr = PetscObjectReference((PetscObject)(dinfo._rembedding)); 
>>> CHKERRQ(ierr); >>> (*islist)[d] = dinfo._rembedding; >>> } >>> if (dmlist) >>> { >>> ierr = PetscObjectReference((PetscObject)dinfo._dm); >>> CHKERRQ(ierr); >>> (*dmlist)[d] = dinfo._dm; >>> } >>> } >>> PetscFunctionReturn(0); >>> } >>> >>> static PetscErrorCode >>> DMCreateDomainDecomposition_Moose( >>> DM dm, PetscInt * len, char *** namelist, IS ** innerislist, IS ** >>> outerislist, DM ** dmlist) >>> { >>> PetscErrorCode ierr; >>> >>> PetscFunctionBegin; >>> /* Use DMCreateFieldDecomposition_Moose() to obtain everything but >>> outerislist, which is currently >>> * PETSC_NULL. */ >>> if (outerislist) >>> *outerislist = PETSC_NULL; /* FIX: allow mesh-based overlap. */ >>> ierr = DMCreateFieldDecomposition_Moose(dm, len, namelist, >>> innerislist, dmlist); >>> CHKERRQ(ierr); >>> PetscFunctionReturn(0); >>> } >>> >>> >>> >>> On Thu, Nov 3, 2022 at 5:19 PM Matthew Knepley >>> wrote: >>> >>>> On Thu, Nov 3, 2022 at 7:52 PM Alexander Lindsay < >>>> alexlindsay239 at gmail.com> wrote: >>>> >>>>> I have errors on quite a few (but not all) processes of the like >>>>> >>>>> [1]PETSC ERROR: --------------------- Error Message >>>>> -------------------------------------------------------------- >>>>> [1]PETSC ERROR: Nonconforming object sizes >>>>> [1]PETSC ERROR: Local columns of A10 4137 do not equal local rows of >>>>> A00 4129 >>>>> >>>>> when performing field splits. We (MOOSE) have some code for >>>>> identifying the index sets for each split. However, the code was written by >>>>> some authors who are no longer with us. Normally I would chase this down in >>>>> a debugger, but this error only seems to crop up for pretty complex and >>>>> large meshes. If anyone has an idea for what we might be doing wrong, that >>>>> might help me chase this down faster. I guess intuitively I'm pretty >>>>> perplexed that we could get ourselves into this pickle as it almost appears >>>>> that we have two different local dof index counts for a given block (0 in >>>>> this case). 
More background, if helpful, can be found in >>>>> https://github.com/idaholab/moose/issues/22359 as well as >>>>> https://github.com/idaholab/moose/discussions/22468. >>>>> >>>> >>>> How are you specifying the blocks? I would not have thought this was >>>> possible. >>>> >>>> Thanks, >>>> >>>> Matt >>>> >>>> >>>>> I should note that we are currently running with 3.16.6 as our PETSc >>>>> submodule hash (we are talking about updating to 3.18 soon). >>>>> >>>> >>>> >>>> -- >>>> What most experimenters take for granted before they begin their >>>> experiments is infinitely more interesting than any results to which their >>>> experiments lead. >>>> -- Norbert Wiener >>>> >>>> https://www.cse.buffalo.edu/~knepley/ >>>> >>>> >>> > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > > https://www.cse.buffalo.edu/~knepley/ > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From mhyaqteen at sju.ac.kr Mon Nov 7 18:06:09 2022 From: mhyaqteen at sju.ac.kr (Mohammad Ali Yaqteen) Date: Tue, 8 Nov 2022 00:06:09 +0000 Subject: [petsc-users] PETSc Windows Installation In-Reply-To: <992C4377-8755-4EEA-83BC-4C1606C18AE9@joliv.et> References: <2d1a6386-1c0c-2c52-4696-40efbd3b8a11@mcs.anl.gov> <992C4377-8755-4EEA-83BC-4C1606C18AE9@joliv.et> Message-ID: Although I followed the steps in the explanation you sent me, these are the errors that I got: [cid:image001.png at 01D8F351.568A14A0] From: Pierre Jolivet Sent: Monday, November 7, 2022 10:38 PM To: Mohammad Ali Yaqteen Cc: petsc-users Subject: Re: [petsc-users] PETSc Windows Installation You are not running under a MinGW x64 shell, but a MinGW UCRT (Universal C Runtime) x64 shell instead. This may work, but I've never tried it myself. 
Thanks, Pierre On 7 Nov 2022, at 2:29 PM, Mohammad Ali Yaqteen > wrote: Do I have to follow all the steps in the first link as it says the following in step 6: $ pacman -S mingw-w64-ucrt-x86_64-gcc resolving dependencies... looking for conflicting packages... Packages (15) mingw-w64-ucrt-x86_64-binutils-2.39-2 mingw-w64-ucrt-x86_64-crt-git-10.0.0.r68.g6eb571448-1 mingw-w64-ucrt-x86_64-gcc-libs-12.2.0-1 mingw-w64-ucrt-x86_64-gmp-6.2.1-3 mingw-w64-ucrt-x86_64-headers-git-10.0.0.r68.g6eb571448-1 mingw-w64-ucrt-x86_64-isl-0.25-1 mingw-w64-ucrt-x86_64-libiconv-1.17-1 mingw-w64-ucrt-x86_64-libwinpthread-git-10.0.0.r68.g6eb571448-1 mingw-w64-ucrt-x86_64-mpc-1.2.1-1 mingw-w64-ucrt-x86_64-mpfr-4.1.0.p13-1 mingw-w64-ucrt-x86_64-windows-default-manifest-6.4-4 mingw-w64-ucrt-x86_64-winpthreads-git-10.0.0.r68.g6eb571448-1 mingw-w64-ucrt-x86_64-zlib-1.2.12-1 mingw-w64-ucrt-x86_64-zstd-1.5.2-2 mingw-w64-ucrt-x86_64-gcc-12.2.0-1 Total Installed Size: 397.59 MiB :: Proceed with installation? [Y/n] [... downloading and installation continues ...] Thanks Ali From: Pierre Jolivet > Sent: Monday, November 7, 2022 10:06 PM To: Mohammad Ali Yaqteen > Cc: petsc-users > Subject: Re: [petsc-users] PETSc Windows Installation Please, keep the list in copy. You can get MSYS2 from https://www.msys2.org/ Then install the following packages: https://github.com/FreeFem/FreeFem-sources/tree/master/etc/jenkins/deployRelease#windows-system Also install MS-MPI: https://www.microsoft.com/en-us/download/details.aspx?id=100593 Configure and compile PETSc under a MSYS2 MinGW x64 shell. Compile your code, and copy the binary. Notice in my screenshot that there are two shells, the MinGW one for building PETSc. The Microsoft (native one) for launching the binary. Thanks, Pierre On 7 Nov 2022, at 1:53 PM, Mohammad Ali Yaqteen > wrote: Is there a guide for it? That would be very useful! 
Because I have been trying a lot of things but every now and then there is a little step that is either outdated or can't run! Your help will be highly appreciated Thanks Ali From: Pierre Jolivet > Sent: Monday, November 7, 2022 9:50 PM To: Mohammad Ali Yaqteen > Cc: petsc-users > Subject: Re: [petsc-users] PETSc Windows Installation Or you can use MinGW, it's not tricky, you don't need to change any PETSc code, and you can ship .exe for either x86_64 (mingw-w64-x86_64-gcc) or ARM (mingw-w64-clang-aarch64-clang, without MPI). Thanks, Pierre On 7 Nov 2022, at 1:38 PM, hamid badi > wrote: You can try gcc/clang cross-compilers, it's a little bit tricky, I had to change some PETSc code but it works fine. On Mon, Nov 7, 2022 at 1:30 PM, Matthew Knepley > wrote: On Mon, Nov 7, 2022 at 7:11 AM Mohammad Ali Yaqteen > wrote: Once I finish writing the code, the .exe file will not change. Can I make an .exe file using WSL2 and VScode? If you build in WSL2, it will link to system libraries. You would probably need to run in WSL2 after that. If you are planning on running on native Windows, you likely need to build there. Thanks, Matt Thanks, Ali From: Matthew Knepley > Sent: Monday, November 7, 2022 7:13 PM To: Mohammad Ali Yaqteen > Cc: petsc-users > Subject: Re: [petsc-users] PETSc Windows Installation On Mon, Nov 7, 2022 at 12:21 AM Mohammad Ali Yaqteen > wrote: I have written backend code for a software company. If WSL2 and VSCode(Linux) can be called through a command line and executed at the backend, then it will be great. But if I have to install WSL2 and other required things on every other PC that will run that software, then I think I will be at a disadvantage. What do you suggest? As long as you do not change the architecture and the compiler libraries are available, you can run the executable. 
Thanks, Matt Thank you Ali -----Original Message----- From: Satish Balay > Sent: Monday, November 7, 2022 12:00 AM To: Matthew Knepley > Cc: Mohammad Ali Yaqteen >; petsc-users at mcs.anl.gov Subject: Re: [petsc-users] PETSc Windows Installation Likely the compilers are not setup correctly as per instructions. https://petsc.org/release/install/windows/#installation-with-microsoft-intel-windows-compilers And if you do not have a specific windows need - and only need IDE - perhaps a WSL2 (aka linux) install with VSCode(linux) might be the way to go. Satish On Sun, 6 Nov 2022, Matthew Knepley wrote: > We need to see configure.log to see what is going on. Can you send it? > > Thanks, > > Matt > > On Sun, Nov 6, 2022 at 4:29 AM Mohammad Ali Yaqteen > > > wrote: > > > Dear Sir/Madam, > > > > > > > > I am installing PETSc on windows but it keeps giving me unexpected errors. > > I want to use it on MS Visual Studio or Codeblocks. When I use the > > command on your webpage (./configure --with-cc='win32fe cl' > > --with-fc='win32fe ifort' --with-cxx='win32fe cl' --with-mpi=0 > > --download-fblaslapack), I get the following error message: > > > > > > > > $ ./configure --with-cc='win32fe cl' --with-fc='win32fe ifort' > > --with-cxx='win32fe cl' --with-mpi=0 --download-fblaslapack > > > > > > ==================================================================== > > ========================= > > > > Configuring PETSc to compile on your system > > > > > > ==================================================================== > > ========================= > > > > TESTING: checkCCompiler from > > config.setCompilers(config/BuildSystem/config/setCompilers.py:1341)* > > ******************************************************************** > > ********** > > > > UNABLE to CONFIGURE with GIVEN OPTIONS (see configure.log for > > details): > > > > > > -------------------------------------------------------------------- > > ----------- > > > > C compiler you provided with -with-cc=win32fe cl 
cannot be found or > > does not work. > > > > Cannot compile/link C with > > /home/SEJONG/petsc-3.18.1/lib/petsc/bin/win32fe/win32fe cl. > > > > > > > > Kindly look into this problem! Your prompt response will highly be > > appreciated > > > > > > > > Thank you > > > > Ali > > > > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: image001.png Type: image/png Size: 14305 bytes Desc: image001.png URL: From knepley at gmail.com Mon Nov 7 20:04:11 2022 From: knepley at gmail.com (Matthew Knepley) Date: Mon, 7 Nov 2022 21:04:11 -0500 Subject: [petsc-users] Local columns of A10 do not equal local rows of A00 In-Reply-To: References: Message-ID: On Mon, Nov 7, 2022 at 5:48 PM Alexander Lindsay wrote: > My understanding looking at PCFieldSplitSetDefaults is that our > implementation of `createfielddecomposition` should get called, we'll set > `fields` and then (ignoring possible user setting of > -pc_fieldsplit_%D_fields flag) PCFieldSplitSetIS will get called with > whatever we did to `fields`. So yea I guess that just looking over that I > would assume we're not supplying two different index sets for rows and > columns, or put more precisely we (MOOSE) are not really afforded the > opportunity to. But my interpretation could very well be wrong. > Oh wait. I read the error message again. It does not say that the whole selection is rectangular. 
It says Local columns of A10 4137 do not equal local rows of A00 4129 So this is a parallel partitioning thing. Since A00 has 4129 local rows, it should have this many columns as well. However A10 has 4137 local columns. How big is IS_0, on each process, that you pass in to PCFIELDSPLIT? Thanks, Matt > On Mon, Nov 7, 2022 at 12:33 PM Matthew Knepley wrote: > >> On Mon, Nov 7, 2022 at 2:09 PM Alexander Lindsay < >> alexlindsay239 at gmail.com> wrote: >> >>> The libMesh/MOOSE specific code that identifies dof indices for >>> ISCreateGeneral is in DMooseGetEmbedding_Private. I can share that function >>> (it's quite long) or more details if that could be helpful. >>> >> >> Sorry, I should have written more. The puzzling thing for me is that >> somehow it looks like the row and column index sets are not the same. I did >> not think >> PCFIELDSPLIT could do that. The PCFieldSplitSetIS() interface does not >> allow it. I was wondering how you were setting the ISes. >> >> Thanks, >> >> Matt >> >> >>> On Mon, Nov 7, 2022 at 10:55 AM Alexander Lindsay < >>> alexlindsay239 at gmail.com> wrote: >>> >>>> I'm not sure exactly what you mean, but I'll try to give more details. >>>> We have our own DM class (DM_Moose) and we set our own field and domain >>>> decomposition routines: >>>> >>>> dm->ops->createfielddecomposition = DMCreateFieldDecomposition_Moose; >>>> >>>> dm->ops->createdomaindecomposition = >>>> DMCreateDomainDecomposition_Moose; >>>> >>>> >>>> The field and domain decomposition routines are as follows (can see >>>> also at >>>> https://github.com/idaholab/moose/blob/next/framework/src/utils/PetscDMMoose.C >>>> ): >>>> >>>> static PetscErrorCode >>>> DMCreateFieldDecomposition_Moose( >>>> DM dm, PetscInt * len, char *** namelist, IS ** islist, DM ** >>>> dmlist) >>>> { >>>> PetscErrorCode ierr; >>>> DM_Moose * dmm = (DM_Moose *)(dm->data); >>>> >>>> PetscFunctionBegin; >>>> /* Only called after DMSetUp(). 
*/ >>>> if (!dmm->_splitlocs) >>>> PetscFunctionReturn(0); >>>> *len = dmm->_splitlocs->size(); >>>> if (namelist) >>>> { >>>> ierr = PetscMalloc(*len * sizeof(char *), namelist); >>>> CHKERRQ(ierr); >>>> } >>>> if (islist) >>>> { >>>> ierr = PetscMalloc(*len * sizeof(IS), islist); >>>> CHKERRQ(ierr); >>>> } >>>> if (dmlist) >>>> { >>>> ierr = PetscMalloc(*len * sizeof(DM), dmlist); >>>> CHKERRQ(ierr); >>>> } >>>> for (const auto & dit : *(dmm->_splitlocs)) >>>> { >>>> unsigned int d = dit.second; >>>> std::string dname = dit.first; >>>> DM_Moose::SplitInfo & dinfo = (*dmm->_splits)[dname]; >>>> if (!dinfo._dm) >>>> { >>>> ierr = DMCreateMoose(((PetscObject)dm)->comm, *dmm->_nl, >>>> &dinfo._dm); >>>> CHKERRQ(ierr); >>>> ierr = PetscObjectSetOptionsPrefix((PetscObject)dinfo._dm, >>>> ((PetscObject)dm)->prefix); >>>> CHKERRQ(ierr); >>>> std::string suffix = std::string("fieldsplit_") + dname + "_"; >>>> ierr = PetscObjectAppendOptionsPrefix((PetscObject)dinfo._dm, >>>> suffix.c_str()); >>>> CHKERRQ(ierr); >>>> } >>>> ierr = DMSetFromOptions(dinfo._dm); >>>> CHKERRQ(ierr); >>>> ierr = DMSetUp(dinfo._dm); >>>> CHKERRQ(ierr); >>>> if (namelist) >>>> { >>>> ierr = PetscStrallocpy(dname.c_str(), (*namelist) + d); >>>> CHKERRQ(ierr); >>>> } >>>> if (islist) >>>> { >>>> if (!dinfo._rembedding) >>>> { >>>> IS dembedding, lembedding; >>>> ierr = DMMooseGetEmbedding_Private(dinfo._dm, &dembedding); >>>> CHKERRQ(ierr); >>>> if (dmm->_embedding) >>>> { >>>> // Create a relative embedding into the parent's index space. 
>>>> ierr = ISEmbed(dembedding, dmm->_embedding, PETSC_TRUE, >>>> &lembedding); >>>> CHKERRQ(ierr); >>>> const PetscInt * lindices; >>>> PetscInt len, dlen, llen, *rindices, off, i; >>>> ierr = ISGetLocalSize(dembedding, &dlen); >>>> CHKERRQ(ierr); >>>> ierr = ISGetLocalSize(lembedding, &llen); >>>> CHKERRQ(ierr); >>>> if (llen != dlen) >>>> SETERRQ1(((PetscObject)dm)->comm, PETSC_ERR_PLIB, "Failed >>>> to embed split %D", d); >>>> ierr = ISDestroy(&dembedding); >>>> CHKERRQ(ierr); >>>> // Convert local embedding to global (but still relative) >>>> embedding >>>> ierr = PetscMalloc(llen * sizeof(PetscInt), &rindices); >>>> CHKERRQ(ierr); >>>> ierr = ISGetIndices(lembedding, &lindices); >>>> CHKERRQ(ierr); >>>> ierr = PetscMemcpy(rindices, lindices, llen * >>>> sizeof(PetscInt)); >>>> CHKERRQ(ierr); >>>> ierr = ISDestroy(&lembedding); >>>> CHKERRQ(ierr); >>>> // We could get the index offset from a corresponding global >>>> vector, but subDMs don't yet >>>> // have global vectors >>>> ierr = ISGetLocalSize(dmm->_embedding, &len); >>>> CHKERRQ(ierr); >>>> >>>> ierr = MPI_Scan(&len, >>>> &off, >>>> 1, >>>> #ifdef PETSC_USE_64BIT_INDICES >>>> MPI_LONG_LONG_INT, >>>> #else >>>> MPI_INT, >>>> #endif >>>> MPI_SUM, >>>> ((PetscObject)dm)->comm); >>>> CHKERRQ(ierr); >>>> >>>> off -= len; >>>> for (i = 0; i < llen; ++i) >>>> rindices[i] += off; >>>> ierr = ISCreateGeneral( >>>> ((PetscObject)dm)->comm, llen, rindices, >>>> PETSC_OWN_POINTER, &(dinfo._rembedding)); >>>> CHKERRQ(ierr); >>>> } >>>> else >>>> { >>>> dinfo._rembedding = dembedding; >>>> } >>>> } >>>> ierr = PetscObjectReference((PetscObject)(dinfo._rembedding)); >>>> CHKERRQ(ierr); >>>> (*islist)[d] = dinfo._rembedding; >>>> } >>>> if (dmlist) >>>> { >>>> ierr = PetscObjectReference((PetscObject)dinfo._dm); >>>> CHKERRQ(ierr); >>>> (*dmlist)[d] = dinfo._dm; >>>> } >>>> } >>>> PetscFunctionReturn(0); >>>> } >>>> >>>> static PetscErrorCode >>>> DMCreateDomainDecomposition_Moose( >>>> DM dm, PetscInt * len, char *** 
namelist, IS ** innerislist, IS ** >>>> outerislist, DM ** dmlist) >>>> { >>>> PetscErrorCode ierr; >>>> >>>> PetscFunctionBegin; >>>> /* Use DMCreateFieldDecomposition_Moose() to obtain everything but >>>> outerislist, which is currently >>>> * PETSC_NULL. */ >>>> if (outerislist) >>>> *outerislist = PETSC_NULL; /* FIX: allow mesh-based overlap. */ >>>> ierr = DMCreateFieldDecomposition_Moose(dm, len, namelist, >>>> innerislist, dmlist); >>>> CHKERRQ(ierr); >>>> PetscFunctionReturn(0); >>>> } >>>> >>>> >>>> >>>> On Thu, Nov 3, 2022 at 5:19 PM Matthew Knepley >>>> wrote: >>>> >>>>> On Thu, Nov 3, 2022 at 7:52 PM Alexander Lindsay < >>>>> alexlindsay239 at gmail.com> wrote: >>>>> >>>>>> I have errors on quite a few (but not all) processes of the like >>>>>> >>>>>> [1]PETSC ERROR: --------------------- Error Message >>>>>> -------------------------------------------------------------- >>>>>> [1]PETSC ERROR: Nonconforming object sizes >>>>>> [1]PETSC ERROR: Local columns of A10 4137 do not equal local rows of >>>>>> A00 4129 >>>>>> >>>>>> when performing field splits. We (MOOSE) have some code for >>>>>> identifying the index sets for each split. However, the code was written by >>>>>> some authors who are no longer with us. Normally I would chase this down in >>>>>> a debugger, but this error only seems to crop up for pretty complex and >>>>>> large meshes. If anyone has an idea for what we might be doing wrong, that >>>>>> might help me chase this down faster. I guess intuitively I'm pretty >>>>>> perplexed that we could get ourselves into this pickle as it almost appears >>>>>> that we have two different local dof index counts for a given block (0 in >>>>>> this case). More background, if helpful, can be found in >>>>>> https://github.com/idaholab/moose/issues/22359 as well as >>>>>> https://github.com/idaholab/moose/discussions/22468. >>>>>> >>>>> >>>>> How are you specifying the blocks? I would not have thought this was >>>>> possible. 
>>>>> >>>>> Thanks, >>>>> >>>>> Matt >>>>> >>>>> >>>>>> I should note that we are currently running with 3.16.6 as our PETSc >>>>>> submodule hash (we are talking about updating to 3.18 soon). >>>>>> >>>>> >>>>> >>>>> -- >>>>> What most experimenters take for granted before they begin their >>>>> experiments is infinitely more interesting than any results to which their >>>>> experiments lead. >>>>> -- Norbert Wiener >>>>> >>>>> https://www.cse.buffalo.edu/~knepley/ >>>>> >>>>> >>>> >> >> -- >> What most experimenters take for granted before they begin their >> experiments is infinitely more interesting than any results to which their >> experiments lead. >> -- Norbert Wiener >> >> https://www.cse.buffalo.edu/~knepley/ >> >> > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From pierre at joliv.et Mon Nov 7 23:23:04 2022 From: pierre at joliv.et (Pierre Jolivet) Date: Tue, 8 Nov 2022 06:23:04 +0100 Subject: [petsc-users] PETSc Windows Installation In-Reply-To: References: Message-ID: My bad, I sent outdated instructions. Here is the proper list of packages: https://github.com/FreeFem/FreeFem-sources/tree/develop/etc/jenkins/deployRelease#windows (develop branch instead of master) If the pacman -R command fails, that's no big deal. By default, PETSc configure picks up the wrong Python and assumes that we are running Cygwin, which is wrong. You can force the use of the "proper" Python by using /usr/bin/python ./configure instead of just ./configure Thanks, Pierre > On 8 Nov 2022, at 1:06 AM, Mohammad Ali Yaqteen wrote: > > 
> Although I followed the steps in the explanation you sent me, these are the errors that I got: > > > > From: Pierre Jolivet > Sent: Monday, November 7, 2022 10:38 PM > To: Mohammad Ali Yaqteen > Cc: petsc-users > Subject: Re: [petsc-users] PETSc Windows Installation > > You are not running under a MinGW x64 shell, but a MinGW UCRT (Universal C Runtime) x64 shell instead. > This may work, but I've never tried it myself. > > Thanks, > Pierre > > > On 7 Nov 2022, at 2:29 PM, Mohammad Ali Yaqteen wrote: > > Do I have to follow all the steps in the first link as it says the following in step 6: > > $ pacman -S mingw-w64-ucrt-x86_64-gcc > resolving dependencies... > looking for conflicting packages... > > Packages (15) mingw-w64-ucrt-x86_64-binutils-2.39-2 > mingw-w64-ucrt-x86_64-crt-git-10.0.0.r68.g6eb571448-1 > mingw-w64-ucrt-x86_64-gcc-libs-12.2.0-1 mingw-w64-ucrt-x86_64-gmp-6.2.1-3 > mingw-w64-ucrt-x86_64-headers-git-10.0.0.r68.g6eb571448-1 > mingw-w64-ucrt-x86_64-isl-0.25-1 mingw-w64-ucrt-x86_64-libiconv-1.17-1 > mingw-w64-ucrt-x86_64-libwinpthread-git-10.0.0.r68.g6eb571448-1 > mingw-w64-ucrt-x86_64-mpc-1.2.1-1 mingw-w64-ucrt-x86_64-mpfr-4.1.0.p13-1 > mingw-w64-ucrt-x86_64-windows-default-manifest-6.4-4 > mingw-w64-ucrt-x86_64-winpthreads-git-10.0.0.r68.g6eb571448-1 > mingw-w64-ucrt-x86_64-zlib-1.2.12-1 mingw-w64-ucrt-x86_64-zstd-1.5.2-2 > mingw-w64-ucrt-x86_64-gcc-12.2.0-1 > > Total Installed Size: 397.59 MiB > > :: Proceed with installation? [Y/n] > [... downloading and installation continues ...] > > Thanks > Ali > > > From: Pierre Jolivet > Sent: Monday, November 7, 2022 10:06 PM > To: Mohammad Ali Yaqteen > Cc: petsc-users > Subject: Re: [petsc-users] PETSc Windows Installation > > Please, keep the list in copy. 
> You can get MSYS2 from https://www.msys2.org/ > Then install the following packages: https://github.com/FreeFem/FreeFem-sources/tree/master/etc/jenkins/deployRelease#windows-system > Also install MS-MPI: https://www.microsoft.com/en-us/download/details.aspx?id=100593 > Configure and compile PETSc under a MSYS2 MinGW x64 shell. > Compile your code, and copy the binary. > Notice in my screenshot that there are two shells, the MinGW one for building PETSc. > The Microsoft (native one) for launching the binary. > > Thanks, > Pierre > > > > > > On 7 Nov 2022, at 1:53 PM, Mohammad Ali Yaqteen wrote: > > Is there a guide for it? That would be very useful! Because I have been trying a lot of things but every now and then there is a little step that is either outdated or can't run! > > Your help will be highly appreciated > > Thanks > Ali > > From: Pierre Jolivet > Sent: Monday, November 7, 2022 9:50 PM > To: Mohammad Ali Yaqteen > Cc: petsc-users > Subject: Re: [petsc-users] PETSc Windows Installation > > Or you can use MinGW, it's not tricky, you don't need to change any PETSc code, and you can ship .exe for either x86_64 (mingw-w64-x86_64-gcc) or ARM (mingw-w64-clang-aarch64-clang, without MPI). > > Thanks, > Pierre > > > > > On 7 Nov 2022, at 1:38 PM, hamid badi wrote: > > You can try gcc/clang cross-compilers, it's a little bit tricky, I had to change some PETSc code but it works fine. > > On Mon, Nov 7, 2022 at 1:30 PM, Matthew Knepley wrote: > On Mon, Nov 7, 2022 at 7:11 AM Mohammad Ali Yaqteen wrote: > Once I finish writing the code, the .exe file will not change. Can I make an .exe file using WSL2 and VScode? > > If you build in WSL2, it will link to system libraries. You would probably need to run in WSL2 after that. If you are planning > on running on native Windows, you likely need to build there. 
> > Thanks, > > Matt > > Thanks, > Ali > > From: Matthew Knepley > Sent: Monday, November 7, 2022 7:13 PM > To: Mohammad Ali Yaqteen > Cc: petsc-users > Subject: Re: [petsc-users] PETSc Windows Installation > > On Mon, Nov 7, 2022 at 12:21 AM Mohammad Ali Yaqteen wrote: > I have written backend code for a software company. If WSL2 and VSCode(Linux) can be called through a command line and executed at the backend, then it will be great. But if I have to install WSL2 and other required things on every other PC that will run that software, then I think I will be at a disadvantage. What do you suggest? > > As long as you do not change the architecture and the compiler libraries are available, you can run the executable. > > Thanks, > > Matt > > Thank you > Ali > > -----Original Message----- > From: Satish Balay > Sent: Monday, November 7, 2022 12:00 AM > To: Matthew Knepley > Cc: Mohammad Ali Yaqteen ; petsc-users at mcs.anl.gov > Subject: Re: [petsc-users] PETSc Windows Installation > > Likely the compilers are not setup correctly as per instructions. > > https://petsc.org/release/install/windows/#installation-with-microsoft-intel-windows-compilers > > And if you do not have a specific windows need - and only need IDE - perhaps a WSL2 (aka linux) install with VSCode(linux) might be the way to go. > > Satish > > On Sun, 6 Nov 2022, Matthew Knepley wrote: > > > We need to see configure.log to see what is going on. Can you send it? > > > > Thanks, > > > > Matt > > > > On Sun, Nov 6, 2022 at 4:29 AM Mohammad Ali Yaqteen > > > > wrote: > > > > > Dear Sir/Madam, > > > > > > > > > > > > I am installing PETSc on windows but it keeps giving me unexpected errors. > > > I want to use it on MS Visual Studio or Codeblocks. 
When I use the > > > command on your webpage (./configure --with-cc='win32fe cl' > > > --with-fc='win32fe ifort' --with-cxx='win32fe cl' --with-mpi=0 > > > --download-fblaslapack), I get the following error message: > > > > > > > > > > > > $ ./configure --with-cc='win32fe cl' --with-fc='win32fe ifort' > > > --with-cxx='win32fe cl' --with-mpi=0 --download-fblaslapack > > > > > > > > > ==================================================================== > > > ========================= > > > > > > Configuring PETSc to compile on your system > > > > > > > > > ==================================================================== > > > ========================= > > > > > > TESTING: checkCCompiler from > > > config.setCompilers(config/BuildSystem/config/setCompilers.py:1341)* > > > ******************************************************************** > > > ********** > > > > > > UNABLE to CONFIGURE with GIVEN OPTIONS (see configure.log for > > > details): > > > > > > > > > -------------------------------------------------------------------- > > > ----------- > > > > > > C compiler you provided with -with-cc=win32fe cl cannot be found or > > > does not work. > > > > > > Cannot compile/link C with > > > /home/SEJONG/petsc-3.18.1/lib/petsc/bin/win32fe/win32fe cl. > > > > > > > > > > > > Kindly look into this problem! Your prompt response will highly be > > > appreciated > > > > > > > > > > > > Thank you > > > > > > Ali > > > > > > > > > > > > > -- > What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. > -- Norbert Wiener > > https://www.cse.buffalo.edu/~knepley/ > > > -- > What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. > -- Norbert Wiener > > https://www.cse.buffalo.edu/~knepley/ > > -------------- next part -------------- An HTML attachment was scrubbed... 
URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: image001.png Type: image/png Size: 14305 bytes Desc: not available URL: From francesc.levrero-florencio at ansys.com Tue Nov 8 09:15:04 2022 From: francesc.levrero-florencio at ansys.com (Francesc Levrero-Florencio) Date: Tue, 8 Nov 2022 15:15:04 +0000 Subject: [petsc-users] TSBEULER vs TSPSEUDO Message-ID: Hi PETSc people, We are running highly nonlinear quasi-static (steady-state) mechanical finite element problems with PETSc, currently using TSBEULER and the basic time adapt scheme. What we do in order to tackle these nonlinear problems is to parametrize the applied loads with the time in the TS and apply them incrementally. While this usually works well, we have seen instances in which the adaptor would reject the time step according to the calculated truncation errors, even if the SNES converges in a small number of iterations. Another issue that we have recently observed is that in a sequence of converged time steps the adaptor decides to start cutting the time step to smaller and smaller values using the low clip default value of TSAdaptGetClip (again because the truncation errors are high enough). What can we do in order to avoid these issues? The first one is avoided by using TSAdaptSetAlwaysAccept, but the latter remains. We have tried setting the low clip value to its maximum accepted value of 1, but then the time increment does not increase even if the SNES always converges in 3 or 4 iterations. Maybe a solution is to increase the tolerances of the TSAdapt? Another potential solution we have recently tried in order to tackle these issues is using TSPSEUDO (and deparametrizing the applied loads), but generally find that it takes a much longer time to reach an acceptable solution compared with TSBEULER. We have mostly used the default KSPONLY option, but we'd like to explore TSPSEUDO with NEWTONLS. 
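The adaptor controls discussed above (always accepting steps, the clip bounds from TSAdaptGetClip, and the truncation-error tolerances) can also be set at runtime rather than in code. A hedged sketch of the relevant options, with a hypothetical executable name; option names should be verified against your PETSc version with -help:

```shell
# -ts_adapt_always_accept : accept steps even when the LTE estimate is high
# -ts_adapt_clip low,high : bound how fast dt may shrink/grow per step
# -ts_adapt_dt_min/_max   : hard limits on the step size
# -ts_rtol / -ts_atol     : loosen the truncation-error tolerances
./my_fem_solver -ts_type beuler -ts_adapt_type basic \
    -ts_adapt_always_accept -ts_adapt_clip 0.5,2.0 \
    -ts_adapt_dt_min 1e-6 -ts_adapt_dt_max 0.1 \
    -ts_rtol 1e-3 -ts_atol 1e-3
```

Running once with -ts_adapt_monitor shows which criterion is rejecting or shrinking the steps.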
A first question would be: what happens if the SNES fails to converge, does the solution get updated somehow in the corresponding time step? We have performed a few tests with TSPSEUDO and NEWTONLS, setting the maximum number of SNES iterations to a relatively low number (e.g. 5), and then always setting the SNES as converged in the poststage function, and found that it performs reasonably well, at least better than with the default KSPONLY (does this make any sense?). Thanks a lot! Regards, Francesc. -------------- next part -------------- An HTML attachment was scrubbed... URL: From longtuteng249 at gmail.com Tue Nov 8 09:27:55 2022 From: longtuteng249 at gmail.com (Jianbo Long) Date: Tue, 8 Nov 2022 16:27:55 +0100 Subject: [petsc-users] [petsc-maint] Issues linking petsc header files and lib from FORTRAN codes In-Reply-To: <68428260-9036-a81b-8a79-b6daf14667c0@mcs.anl.gov> References: <38802c8b-fca9-a502-57d1-7d52062662ea@mcs.anl.gov> <68428260-9036-a81b-8a79-b6daf14667c0@mcs.anl.gov> Message-ID: I am suspecting something else as well ... Could you elaborate more about "mixing c++ codes compiled with /usr/bin/g++ and compilers in /cluster/software/GCCcore/11.2.0" ? My own Fortran code does not have any c++ codes, and for some reason, the compiled petsc library is dependent on this libstdc++.so.6. I am sure about this because without linking the petsc, I don't have this libstdc++ trouble. 
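One detail worth noting in the errors quoted above: they say "required by .../bin/ld.gold", i.e. it is the binutils linker binary itself that needs a newer libstdc++ than the system /lib64 copy provides, independently of what libpetsc.so links against. A hedged diagnostic sketch (the cluster paths are taken from the messages above and are environment-specific):

```shell
# Which libstdc++ the failing tool actually resolves to at runtime:
ldd /cluster/software/binutils/2.37-GCCcore-11.2.0/bin/ld.gold | grep 'libstdc++'

# Compare the symbol versions provided by the system copy with those
# provided by the GCCcore module's copy:
strings /lib64/libstdc++.so.6 | grep -E '^GLIBCXX_|^CXXABI_' | sort -V | tail
strings /cluster/software/GCCcore/11.2.0/lib64/libstdc++.so.6 | grep -E '^GLIBCXX_' | sort -V | tail

# If the module's copy is newer, putting its lib64 first on the runtime
# search path is one common (environment-specific) workaround:
export LD_LIBRARY_PATH=/cluster/software/GCCcore/11.2.0/lib64:$LD_LIBRARY_PATH
```

If the second `strings` listing shows the versions the errors complain about (e.g. GLIBCXX_3.4.29), the mismatch is between the module toolchain and the system libstdc++, not PETSc itself.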
Thanks, Jianbo On Mon, Nov 7, 2022 at 7:10 PM Satish Balay wrote: > Likely due to mixing c++ codes compiled with /usr/bin/g++ and compilers in > /cluster/software/GCCcore/11.2.0 > > if you still get this with --with-cxx=0 - then the issue with some other > [non-petsc library] > > Satish > > On Mon, 7 Nov 2022, Jianbo Long wrote: > > > Hi Satish, > > > > I wonder if you know anything about another issue: after compiling petsc > on > > a cluster, when I tried to link my Fortran code with compiled > libpetsc.so, > > the shared library, I got the following errors: > > /cluster/software/binutils/2.37-GCCcore-11.2.0/bin/ld.gold: > > /lib64/libstdc++.so.6: version `CXXABI_1.3.9' not found (required by > > /cluster/software/binutils/2.37-GCCcore-11.2.0/bin/ld.gold) > > /cluster/software/binutils/2.37-GCCcore-11.2.0/bin/ld.gold: > > /lib64/libstdc++.so.6: version `GLIBCXX_3.4.21' not found (required by > > /cluster/software/binutils/2.37-GCCcore-11.2.0/bin/ld.gold) > > /cluster/software/binutils/2.37-GCCcore-11.2.0/bin/ld.gold: > > /lib64/libstdc++.so.6: version `GLIBCXX_3.4.29' not found (required by > > /cluster/software/binutils/2.37-GCCcore-11.2.0/bin/ld.gold) > > /cluster/software/binutils/2.37-GCCcore-11.2.0/bin/ld.gold: > > /lib64/libstdc++.so.6: version `GLIBCXX_3.4.20' not found (required by > > /cluster/software/binutils/2.37-GCCcore-11.2.0/bin/ld.gold) > > /cluster/software/binutils/2.37-GCCcore-11.2.0/bin/ld.gold: > > /lib64/libstdc++.so.6: version `CXXABI_1.3.8' not found (required by > > /cluster/software/binutils/2.37-GCCcore-11.2.0/bin/ld.gold) > > > > Not sure if it is related to discussion in this post ( > > https://gitlab.com/petsc/petsc/-/issues/997), but after I tried the > > configure option --with-cxx=0, I still got the same errors. > > My make.log file for compiling petsc is attached here. 
Also, the > > dependencies of the compiled petsc is: > > > > >>: ldd arch-linux-c-debug/lib/libpetsc.so > > linux-vdso.so.1 => (0x00007ffd80348000) > > libflexiblas.so.3 => > > /cluster/software/FlexiBLAS/3.0.4-GCC-11.2.0/lib/libflexiblas.so.3 > > (0x00007f6e8b93f000) > > libpthread.so.0 => /usr/lib64/libpthread.so.0 (0x00007f6e8b723000) > > libm.so.6 => /usr/lib64/libm.so.6 (0x00007f6e8b421000) > > libdl.so.2 => /usr/lib64/libdl.so.2 (0x00007f6e8b21d000) > > libmpi_usempif08.so.40 => > > /cluster/software/OpenMPI/4.1.1-GCC-11.2.0/lib/libmpi_usempif08.so.40 > > (0x00007f6e8fd92000) > > libmpi_usempi_ignore_tkr.so.40 => > > > /cluster/software/OpenMPI/4.1.1-GCC-11.2.0/lib/libmpi_usempi_ignore_tkr.so.40 > > (0x00007f6e8fd84000) > > libmpi_mpifh.so.40 => > > /cluster/software/OpenMPI/4.1.1-GCC-11.2.0/lib/libmpi_mpifh.so.40 > > (0x00007f6e8fd0c000) > > libmpi.so.40 => > /cluster/software/OpenMPI/4.1.1-GCC-11.2.0/lib/libmpi.so.40 > > (0x00007f6e8fbfa000) > > libgfortran.so.5 => > /cluster/software/GCCcore/11.2.0/lib64/libgfortran.so.5 > > (0x00007f6e8af70000) > > libgcc_s.so.1 => /cluster/software/GCCcore/11.2.0/lib64/libgcc_s.so.1 > > (0x00007f6e8fbe0000) > > libquadmath.so.0 => > /cluster/software/GCCcore/11.2.0/lib64/libquadmath.so.0 > > (0x00007f6e8af28000) > > libc.so.6 => /usr/lib64/libc.so.6 (0x00007f6e8ab5a000) > > /lib64/ld-linux-x86-64.so.2 (0x00007f6e8fbb3000) > > libopen-rte.so.40 => > > /cluster/software/OpenMPI/4.1.1-GCC-11.2.0/lib/libopen-rte.so.40 > > (0x00007f6e8aa9e000) > > libopen-orted-mpir.so => > > /cluster/software/OpenMPI/4.1.1-GCC-11.2.0/lib/libopen-orted-mpir.so > > (0x00007f6e8fbdb000) > > libopen-pal.so.40 => > > /cluster/software/OpenMPI/4.1.1-GCC-11.2.0/lib/libopen-pal.so.40 > > (0x00007f6e8a9ea000) > > librt.so.1 => /lib64/librt.so.1 (0x00007f6e8a7d5000) > > libutil.so.1 => /lib64/libutil.so.1 (0x00007f6e8a5d2000) > > libhwloc.so.15 => > > /cluster/software/hwloc/2.5.0-GCCcore-11.2.0/lib/libhwloc.so.15 > > (0x00007f6e8a575000) > > 
libpciaccess.so.0 => > > /cluster/software/libpciaccess/0.16-GCCcore-11.2.0/lib/libpciaccess.so.0 > > (0x00007f6e8a56a000) > > libxml2.so.2 => > > /cluster/software/libxml2/2.9.10-GCCcore-11.2.0/lib/libxml2.so.2 > > (0x00007f6e8a3f6000) > > libz.so.1 => /cluster/software/zlib/1.2.11-GCCcore-11.2.0/lib/libz.so.1 > > (0x00007f6e8a3dd000) > > liblzma.so.5 => > /cluster/software/XZ/5.2.5-GCCcore-11.2.0/lib/liblzma.so.5 > > (0x00007f6e8a3b5000) > > libevent_core-2.0.so.5 => /lib64/libevent_core-2.0.so.5 > (0x00007f6e8a18a000) > > libevent_pthreads-2.0.so.5 => /lib64/libevent_pthreads-2.0.so.5 > > (0x00007f6e89f87000) > > > > Thanks very much, > > Jianbo > > > > On Mon, Nov 7, 2022 at 6:01 PM Satish Balay wrote: > > > > > Glad you have it working. Thanks for the update. > > > > > > Satish > > > > > > On Mon, 7 Nov 2022, Jianbo Long wrote: > > > > > > > Hi Satish and Barry, > > > > > > > > Thanks very much for the feedback ! > > > > > > > > It looks like my include file path was not correct ! > > > > > > > > Bests, > > > > Jianbo > > > > > > > > > > > > On Fri, Nov 4, 2022 at 6:08 AM Satish Balay > wrote: > > > > > > > > > For ex83f.F90: > > > > > > > > > > >>>>> > > > > > balay at p1 /home/balay/test > > > > > $ ls > > > > > ex83f.F90 > > > > > balay at p1 /home/balay/test > > > > > $ ls > > > > > ex83f.F90 > > > > > balay at p1 /home/balay/test > > > > > $ export PETSC_DIR=$HOME/petsc > > > > > balay at p1 /home/balay/test > > > > > $ cp $PETSC_DIR/src/ksp/ksp/tests/makefile . 
> > > > > balay at p1 /home/balay/test > > > > > $ make ex83f > > > > > mpif90 -fPIC -Wall -ffree-line-length-none -ffree-line-length-0 > > > > > -Wno-lto-type-mismatch -Wno-unused-dummy-argument -g -O0 > > > > > -I/home/balay/petsc/include > > > > > -I/home/balay/petsc/arch-linux-c-debug/include ex83f.F90 > > > > > -Wl,-rpath,/home/balay/petsc/arch-linux-c-debug/lib > > > > > -L/home/balay/petsc/arch-linux-c-debug/lib > > > > > -Wl,-rpath,/home/balay/soft/mpich-4.0.1/lib > > > > > -L/home/balay/soft/mpich-4.0.1/lib > > > > > -Wl,-rpath,/usr/lib/gcc/x86_64-redhat-linux/12 > > > > > -L/usr/lib/gcc/x86_64-redhat-linux/12 -lpetsc -llapack -lblas -lm > -lX11 > > > > > -lstdc++ -ldl -lmpifort -lmpi -lgfortran -lm -lgfortran -lm -lgcc_s > > > > > -lquadmath -lstdc++ -ldl -o ex83f > > > > > balay at p1 /home/balay/test > > > > > $ > > > > > <<<<<< > > > > > > > > > > Also when you are adding PETSc to your current project - are you > using > > > > > source files with .f or .f90 suffix? If so rename them to .F or > .F90 > > > suffix. > > > > > > > > > > If you still have issues send more details - As Barry indicated - > the > > > > > makefile [with the sources compiled by this makefile] - and the > > > compile log > > > > > when you attempt to build these sources with this makefile. > > > > > > > > > > Satish > > > > > > > > > > On Thu, 3 Nov 2022, Barry Smith wrote: > > > > > > > > > > > > > > > > > Please send your attempted makefile and we'll see if we can get > it > > > > > working. > > > > > > > > > > > > I am not sure if we can organize the include files as Fortran > > > compiler > > > > > include files easily. We've always used the preprocessor approach. 
> The > > > > > Intel compiler docs indicate the procedure for finding the Fortran > > > compiler > > > > > include files > > > > > > > > > https://www.intel.com/content/www/us/en/develop/documentation/fortran-compiler-oneapi-dev-guide-and-reference/top/program-structure/use-include-files.html > > > > > is the same as for the preprocessor include files so I don't > > > understand how > > > > > using the Fortran compiler include file approach would make the > > > > > makefiles any simpler for users? > > > > > > > > > > > > > > > > > > Barry > > > > > > > > > > > > > > > > > > > On Nov 3, 2022, at 8:58 PM, Jianbo Long < > longtuteng249 at gmail.com> > > > > > wrote: > > > > > > > > > > > > > > Hello, > > > > > > > > > > > > > > I'm struggling to make my FORTRAN code work with petsc as I > cannot > > > > > link the required header files (e.g., petscksp.h) and compiled > library > > > > > files to my FORTRAN code. > > > > > > > > > > > > > > Compiling petsc was not a problem. However, even with the > fortran > > > > > examples (see those on https://petsc.org/main/docs/manual/fortran/ > ) > > > and > > > > > the guide on using petsc in c++ and fortran codes (see Section > "Writing > > > > > C/C++ or Fortran Applications" at > > > > > https://petsc.org/main/docs/manual/getting_started/), I still > cannot > > > make > > > > > my FORTRAN code work. > > > > > > > > > > > > > > The Fortran test code is exactly the example code ex83f.F90 > (see > > > > > attached files). After following the 2nd method in the Guide > (see > > > the > > > > > picture below), I still get errors: > > > > > > > > > > > > > > petsc/finclude/petscksp.h: No such file or directory > > > > > > > > > > > > > > Even if I set up the path of the header file correctly in my > own > > > > > makefile without using environment variables, I still can only > find the > > > > > file "petscksp.h" for my code. 
Of course, the trouble is that all > other > > > > > headers files required by KSP are recursively included in this > > > petscksp.h > > > > > file, and I have no way to link them together for my Fortran code. > > > > > > > > > > > > > > So, here are my questions: > > > > > > > 1) in the Guide, how exactly are we supposed to set up the > > > environment > > > > > variables PETSC_DIR and PETSC_ARCH ? More details and examples > would > > > be > > > > > extremely helpful ! > > > > > > > 2) Is there a way to get rid of the preprocessor statement > > > > > > > #include > > > > > > > when using c++/Fortran codes ? > > > > > > > > > > > > > > For example, when using MUMPS package in a Fortran code, we can > > > simply > > > > > use compiler 'include', rather than a preprocessor, to extract all > > > required > > > > > variables for the user's codes : > > > > > > > INCLUDE 'zmumps_struc.h' > > > > > > > where the header file zmumps_struc.h is already provided in the > > > > > package. Similarly, I think it's much more portable and easier when > > > using > > > > > petsc in other codes if we can make it work to use petsc. > > > > > > > > > > > > > > (Note: similar issues were discussed before, see > > > > > > > > > https://lists.mcs.anl.gov/mailman/htdig/petsc-users/2019-January/037499.html > > > . > > > > > Unfortunately, I have no clue about the solution archived there > ...) > > > > > > > > > > > > > > Any thoughts and solutions would be much appreciated ! > > > > > > > > > > > > > > Thanks, > > > > > > > Jianbo Long > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From balay at mcs.anl.gov Tue Nov 8 09:41:03 2022 From: balay at mcs.anl.gov (Satish Balay) Date: Tue, 8 Nov 2022 09:41:03 -0600 (CST) Subject: [petsc-users] [petsc-maint] Issues linking petsc header files and lib from FORTRAN codes In-Reply-To: References: <38802c8b-fca9-a502-57d1-7d52062662ea@mcs.anl.gov> <68428260-9036-a81b-8a79-b6daf14667c0@mcs.anl.gov> Message-ID: <2f569ec1-97e9-5858-676a-fec3d768268e@mcs.anl.gov> You don't see 'libstdc++' in the output from 'ldd libpetsc.so' below - so there is no reference to libstdc++ from petsc. Try a clean build of PETSc and see if you still have these issues: ./configure --with-cc=gcc --with-cxx=0 --with-fc=gfortran --download-fblaslapack --download-mpich Another way to avoid this issue is to use /usr/bin/gcc, gfortran - i.e. avoid using tools from /cluster/software/GCCcore. Are they super old versions - that are not suitable? Satish On Tue, 8 Nov 2022, Jianbo Long wrote: > I am suspecting something else as well ... > > Could you elaborate more about "mixing c++ codes compiled with /usr/bin/g++ > and compilers in /cluster/software/GCCcore/11.2.0" ? My own Fortran code > does not have any c++ codes, and for some reason, the compiled petsc > library is dependent on this libstdc++.so.6. I am sure about this because > without linking the petsc, I don't have this libstdc++ trouble. 
> > Thanks, > Jianbo > > On Mon, Nov 7, 2022 at 7:10 PM Satish Balay wrote: > > > Likely due to mixing c++ codes compiled with /usr/bin/g++ and compilers in > > /cluster/software/GCCcore/11.2.0 > > > > if you still get this with --with-cxx=0 - then the issue with some other > > [non-petsc library] > > > > Satish > > > > On Mon, 7 Nov 2022, Jianbo Long wrote: > > > > > Hi Satish, > > > > > > I wonder if you know anything about another issue: after compiling petsc > > on > > > a cluster, when I tried to link my Fortran code with compiled > > libpetsc.so, > > > the shared library, I got the following errors: > > > /cluster/software/binutils/2.37-GCCcore-11.2.0/bin/ld.gold: > > > /lib64/libstdc++.so.6: version `CXXABI_1.3.9' not found (required by > > > /cluster/software/binutils/2.37-GCCcore-11.2.0/bin/ld.gold) > > > /cluster/software/binutils/2.37-GCCcore-11.2.0/bin/ld.gold: > > > /lib64/libstdc++.so.6: version `GLIBCXX_3.4.21' not found (required by > > > /cluster/software/binutils/2.37-GCCcore-11.2.0/bin/ld.gold) > > > /cluster/software/binutils/2.37-GCCcore-11.2.0/bin/ld.gold: > > > /lib64/libstdc++.so.6: version `GLIBCXX_3.4.29' not found (required by > > > /cluster/software/binutils/2.37-GCCcore-11.2.0/bin/ld.gold) > > > /cluster/software/binutils/2.37-GCCcore-11.2.0/bin/ld.gold: > > > /lib64/libstdc++.so.6: version `GLIBCXX_3.4.20' not found (required by > > > /cluster/software/binutils/2.37-GCCcore-11.2.0/bin/ld.gold) > > > /cluster/software/binutils/2.37-GCCcore-11.2.0/bin/ld.gold: > > > /lib64/libstdc++.so.6: version `CXXABI_1.3.8' not found (required by > > > /cluster/software/binutils/2.37-GCCcore-11.2.0/bin/ld.gold) > > > > > > Not sure if it is related to discussion in this post ( > > > https://gitlab.com/petsc/petsc/-/issues/997), but after I tried the > > > configure option --with-cxx=0, I still got the same errors. > > > My make.log file for compiling petsc is attached here. 
Also, the > > > dependencies of the compiled petsc is: > > > > > > >>: ldd arch-linux-c-debug/lib/libpetsc.so > > > linux-vdso.so.1 => (0x00007ffd80348000) > > > libflexiblas.so.3 => > > > /cluster/software/FlexiBLAS/3.0.4-GCC-11.2.0/lib/libflexiblas.so.3 > > > (0x00007f6e8b93f000) > > > libpthread.so.0 => /usr/lib64/libpthread.so.0 (0x00007f6e8b723000) > > > libm.so.6 => /usr/lib64/libm.so.6 (0x00007f6e8b421000) > > > libdl.so.2 => /usr/lib64/libdl.so.2 (0x00007f6e8b21d000) > > > libmpi_usempif08.so.40 => > > > /cluster/software/OpenMPI/4.1.1-GCC-11.2.0/lib/libmpi_usempif08.so.40 > > > (0x00007f6e8fd92000) > > > libmpi_usempi_ignore_tkr.so.40 => > > > > > /cluster/software/OpenMPI/4.1.1-GCC-11.2.0/lib/libmpi_usempi_ignore_tkr.so.40 > > > (0x00007f6e8fd84000) > > > libmpi_mpifh.so.40 => > > > /cluster/software/OpenMPI/4.1.1-GCC-11.2.0/lib/libmpi_mpifh.so.40 > > > (0x00007f6e8fd0c000) > > > libmpi.so.40 => > > /cluster/software/OpenMPI/4.1.1-GCC-11.2.0/lib/libmpi.so.40 > > > (0x00007f6e8fbfa000) > > > libgfortran.so.5 => > > /cluster/software/GCCcore/11.2.0/lib64/libgfortran.so.5 > > > (0x00007f6e8af70000) > > > libgcc_s.so.1 => /cluster/software/GCCcore/11.2.0/lib64/libgcc_s.so.1 > > > (0x00007f6e8fbe0000) > > > libquadmath.so.0 => > > /cluster/software/GCCcore/11.2.0/lib64/libquadmath.so.0 > > > (0x00007f6e8af28000) > > > libc.so.6 => /usr/lib64/libc.so.6 (0x00007f6e8ab5a000) > > > /lib64/ld-linux-x86-64.so.2 (0x00007f6e8fbb3000) > > > libopen-rte.so.40 => > > > /cluster/software/OpenMPI/4.1.1-GCC-11.2.0/lib/libopen-rte.so.40 > > > (0x00007f6e8aa9e000) > > > libopen-orted-mpir.so => > > > /cluster/software/OpenMPI/4.1.1-GCC-11.2.0/lib/libopen-orted-mpir.so > > > (0x00007f6e8fbdb000) > > > libopen-pal.so.40 => > > > /cluster/software/OpenMPI/4.1.1-GCC-11.2.0/lib/libopen-pal.so.40 > > > (0x00007f6e8a9ea000) > > > librt.so.1 => /lib64/librt.so.1 (0x00007f6e8a7d5000) > > > libutil.so.1 => /lib64/libutil.so.1 (0x00007f6e8a5d2000) > > > libhwloc.so.15 => > > > 
/cluster/software/hwloc/2.5.0-GCCcore-11.2.0/lib/libhwloc.so.15 > > > (0x00007f6e8a575000) > > > libpciaccess.so.0 => > > > /cluster/software/libpciaccess/0.16-GCCcore-11.2.0/lib/libpciaccess.so.0 > > > (0x00007f6e8a56a000) > > > libxml2.so.2 => > > > /cluster/software/libxml2/2.9.10-GCCcore-11.2.0/lib/libxml2.so.2 > > > (0x00007f6e8a3f6000) > > > libz.so.1 => /cluster/software/zlib/1.2.11-GCCcore-11.2.0/lib/libz.so.1 > > > (0x00007f6e8a3dd000) > > > liblzma.so.5 => > > /cluster/software/XZ/5.2.5-GCCcore-11.2.0/lib/liblzma.so.5 > > > (0x00007f6e8a3b5000) > > > libevent_core-2.0.so.5 => /lib64/libevent_core-2.0.so.5 > > (0x00007f6e8a18a000) > > > libevent_pthreads-2.0.so.5 => /lib64/libevent_pthreads-2.0.so.5 > > > (0x00007f6e89f87000) > > > > > > Thanks very much, > > > Jianbo > > > > > > On Mon, Nov 7, 2022 at 6:01 PM Satish Balay wrote: > > > > > > > Glad you have it working. Thanks for the update. > > > > > > > > Satish > > > > > > > > On Mon, 7 Nov 2022, Jianbo Long wrote: > > > > > > > > > Hi Satish and Barry, > > > > > > > > > > Thanks very much for the feedback ! > > > > > > > > > > It looks like my include file path was not correct ! > > > > > > > > > > Bests, > > > > > Jianbo > > > > > > > > > > > > > > > On Fri, Nov 4, 2022 at 6:08 AM Satish Balay > > wrote: > > > > > > > > > > > For ex83f.F90: > > > > > > > > > > > > >>>>> > > > > > > balay at p1 /home/balay/test > > > > > > $ ls > > > > > > ex83f.F90 > > > > > > balay at p1 /home/balay/test > > > > > > $ ls > > > > > > ex83f.F90 > > > > > > balay at p1 /home/balay/test > > > > > > $ export PETSC_DIR=$HOME/petsc > > > > > > balay at p1 /home/balay/test > > > > > > $ cp $PETSC_DIR/src/ksp/ksp/tests/makefile . 
> > > > > > balay at p1 /home/balay/test > > > > > > $ make ex83f > > > > > > mpif90 -fPIC -Wall -ffree-line-length-none -ffree-line-length-0 > > > > > > -Wno-lto-type-mismatch -Wno-unused-dummy-argument -g -O0 > > > > > > -I/home/balay/petsc/include > > > > > > -I/home/balay/petsc/arch-linux-c-debug/include ex83f.F90 > > > > > > -Wl,-rpath,/home/balay/petsc/arch-linux-c-debug/lib > > > > > > -L/home/balay/petsc/arch-linux-c-debug/lib > > > > > > -Wl,-rpath,/home/balay/soft/mpich-4.0.1/lib > > > > > > -L/home/balay/soft/mpich-4.0.1/lib > > > > > > -Wl,-rpath,/usr/lib/gcc/x86_64-redhat-linux/12 > > > > > > -L/usr/lib/gcc/x86_64-redhat-linux/12 -lpetsc -llapack -lblas -lm > > -lX11 > > > > > > -lstdc++ -ldl -lmpifort -lmpi -lgfortran -lm -lgfortran -lm -lgcc_s > > > > > > -lquadmath -lstdc++ -ldl -o ex83f > > > > > > balay at p1 /home/balay/test > > > > > > $ > > > > > > <<<<<< > > > > > > > > > > > > Also when you are adding PETSc to your current project - are you > > using > > > > > > source files with .f or .f90 suffix? If so rename them to .F or > > .F90 > > > > suffix. > > > > > > > > > > > > If you still have issues send more details - As Barry indicated - > > the > > > > > > makefile [with the sources compiled by this makefile] - and the > > > > compile log > > > > > > when you attempt to build these sources with this makefile. > > > > > > > > > > > > Satish > > > > > > > > > > > > On Thu, 3 Nov 2022, Barry Smith wrote: > > > > > > > > > > > > > > > > > > > > Please send your attempted makefile and we'll see if we can get > > it > > > > > > working. > > > > > > > > > > > > > > I am not sure if we can organize the include files as Fortran > > > > compiler > > > > > > include files easily. We've always used the preprocessor approach. 
> > The > > > > > > Intel compiler docs indicate the procedure for finding the Fortran > > > > compiler > > > > > > include files > > > > > > > > > > > > https://www.intel.com/content/www/us/en/develop/documentation/fortran-compiler-oneapi-dev-guide-and-reference/top/program-structure/use-include-files.html > > > > > > is the same as for the preprocessor include files so I don't > > > > understand how > > > > > > the using the Fortran compiler include file approach would make the > > > > > > makefiles any simpler for users? > > > > > > > > > > > > > > > > > > > > > Barry > > > > > > > > > > > > > > > > > > > > > > On Nov 3, 2022, at 8:58 PM, Jianbo Long < > > longtuteng249 at gmail.com> > > > > > > wrote: > > > > > > > > > > > > > > > > Hello, > > > > > > > > > > > > > > > > I'm struggling to make my FORTRAN code work with petsc as I > > cannot > > > > > > link the required header files (e.g., petscksp.h) and compiled > > library > > > > > > files to my FORTRAN code. > > > > > > > > > > > > > > > > Compiling petsc was not a problem. However, even with the > > fortran > > > > > > examples (see those on https://petsc.org/main/docs/manual/fortran/ > > ) > > > > and > > > > > > the guide on using petsc in c++ and fortran codes (see Section > > "Writing > > > > > > C/C++ or Fortran Applications" at > > > > > > https://petsc.org/main/docs/manual/getting_started/), I still > > cannot > > > > make > > > > > > my FORTRAN code work. > > > > > > > > > > > > > > > > The Fortran test code is exactly the example code ex83f.F90 > > (see > > > > > > attached files). 
After following the 2nd method in the Guide > > (see > > > > the > > > > > > picture below), I still get errors: > > > > > > > > > > > > > > > > petsc/finclude/petscksp.h: No such file or directory > > > > > > > > > > > > > > > > Even if I set up the path of the header file correctly in my > > own > > > > > > makefile without using environment variables, I still can only > > find the > > > > > > file "petscksp.h" for my code. Of course, the trouble is that all > > other > > > > > > header files required by KSP are recursively included in this > > > > petscksp.h > > > > > > file, and I have no way to link them together for my Fortran code. > > > > > > > > > > > > > > > > So, here are my questions: > > > > > > > > 1) in the Guide, how exactly are we supposed to set up the > > > > environment > > > > > > variables PETSC_DIR and PETSC_ARCH ? More details and examples > > would > > > > be > > > > > > extremely helpful ! > > > > > > > > 2) Is there a way to get rid of the preprocessor statement > > > > > > > > #include > > > > > > > > when using c++/Fortran codes ? > > > > > > > > > > > > > > > > For example, when using MUMPS package in a Fortran code, we can > > > > simply > > > > > > use compiler 'include', rather than a preprocessor, to extract all > > > > required > > > > > > variables for the user's codes : > > > > > > > > INCLUDE 'zmumps_struc.h' > > > > > > > > where the header file zmumps_struc.h is already provided in the > > > > > > package. Similarly, I think it's much more portable and easier when > > > > using > > > > > > petsc in other codes if we can make it work to use petsc. > > > > > > > > > > > > > > > > (Note: similar issues were discussed before, see > > > > > > > > > > > > https://lists.mcs.anl.gov/mailman/htdig/petsc-users/2019-January/037499.html > > > > . > > > > > > Unfortunately, I have no clue about the solution archived there > > ...) > > > > > > > > > > > > > > > > Any thoughts and solutions would be much appreciated ! 
> > > > > > > > > > > > > > > > Thanks, > > > > > > > > Jianbo Long > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > From knepley at gmail.com Tue Nov 8 09:41:49 2022 From: knepley at gmail.com (Matthew Knepley) Date: Tue, 8 Nov 2022 10:41:49 -0500 Subject: [petsc-users] [petsc-maint] Issues linking petsc header files and lib from FORTRAN codes In-Reply-To: References: <38802c8b-fca9-a502-57d1-7d52062662ea@mcs.anl.gov> <68428260-9036-a81b-8a79-b6daf14667c0@mcs.anl.gov> Message-ID: On Tue, Nov 8, 2022 at 10:28 AM Jianbo Long wrote: > I am suspecting something else as well ... > > Could you elaborate more about "mixing c++ codes compiled with > /usr/bin/g++ and compilers in /cluster/software/GCCcore/11.2.0" ? My own > Fortran code does not have any c++ codes, and for some reason, the compiled > petsc library is dependent on this libstdc++.so.6. I am sure about this > because without linking the petsc, I don't have this libstdc++ trouble. > Are you sure it is not MPI that is bringing in C++? With --with-cxx=0, there should be no C++ in PETSc. However, we can test this. 
Can you ldd ${PETSC_ARCH}/lib/libpetsc.so Thanks, Matt > Thanks, > Jianbo > > On Mon, Nov 7, 2022 at 7:10 PM Satish Balay wrote: > >> Likely due to mixing c++ codes compiled with /usr/bin/g++ and compilers >> in /cluster/software/GCCcore/11.2.0 >> >> if you still get this with --with-cxx=0 - then the issue with some other >> [non-petsc library] >> >> Satish >> >> On Mon, 7 Nov 2022, Jianbo Long wrote: >> >> > Hi Satish, >> > >> > I wonder if you know anything about another issue: after compiling >> petsc on >> > a cluster, when I tried to link my Fortran code with compiled >> libpetsc.so, >> > the shared library, I got the following errors: >> > /cluster/software/binutils/2.37-GCCcore-11.2.0/bin/ld.gold: >> > /lib64/libstdc++.so.6: version `CXXABI_1.3.9' not found (required by >> > /cluster/software/binutils/2.37-GCCcore-11.2.0/bin/ld.gold) >> > /cluster/software/binutils/2.37-GCCcore-11.2.0/bin/ld.gold: >> > /lib64/libstdc++.so.6: version `GLIBCXX_3.4.21' not found (required by >> > /cluster/software/binutils/2.37-GCCcore-11.2.0/bin/ld.gold) >> > /cluster/software/binutils/2.37-GCCcore-11.2.0/bin/ld.gold: >> > /lib64/libstdc++.so.6: version `GLIBCXX_3.4.29' not found (required by >> > /cluster/software/binutils/2.37-GCCcore-11.2.0/bin/ld.gold) >> > /cluster/software/binutils/2.37-GCCcore-11.2.0/bin/ld.gold: >> > /lib64/libstdc++.so.6: version `GLIBCXX_3.4.20' not found (required by >> > /cluster/software/binutils/2.37-GCCcore-11.2.0/bin/ld.gold) >> > /cluster/software/binutils/2.37-GCCcore-11.2.0/bin/ld.gold: >> > /lib64/libstdc++.so.6: version `CXXABI_1.3.8' not found (required by >> > /cluster/software/binutils/2.37-GCCcore-11.2.0/bin/ld.gold) >> > >> > Not sure if it is related to discussion in this post ( >> > https://gitlab.com/petsc/petsc/-/issues/997), but after I tried the >> > configure option --with-cxx=0, I still got the same errors. >> > My make.log file for compiling petsc is attached here. 
Also, the >> > dependencies of the compiled petsc is: >> > >> > >>: ldd arch-linux-c-debug/lib/libpetsc.so >> > linux-vdso.so.1 => (0x00007ffd80348000) >> > libflexiblas.so.3 => >> > /cluster/software/FlexiBLAS/3.0.4-GCC-11.2.0/lib/libflexiblas.so.3 >> > (0x00007f6e8b93f000) >> > libpthread.so.0 => /usr/lib64/libpthread.so.0 (0x00007f6e8b723000) >> > libm.so.6 => /usr/lib64/libm.so.6 (0x00007f6e8b421000) >> > libdl.so.2 => /usr/lib64/libdl.so.2 (0x00007f6e8b21d000) >> > libmpi_usempif08.so.40 => >> > /cluster/software/OpenMPI/4.1.1-GCC-11.2.0/lib/libmpi_usempif08.so.40 >> > (0x00007f6e8fd92000) >> > libmpi_usempi_ignore_tkr.so.40 => >> > >> /cluster/software/OpenMPI/4.1.1-GCC-11.2.0/lib/libmpi_usempi_ignore_tkr.so.40 >> > (0x00007f6e8fd84000) >> > libmpi_mpifh.so.40 => >> > /cluster/software/OpenMPI/4.1.1-GCC-11.2.0/lib/libmpi_mpifh.so.40 >> > (0x00007f6e8fd0c000) >> > libmpi.so.40 => >> /cluster/software/OpenMPI/4.1.1-GCC-11.2.0/lib/libmpi.so.40 >> > (0x00007f6e8fbfa000) >> > libgfortran.so.5 => >> /cluster/software/GCCcore/11.2.0/lib64/libgfortran.so.5 >> > (0x00007f6e8af70000) >> > libgcc_s.so.1 => /cluster/software/GCCcore/11.2.0/lib64/libgcc_s.so.1 >> > (0x00007f6e8fbe0000) >> > libquadmath.so.0 => >> /cluster/software/GCCcore/11.2.0/lib64/libquadmath.so.0 >> > (0x00007f6e8af28000) >> > libc.so.6 => /usr/lib64/libc.so.6 (0x00007f6e8ab5a000) >> > /lib64/ld-linux-x86-64.so.2 (0x00007f6e8fbb3000) >> > libopen-rte.so.40 => >> > /cluster/software/OpenMPI/4.1.1-GCC-11.2.0/lib/libopen-rte.so.40 >> > (0x00007f6e8aa9e000) >> > libopen-orted-mpir.so => >> > /cluster/software/OpenMPI/4.1.1-GCC-11.2.0/lib/libopen-orted-mpir.so >> > (0x00007f6e8fbdb000) >> > libopen-pal.so.40 => >> > /cluster/software/OpenMPI/4.1.1-GCC-11.2.0/lib/libopen-pal.so.40 >> > (0x00007f6e8a9ea000) >> > librt.so.1 => /lib64/librt.so.1 (0x00007f6e8a7d5000) >> > libutil.so.1 => /lib64/libutil.so.1 (0x00007f6e8a5d2000) >> > libhwloc.so.15 => >> > 
/cluster/software/hwloc/2.5.0-GCCcore-11.2.0/lib/libhwloc.so.15 >> > (0x00007f6e8a575000) >> > libpciaccess.so.0 => >> > /cluster/software/libpciaccess/0.16-GCCcore-11.2.0/lib/libpciaccess.so.0 >> > (0x00007f6e8a56a000) >> > libxml2.so.2 => >> > /cluster/software/libxml2/2.9.10-GCCcore-11.2.0/lib/libxml2.so.2 >> > (0x00007f6e8a3f6000) >> > libz.so.1 => /cluster/software/zlib/1.2.11-GCCcore-11.2.0/lib/libz.so.1 >> > (0x00007f6e8a3dd000) >> > liblzma.so.5 => >> /cluster/software/XZ/5.2.5-GCCcore-11.2.0/lib/liblzma.so.5 >> > (0x00007f6e8a3b5000) >> > libevent_core-2.0.so.5 => /lib64/libevent_core-2.0.so.5 >> (0x00007f6e8a18a000) >> > libevent_pthreads-2.0.so.5 => /lib64/libevent_pthreads-2.0.so.5 >> > (0x00007f6e89f87000) >> > >> > Thanks very much, >> > Jianbo >> > >> > On Mon, Nov 7, 2022 at 6:01 PM Satish Balay wrote: >> > >> > > Glad you have it working. Thanks for the update. >> > > >> > > Satish >> > > >> > > On Mon, 7 Nov 2022, Jianbo Long wrote: >> > > >> > > > Hi Satish and Barry, >> > > > >> > > > Thanks very much for the feedback ! >> > > > >> > > > It looks like my include file path was not correct ! >> > > > >> > > > Bests, >> > > > Jianbo >> > > > >> > > > >> > > > On Fri, Nov 4, 2022 at 6:08 AM Satish Balay >> wrote: >> > > > >> > > > > For ex83f.F90: >> > > > > >> > > > > >>>>> >> > > > > balay at p1 /home/balay/test >> > > > > $ ls >> > > > > ex83f.F90 >> > > > > balay at p1 /home/balay/test >> > > > > $ ls >> > > > > ex83f.F90 >> > > > > balay at p1 /home/balay/test >> > > > > $ export PETSC_DIR=$HOME/petsc >> > > > > balay at p1 /home/balay/test >> > > > > $ cp $PETSC_DIR/src/ksp/ksp/tests/makefile . 
>> > > > > balay at p1 /home/balay/test >> > > > > $ make ex83f >> > > > > mpif90 -fPIC -Wall -ffree-line-length-none -ffree-line-length-0 >> > > > > -Wno-lto-type-mismatch -Wno-unused-dummy-argument -g -O0 >> > > > > -I/home/balay/petsc/include >> > > > > -I/home/balay/petsc/arch-linux-c-debug/include ex83f.F90 >> > > > > -Wl,-rpath,/home/balay/petsc/arch-linux-c-debug/lib >> > > > > -L/home/balay/petsc/arch-linux-c-debug/lib >> > > > > -Wl,-rpath,/home/balay/soft/mpich-4.0.1/lib >> > > > > -L/home/balay/soft/mpich-4.0.1/lib >> > > > > -Wl,-rpath,/usr/lib/gcc/x86_64-redhat-linux/12 >> > > > > -L/usr/lib/gcc/x86_64-redhat-linux/12 -lpetsc -llapack -lblas -lm >> -lX11 >> > > > > -lstdc++ -ldl -lmpifort -lmpi -lgfortran -lm -lgfortran -lm >> -lgcc_s >> > > > > -lquadmath -lstdc++ -ldl -o ex83f >> > > > > balay at p1 /home/balay/test >> > > > > $ >> > > > > <<<<<< >> > > > > >> > > > > Also when you are adding PETSc to your current project - are you >> using >> > > > > source files with .f or .f90 suffix? If so rename them to .F or >> .F90 >> > > suffix. >> > > > > >> > > > > If you still have issues send more details - As Barry indicated - >> the >> > > > > makefile [with the sources compiled by this makefile] - and the >> > > compile log >> > > > > when you attempt to build these sources with this makefile. >> > > > > >> > > > > Satish >> > > > > >> > > > > On Thu, 3 Nov 2022, Barry Smith wrote: >> > > > > >> > > > > > >> > > > > > Please send your attempted makefile and we'll see if we can >> get it >> > > > > working. >> > > > > > >> > > > > > I am not sure if we can organize the include files as Fortran >> > > compiler >> > > > > include files easily. We've always used the preprocessor >> approach. 
The >> > > > > Intel compiler docs indicate the procedure for finding the Fortran >> > > compiler >> > > > > include files >> > > > > >> > > >> https://www.intel.com/content/www/us/en/develop/documentation/fortran-compiler-oneapi-dev-guide-and-reference/top/program-structure/use-include-files.html >> > > > > is the same as for the preprocessor include files so I don't >> > > understand how >> > > > > the using the Fortran compiler include file approach would make >> the >> > > > > makefiles any simpler for users? >> > > > > > >> > > > > > >> > > > > > Barry >> > > > > > >> > > > > > >> > > > > > > On Nov 3, 2022, at 8:58 PM, Jianbo Long < >> longtuteng249 at gmail.com> >> > > > > wrote: >> > > > > > > >> > > > > > > Hello, >> > > > > > > >> > > > > > > I'm struggling to make my FORTRAN code work with petsc as I >> cannot >> > > > > link the required header files (e.g., petscksp.h) and compiled >> library >> > > > > files to my FORTRAN code. >> > > > > > > >> > > > > > > Compiling petsc was not a problem. However, even with the >> fortran >> > > > > examples (see those on >> https://petsc.org/main/docs/manual/fortran/) >> > > and >> > > > > the guide on using petsc in c++ and fortran codes (see Section >> "Writing >> > > > > C/C++ or Fortran Applications" at >> > > > > https://petsc.org/main/docs/manual/getting_started/), I still >> cannot >> > > make >> > > > > my FORTRAN code work. >> > > > > > > >> > > > > > > The Fortran test code is exactly the example code ex83f.F90 >> (see >> > > > > attached files). Aftering following the 2nd method in the Guide >> (see >> > > the >> > > > > picture below), I still get errors: >> > > > > > > >> > > > > > > petsc/finclude/petscksp.h: No such file or directory >> > > > > > > >> > > > > > > Even if I set up the path of the header file correctly in my >> own >> > > > > makefile without using environment variables, I still can only >> find the >> > > > > file "petscksp.h" for my code. 
Of course, the trouble is that all >> other >> > > > > headers files required by KSP are recursively included in this >> > > petscksp.h >> > > > > file, and I have no way to link them together for my Fortran code. >> > > > > > > >> > > > > > > So, here are my questions: >> > > > > > > 1) in the Guide, how exactly are we supposed to set up the >> > > environment >> > > > > variables PETSC_DIR and PETSC_ARCH ? More details and examples >> would >> > > be >> > > > > extremely helpful ! >> > > > > > > 2) Is there a way to get rid of the preprocessor statement >> > > > > > > #include >> > > > > > > when using c++/Fortran codes ? >> > > > > > > >> > > > > > > For example, when using MUMPS package in a Fortran code, we >> can >> > > simply >> > > > > use compiler 'include', rather than a preprocessor, to extract all >> > > required >> > > > > variables for the user's codes : >> > > > > > > INCLUDE 'zmumps_struc.h' >> > > > > > > where the header file zmumps_struc.h is already provided in >> the >> > > > > package. Similarly, I think it's much more portable and easier >> when >> > > using >> > > > > petsc in other codes if we can make it work to use petsc. >> > > > > > > >> > > > > > > (Note: similar issues were discussed before, see >> > > > > >> > > >> https://lists.mcs.anl.gov/mailman/htdig/petsc-users/2019-January/037499.html >> > > . >> > > > > Unfortunately, I have no clue about the solution archived there >> ...) >> > > > > > > >> > > > > > > Any thoughts and solutions would be much appreciated ! >> > > > > > > >> > > > > > > Thanks, >> > > > > > > Jianbo Long >> > > > > > > >> > > > > > > >> > > > > > > >> > > > > > >> > > > > > >> > > > > >> > > > > >> > > > >> > > >> > > >> > >> >> -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. 
-- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From balay at mcs.anl.gov Tue Nov 8 09:43:35 2022 From: balay at mcs.anl.gov (Satish Balay) Date: Tue, 8 Nov 2022 09:43:35 -0600 (CST) Subject: [petsc-users] [petsc-maint] Issues linking petsc header files and lib from FORTRAN codes In-Reply-To: <2f569ec1-97e9-5858-676a-fec3d768268e@mcs.anl.gov> References: <38802c8b-fca9-a502-57d1-7d52062662ea@mcs.anl.gov> <68428260-9036-a81b-8a79-b6daf14667c0@mcs.anl.gov> <2f569ec1-97e9-5858-676a-fec3d768268e@mcs.anl.gov> Message-ID: <9875541b-a9ba-8836-795b-2c35f2fa5aaf@mcs.anl.gov> On Tue, 8 Nov 2022, Satish Balay via petsc-users wrote: > You don't see 'libstdc++' in the output from 'ldd libptsc.so' below - so there is no reference > to libstdc++ from petsc > > Try a clean build of PETSc and see if you still have these issues. > > ./configure --with-cc=gcc --with-cxx=0 --with-fc=gfortran --download-fblaslapack --download-mpich Perhaps good to also add: --with-hwloc=0 Satish > > Another way to avoid this issue is to use /usr/bin/gcc, gfortran - i.e avoid using tools from /cluster/software/GCCcore > Are they super old versions - that are not suitable? > > Satish > > > > On Tue, 8 Nov 2022, Jianbo Long wrote: > > > I am suspecting something else as well ... > > > > Could you elaborate more about "mixing c++ codes compiled with /usr/bin/g++ > > and compilers in /cluster/software/GCCcore/11.2.0" ? My own Fortran code > > does not have any c++ codes, and for some reason, the compiled petsc > > library is dependent on this libstdc++.so.6. I am sure about this because > > without linking the petsc, I don't have this libstdc++ trouble. 
> > > > Thanks, > > Jianbo > > > > On Mon, Nov 7, 2022 at 7:10 PM Satish Balay wrote: > > > > > Likely due to mixing c++ codes compiled with /usr/bin/g++ and compilers in > > > /cluster/software/GCCcore/11.2.0 > > > > > > if you still get this with --with-cxx=0 - then the issue with some other > > > [non-petsc library] > > > > > > Satish > > > > > > On Mon, 7 Nov 2022, Jianbo Long wrote: > > > > > > > Hi Satish, > > > > > > > > I wonder if you know anything about another issue: after compiling petsc > > > on > > > > a cluster, when I tried to link my Fortran code with compiled > > > libpetsc.so, > > > > the shared library, I got the following errors: > > > > /cluster/software/binutils/2.37-GCCcore-11.2.0/bin/ld.gold: > > > > /lib64/libstdc++.so.6: version `CXXABI_1.3.9' not found (required by > > > > /cluster/software/binutils/2.37-GCCcore-11.2.0/bin/ld.gold) > > > > /cluster/software/binutils/2.37-GCCcore-11.2.0/bin/ld.gold: > > > > /lib64/libstdc++.so.6: version `GLIBCXX_3.4.21' not found (required by > > > > /cluster/software/binutils/2.37-GCCcore-11.2.0/bin/ld.gold) > > > > /cluster/software/binutils/2.37-GCCcore-11.2.0/bin/ld.gold: > > > > /lib64/libstdc++.so.6: version `GLIBCXX_3.4.29' not found (required by > > > > /cluster/software/binutils/2.37-GCCcore-11.2.0/bin/ld.gold) > > > > /cluster/software/binutils/2.37-GCCcore-11.2.0/bin/ld.gold: > > > > /lib64/libstdc++.so.6: version `GLIBCXX_3.4.20' not found (required by > > > > /cluster/software/binutils/2.37-GCCcore-11.2.0/bin/ld.gold) > > > > /cluster/software/binutils/2.37-GCCcore-11.2.0/bin/ld.gold: > > > > /lib64/libstdc++.so.6: version `CXXABI_1.3.8' not found (required by > > > > /cluster/software/binutils/2.37-GCCcore-11.2.0/bin/ld.gold) > > > > > > > > Not sure if it is related to discussion in this post ( > > > > https://gitlab.com/petsc/petsc/-/issues/997), but after I tried the > > > > configure option --with-cxx=0, I still got the same errors. 
> > > > My make.log file for compiling petsc is attached here. Also, the > > > > dependencies of the compiled petsc is: > > > > > > > > >>: ldd arch-linux-c-debug/lib/libpetsc.so > > > > linux-vdso.so.1 => (0x00007ffd80348000) > > > > libflexiblas.so.3 => > > > > /cluster/software/FlexiBLAS/3.0.4-GCC-11.2.0/lib/libflexiblas.so.3 > > > > (0x00007f6e8b93f000) > > > > libpthread.so.0 => /usr/lib64/libpthread.so.0 (0x00007f6e8b723000) > > > > libm.so.6 => /usr/lib64/libm.so.6 (0x00007f6e8b421000) > > > > libdl.so.2 => /usr/lib64/libdl.so.2 (0x00007f6e8b21d000) > > > > libmpi_usempif08.so.40 => > > > > /cluster/software/OpenMPI/4.1.1-GCC-11.2.0/lib/libmpi_usempif08.so.40 > > > > (0x00007f6e8fd92000) > > > > libmpi_usempi_ignore_tkr.so.40 => > > > > > > > /cluster/software/OpenMPI/4.1.1-GCC-11.2.0/lib/libmpi_usempi_ignore_tkr.so.40 > > > > (0x00007f6e8fd84000) > > > > libmpi_mpifh.so.40 => > > > > /cluster/software/OpenMPI/4.1.1-GCC-11.2.0/lib/libmpi_mpifh.so.40 > > > > (0x00007f6e8fd0c000) > > > > libmpi.so.40 => > > > /cluster/software/OpenMPI/4.1.1-GCC-11.2.0/lib/libmpi.so.40 > > > > (0x00007f6e8fbfa000) > > > > libgfortran.so.5 => > > > /cluster/software/GCCcore/11.2.0/lib64/libgfortran.so.5 > > > > (0x00007f6e8af70000) > > > > libgcc_s.so.1 => /cluster/software/GCCcore/11.2.0/lib64/libgcc_s.so.1 > > > > (0x00007f6e8fbe0000) > > > > libquadmath.so.0 => > > > /cluster/software/GCCcore/11.2.0/lib64/libquadmath.so.0 > > > > (0x00007f6e8af28000) > > > > libc.so.6 => /usr/lib64/libc.so.6 (0x00007f6e8ab5a000) > > > > /lib64/ld-linux-x86-64.so.2 (0x00007f6e8fbb3000) > > > > libopen-rte.so.40 => > > > > /cluster/software/OpenMPI/4.1.1-GCC-11.2.0/lib/libopen-rte.so.40 > > > > (0x00007f6e8aa9e000) > > > > libopen-orted-mpir.so => > > > > /cluster/software/OpenMPI/4.1.1-GCC-11.2.0/lib/libopen-orted-mpir.so > > > > (0x00007f6e8fbdb000) > > > > libopen-pal.so.40 => > > > > /cluster/software/OpenMPI/4.1.1-GCC-11.2.0/lib/libopen-pal.so.40 > > > > (0x00007f6e8a9ea000) > > > > 
librt.so.1 => /lib64/librt.so.1 (0x00007f6e8a7d5000) > > > > libutil.so.1 => /lib64/libutil.so.1 (0x00007f6e8a5d2000) > > > > libhwloc.so.15 => > > > > /cluster/software/hwloc/2.5.0-GCCcore-11.2.0/lib/libhwloc.so.15 > > > > (0x00007f6e8a575000) > > > > libpciaccess.so.0 => > > > > /cluster/software/libpciaccess/0.16-GCCcore-11.2.0/lib/libpciaccess.so.0 > > > > (0x00007f6e8a56a000) > > > > libxml2.so.2 => > > > > /cluster/software/libxml2/2.9.10-GCCcore-11.2.0/lib/libxml2.so.2 > > > > (0x00007f6e8a3f6000) > > > > libz.so.1 => /cluster/software/zlib/1.2.11-GCCcore-11.2.0/lib/libz.so.1 > > > > (0x00007f6e8a3dd000) > > > > liblzma.so.5 => > > > /cluster/software/XZ/5.2.5-GCCcore-11.2.0/lib/liblzma.so.5 > > > > (0x00007f6e8a3b5000) > > > > libevent_core-2.0.so.5 => /lib64/libevent_core-2.0.so.5 > > > (0x00007f6e8a18a000) > > > > libevent_pthreads-2.0.so.5 => /lib64/libevent_pthreads-2.0.so.5 > > > > (0x00007f6e89f87000) > > > > > > > > Thanks very much, > > > > Jianbo > > > > > > > > On Mon, Nov 7, 2022 at 6:01 PM Satish Balay wrote: > > > > > > > > > Glad you have it working. Thanks for the update. > > > > > > > > > > Satish > > > > > > > > > > On Mon, 7 Nov 2022, Jianbo Long wrote: > > > > > > > > > > > Hi Satish and Barry, > > > > > > > > > > > > Thanks very much for the feedback ! > > > > > > > > > > > > It looks like my include file path was not correct ! > > > > > > > > > > > > Bests, > > > > > > Jianbo > > > > > > > > > > > > > > > > > > On Fri, Nov 4, 2022 at 6:08 AM Satish Balay > > > wrote: > > > > > > > > > > > > > For ex83f.F90: > > > > > > > > > > > > > > >>>>> > > > > > > > balay at p1 /home/balay/test > > > > > > > $ ls > > > > > > > ex83f.F90 > > > > > > > balay at p1 /home/balay/test > > > > > > > $ ls > > > > > > > ex83f.F90 > > > > > > > balay at p1 /home/balay/test > > > > > > > $ export PETSC_DIR=$HOME/petsc > > > > > > > balay at p1 /home/balay/test > > > > > > > $ cp $PETSC_DIR/src/ksp/ksp/tests/makefile . 
> > > > > > > balay at p1 /home/balay/test > > > > > > > $ make ex83f > > > > > > > mpif90 -fPIC -Wall -ffree-line-length-none -ffree-line-length-0 > > > > > > > -Wno-lto-type-mismatch -Wno-unused-dummy-argument -g -O0 > > > > > > > -I/home/balay/petsc/include > > > > > > > -I/home/balay/petsc/arch-linux-c-debug/include ex83f.F90 > > > > > > > -Wl,-rpath,/home/balay/petsc/arch-linux-c-debug/lib > > > > > > > -L/home/balay/petsc/arch-linux-c-debug/lib > > > > > > > -Wl,-rpath,/home/balay/soft/mpich-4.0.1/lib > > > > > > > -L/home/balay/soft/mpich-4.0.1/lib > > > > > > > -Wl,-rpath,/usr/lib/gcc/x86_64-redhat-linux/12 > > > > > > > -L/usr/lib/gcc/x86_64-redhat-linux/12 -lpetsc -llapack -lblas -lm > > > -lX11 > > > > > > > -lstdc++ -ldl -lmpifort -lmpi -lgfortran -lm -lgfortran -lm -lgcc_s > > > > > > > -lquadmath -lstdc++ -ldl -o ex83f > > > > > > > balay at p1 /home/balay/test > > > > > > > $ > > > > > > > <<<<<< > > > > > > > > > > > > > > Also when you are adding PETSc to your current project - are you > > > using > > > > > > > source files with .f or .f90 suffix? If so rename them to .F or > > > .F90 > > > > > suffix. > > > > > > > > > > > > > > If you still have issues send more details - As Barry indicated - > > > the > > > > > > > makefile [with the sources compiled by this makefile] - and the > > > > > compile log > > > > > > > when you attempt to build these sources with this makefile. > > > > > > > > > > > > > > Satish > > > > > > > > > > > > > > On Thu, 3 Nov 2022, Barry Smith wrote: > > > > > > > > > > > > > > > > > > > > > > > Please send your attempted makefile and we'll see if we can get > > > it > > > > > > > working. > > > > > > > > > > > > > > > > I am not sure if we can organize the include files as Fortran > > > > > compiler > > > > > > > include files easily. We've always used the preprocessor approach. 
> > > The > > > > > > > Intel compiler docs indicate the procedure for finding the Fortran > > > > > compiler > > > > > > > include files > > > > > > > > > > > > > > > https://www.intel.com/content/www/us/en/develop/documentation/fortran-compiler-oneapi-dev-guide-and-reference/top/program-structure/use-include-files.html > > > > > > > is the same as for the preprocessor include files so I don't > > > > > understand how > > > > > > > the using the Fortran compiler include file approach would make the > > > > > > > makefiles any simpler for users? > > > > > > > > > > > > > > > > > > > > > > > > Barry > > > > > > > > > > > > > > > > > > > > > > > > > On Nov 3, 2022, at 8:58 PM, Jianbo Long < > > > longtuteng249 at gmail.com> > > > > > > > wrote: > > > > > > > > > > > > > > > > > > Hello, > > > > > > > > > > > > > > > > > > I'm struggling to make my FORTRAN code work with petsc as I > > > cannot > > > > > > > link the required header files (e.g., petscksp.h) and compiled > > > library > > > > > > > files to my FORTRAN code. > > > > > > > > > > > > > > > > > > Compiling petsc was not a problem. However, even with the > > > fortran > > > > > > > examples (see those on https://petsc.org/main/docs/manual/fortran/ > > > ) > > > > > and > > > > > > > the guide on using petsc in c++ and fortran codes (see Section > > > "Writing > > > > > > > C/C++ or Fortran Applications" at > > > > > > > https://petsc.org/main/docs/manual/getting_started/), I still > > > cannot > > > > > make > > > > > > > my FORTRAN code work. > > > > > > > > > > > > > > > > > > The Fortran test code is exactly the example code ex83f.F90 > > > (see > > > > > > > attached files). 
Aftering following the 2nd method in the Guide > > > (see > > > > > the > > > > > > > picture below), I still get errors: > > > > > > > > > > > > > > > > > > petsc/finclude/petscksp.h: No such file or directory > > > > > > > > > > > > > > > > > > Even if I set up the path of the header file correctly in my > > > own > > > > > > > makefile without using environment variables, I still can only > > > find the > > > > > > > file "petscksp.h" for my code. Of course, the trouble is that all > > > other > > > > > > > headers files required by KSP are recursively included in this > > > > > petscksp.h > > > > > > > file, and I have no way to link them together for my Fortran code. > > > > > > > > > > > > > > > > > > So, here are my questions: > > > > > > > > > 1) in the Guide, how exactly are we supposed to set up the > > > > > environment > > > > > > > variables PETSC_DIR and PETSC_ARCH ? More details and examples > > > would > > > > > be > > > > > > > extremely helpful ! > > > > > > > > > 2) Is there a way to get rid of the preprocessor statement > > > > > > > > > #include > > > > > > > > > when using c++/Fortran codes ? > > > > > > > > > > > > > > > > > > For example, when using MUMPS package in a Fortran code, we can > > > > > simply > > > > > > > use compiler 'include', rather than a preprocessor, to extract all > > > > > required > > > > > > > variables for the user's codes : > > > > > > > > > INCLUDE 'zmumps_struc.h' > > > > > > > > > where the header file zmumps_struc.h is already provided in the > > > > > > > package. Similarly, I think it's much more portable and easier when > > > > > using > > > > > > > petsc in other codes if we can make it work to use petsc. > > > > > > > > > > > > > > > > > > (Note: similar issues were discussed before, see > > > > > > > > > > > > > > > https://lists.mcs.anl.gov/mailman/htdig/petsc-users/2019-January/037499.html > > > > > . > > > > > > > Unfortunately, I have no clue about the solution archived there > > > ...) 
> > > > > > > > > > > > > > > > > > Any thoughts and solutions would be much appreciated ! > > > > > > > > > > > > > > > > > > Thanks, > > > > > > > > > Jianbo Long > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > From longtuteng249 at gmail.com Tue Nov 8 09:57:28 2022 From: longtuteng249 at gmail.com (Jianbo Long) Date: Tue, 8 Nov 2022 16:57:28 +0100 Subject: [petsc-users] [petsc-maint] Issues linking petsc header files and lib from FORTRAN codes In-Reply-To: <9875541b-a9ba-8836-795b-2c35f2fa5aaf@mcs.anl.gov> References: <38802c8b-fca9-a502-57d1-7d52062662ea@mcs.anl.gov> <68428260-9036-a81b-8a79-b6daf14667c0@mcs.anl.gov> <2f569ec1-97e9-5858-676a-fec3d768268e@mcs.anl.gov> <9875541b-a9ba-8836-795b-2c35f2fa5aaf@mcs.anl.gov> Message-ID: Here are the ldd outputs: >> ldd petsc_3.18_gnu/arch-linux-c-debug/lib/libpetsc.so linux-vdso.so.1 => (0x00007f23e5ff2000) libflexiblas.so.3 => /cluster/software/FlexiBLAS/3.0.4-GCC-11.2.0/lib/libflexiblas.so.3 (0x00007f23e1b60000) libpthread.so.0 => /usr/lib64/libpthread.so.0 (0x00007f23e1944000) libm.so.6 => /usr/lib64/libm.so.6 (0x00007f23e1642000) libdl.so.2 => /usr/lib64/libdl.so.2 (0x00007f23e143e000) libmpi_usempif08.so.40 => /cluster/software/OpenMPI/4.1.1-GCC-11.2.0/lib/libmpi_usempif08.so.40 (0x00007f23e5fb0000) libmpi_usempi_ignore_tkr.so.40 => /cluster/software/OpenMPI/4.1.1-GCC-11.2.0/lib/libmpi_usempi_ignore_tkr.so.40 (0x00007f23e5fa2000) libmpi_mpifh.so.40 => /cluster/software/OpenMPI/4.1.1-GCC-11.2.0/lib/libmpi_mpifh.so.40 (0x00007f23e5f2a000) libmpi.so.40 => /cluster/software/OpenMPI/4.1.1-GCC-11.2.0/lib/libmpi.so.40 (0x00007f23e5e18000) libgfortran.so.5 => /cluster/software/GCCcore/11.2.0/lib64/libgfortran.so.5 (0x00007f23e1191000) libgcc_s.so.1 => /cluster/software/GCCcore/11.2.0/lib64/libgcc_s.so.1 (0x00007f23e5dfe000) libquadmath.so.0 => /cluster/software/GCCcore/11.2.0/lib64/libquadmath.so.0 
(0x00007f23e1149000) libc.so.6 => /usr/lib64/libc.so.6 (0x00007f23e0d7b000) /lib64/ld-linux-x86-64.so.2 (0x00007f23e5dd3000) libopen-rte.so.40 => /cluster/software/OpenMPI/4.1.1-GCC-11.2.0/lib/libopen-rte.so.40 (0x00007f23e0cbf000) libopen-orted-mpir.so => /cluster/software/OpenMPI/4.1.1-GCC-11.2.0/lib/libopen-orted-mpir.so (0x00007f23e5df9000) libopen-pal.so.40 => /cluster/software/OpenMPI/4.1.1-GCC-11.2.0/lib/libopen-pal.so.40 (0x00007f23e0c0b000) librt.so.1 => /lib64/librt.so.1 (0x00007f23e09f6000) libutil.so.1 => /lib64/libutil.so.1 (0x00007f23e07f3000) libhwloc.so.15 => /cluster/software/hwloc/2.5.0-GCCcore-11.2.0/lib/libhwloc.so.15 (0x00007f23e0796000) libpciaccess.so.0 => /cluster/software/libpciaccess/0.16-GCCcore-11.2.0/lib/libpciaccess.so.0 (0x00007f23e078b000) libxml2.so.2 => /cluster/software/libxml2/2.9.10-GCCcore-11.2.0/lib/libxml2.so.2 (0x00007f23e0617000) libz.so.1 => /cluster/software/zlib/1.2.11-GCCcore-11.2.0/lib/libz.so.1 (0x00007f23e05fe000) liblzma.so.5 => /cluster/software/XZ/5.2.5-GCCcore-11.2.0/lib/liblzma.so.5 (0x00007f23e05d6000) libevent_core-2.0.so.5 => /lib64/libevent_core-2.0.so.5 (0x00007f23e03ab000) libevent_pthreads-2.0.so.5 => /lib64/libevent_pthreads-2.0.so.5 (0x00007f23e01a8000) And /cluster/software/GCCcore/11.2.0 is pretty recent (around 2020/2021). You can see that I am using openmpi. Now I am trying compiling petsc without MPI. On Tue, Nov 8, 2022 at 4:43 PM Satish Balay wrote: > On Tue, 8 Nov 2022, Satish Balay via petsc-users wrote: > > > You don't see 'libstdc++' in the output from 'ldd libptsc.so' below - so > there is no reference > > to libstdc++ from petsc > > > > Try a clean build of PETSc and see if you still have these issues. 
> > > > ./configure --with-cc=gcc --with-cxx=0 --with-fc=gfortran > --download-fblaslapack --download-mpich > > Perhaps good to also add: --with-hwloc=0 > > Satish > > > > > Another way to avoid this issue is to use /usr/bin/gcc, gfortran - i.e > avoid using tools from /cluster/software/GCCcore > > Are they super old versions - that are not suitable? > > > > Satish > > > > > > > > On Tue, 8 Nov 2022, Jianbo Long wrote: > > > > > I am suspecting something else as well ... > > > > > > Could you elaborate more about "mixing c++ codes compiled with > /usr/bin/g++ > > > and compilers in /cluster/software/GCCcore/11.2.0" ? My own Fortran > code > > > does not have any c++ codes, and for some reason, the compiled petsc > > > library is dependent on this libstdc++.so.6. I am sure about this > because > > > without linking the petsc, I don't have this libstdc++ trouble. > > > > > > Thanks, > > > Jianbo > > > > > > On Mon, Nov 7, 2022 at 7:10 PM Satish Balay wrote: > > > > > > > Likely due to mixing c++ codes compiled with /usr/bin/g++ and > compilers in > > > > /cluster/software/GCCcore/11.2.0 > > > > > > > > if you still get this with --with-cxx=0 - then the issue with some > other > > > > [non-petsc library] > > > > > > > > Satish > > > > > > > > On Mon, 7 Nov 2022, Jianbo Long wrote: > > > > > > > > > Hi Satish, > > > > > > > > > > I wonder if you know anything about another issue: after compiling > petsc > > > > on > > > > > a cluster, when I tried to link my Fortran code with compiled > > > > libpetsc.so, > > > > > the shared library, I got the following errors: > > > > > /cluster/software/binutils/2.37-GCCcore-11.2.0/bin/ld.gold: > > > > > /lib64/libstdc++.so.6: version `CXXABI_1.3.9' not found (required > by > > > > > /cluster/software/binutils/2.37-GCCcore-11.2.0/bin/ld.gold) > > > > > /cluster/software/binutils/2.37-GCCcore-11.2.0/bin/ld.gold: > > > > > /lib64/libstdc++.so.6: version `GLIBCXX_3.4.21' not found > (required by > > > > > 
/cluster/software/binutils/2.37-GCCcore-11.2.0/bin/ld.gold) > > > > > /cluster/software/binutils/2.37-GCCcore-11.2.0/bin/ld.gold: > > > > > /lib64/libstdc++.so.6: version `GLIBCXX_3.4.29' not found > (required by > > > > > /cluster/software/binutils/2.37-GCCcore-11.2.0/bin/ld.gold) > > > > > /cluster/software/binutils/2.37-GCCcore-11.2.0/bin/ld.gold: > > > > > /lib64/libstdc++.so.6: version `GLIBCXX_3.4.20' not found > (required by > > > > > /cluster/software/binutils/2.37-GCCcore-11.2.0/bin/ld.gold) > > > > > /cluster/software/binutils/2.37-GCCcore-11.2.0/bin/ld.gold: > > > > > /lib64/libstdc++.so.6: version `CXXABI_1.3.8' not found (required > by > > > > > /cluster/software/binutils/2.37-GCCcore-11.2.0/bin/ld.gold) > > > > > > > > > > Not sure if it is related to discussion in this post ( > > > > > https://gitlab.com/petsc/petsc/-/issues/997), but after I tried > the > > > > > configure option --with-cxx=0, I still got the same errors. > > > > > My make.log file for compiling petsc is attached here. 
Also, the > > > > > dependencies of the compiled petsc is: > > > > > > > > > > >>: ldd arch-linux-c-debug/lib/libpetsc.so > > > > > linux-vdso.so.1 => (0x00007ffd80348000) > > > > > libflexiblas.so.3 => > > > > > /cluster/software/FlexiBLAS/3.0.4-GCC-11.2.0/lib/libflexiblas.so.3 > > > > > (0x00007f6e8b93f000) > > > > > libpthread.so.0 => /usr/lib64/libpthread.so.0 (0x00007f6e8b723000) > > > > > libm.so.6 => /usr/lib64/libm.so.6 (0x00007f6e8b421000) > > > > > libdl.so.2 => /usr/lib64/libdl.so.2 (0x00007f6e8b21d000) > > > > > libmpi_usempif08.so.40 => > > > > > > /cluster/software/OpenMPI/4.1.1-GCC-11.2.0/lib/libmpi_usempif08.so.40 > > > > > (0x00007f6e8fd92000) > > > > > libmpi_usempi_ignore_tkr.so.40 => > > > > > > > > > > /cluster/software/OpenMPI/4.1.1-GCC-11.2.0/lib/libmpi_usempi_ignore_tkr.so.40 > > > > > (0x00007f6e8fd84000) > > > > > libmpi_mpifh.so.40 => > > > > > /cluster/software/OpenMPI/4.1.1-GCC-11.2.0/lib/libmpi_mpifh.so.40 > > > > > (0x00007f6e8fd0c000) > > > > > libmpi.so.40 => > > > > /cluster/software/OpenMPI/4.1.1-GCC-11.2.0/lib/libmpi.so.40 > > > > > (0x00007f6e8fbfa000) > > > > > libgfortran.so.5 => > > > > /cluster/software/GCCcore/11.2.0/lib64/libgfortran.so.5 > > > > > (0x00007f6e8af70000) > > > > > libgcc_s.so.1 => > /cluster/software/GCCcore/11.2.0/lib64/libgcc_s.so.1 > > > > > (0x00007f6e8fbe0000) > > > > > libquadmath.so.0 => > > > > /cluster/software/GCCcore/11.2.0/lib64/libquadmath.so.0 > > > > > (0x00007f6e8af28000) > > > > > libc.so.6 => /usr/lib64/libc.so.6 (0x00007f6e8ab5a000) > > > > > /lib64/ld-linux-x86-64.so.2 (0x00007f6e8fbb3000) > > > > > libopen-rte.so.40 => > > > > > /cluster/software/OpenMPI/4.1.1-GCC-11.2.0/lib/libopen-rte.so.40 > > > > > (0x00007f6e8aa9e000) > > > > > libopen-orted-mpir.so => > > > > > > /cluster/software/OpenMPI/4.1.1-GCC-11.2.0/lib/libopen-orted-mpir.so > > > > > (0x00007f6e8fbdb000) > > > > > libopen-pal.so.40 => > > > > > /cluster/software/OpenMPI/4.1.1-GCC-11.2.0/lib/libopen-pal.so.40 > > > > > 
(0x00007f6e8a9ea000) > > > > > librt.so.1 => /lib64/librt.so.1 (0x00007f6e8a7d5000) > > > > > libutil.so.1 => /lib64/libutil.so.1 (0x00007f6e8a5d2000) > > > > > libhwloc.so.15 => > > > > > /cluster/software/hwloc/2.5.0-GCCcore-11.2.0/lib/libhwloc.so.15 > > > > > (0x00007f6e8a575000) > > > > > libpciaccess.so.0 => > > > > > > /cluster/software/libpciaccess/0.16-GCCcore-11.2.0/lib/libpciaccess.so.0 > > > > > (0x00007f6e8a56a000) > > > > > libxml2.so.2 => > > > > > /cluster/software/libxml2/2.9.10-GCCcore-11.2.0/lib/libxml2.so.2 > > > > > (0x00007f6e8a3f6000) > > > > > libz.so.1 => > /cluster/software/zlib/1.2.11-GCCcore-11.2.0/lib/libz.so.1 > > > > > (0x00007f6e8a3dd000) > > > > > liblzma.so.5 => > > > > /cluster/software/XZ/5.2.5-GCCcore-11.2.0/lib/liblzma.so.5 > > > > > (0x00007f6e8a3b5000) > > > > > libevent_core-2.0.so.5 => /lib64/libevent_core-2.0.so.5 > > > > (0x00007f6e8a18a000) > > > > > libevent_pthreads-2.0.so.5 => /lib64/libevent_pthreads-2.0.so.5 > > > > > (0x00007f6e89f87000) > > > > > > > > > > Thanks very much, > > > > > Jianbo > > > > > > > > > > On Mon, Nov 7, 2022 at 6:01 PM Satish Balay > wrote: > > > > > > > > > > > Glad you have it working. Thanks for the update. > > > > > > > > > > > > Satish > > > > > > > > > > > > On Mon, 7 Nov 2022, Jianbo Long wrote: > > > > > > > > > > > > > Hi Satish and Barry, > > > > > > > > > > > > > > Thanks very much for the feedback ! > > > > > > > > > > > > > > It looks like my include file path was not correct ! 
> > > > > > > > > > > > > > Bests, > > > > > > > Jianbo > > > > > > > > > > > > > > > > > > > > > On Fri, Nov 4, 2022 at 6:08 AM Satish Balay > > > > > wrote: > > > > > > > > > > > > > > > For ex83f.F90: > > > > > > > > > > > > > > > > >>>>> > > > > > > > > balay at p1 /home/balay/test > > > > > > > > $ ls > > > > > > > > ex83f.F90 > > > > > > > > balay at p1 /home/balay/test > > > > > > > > $ ls > > > > > > > > ex83f.F90 > > > > > > > > balay at p1 /home/balay/test > > > > > > > > $ export PETSC_DIR=$HOME/petsc > > > > > > > > balay at p1 /home/balay/test > > > > > > > > $ cp $PETSC_DIR/src/ksp/ksp/tests/makefile . > > > > > > > > balay at p1 /home/balay/test > > > > > > > > $ make ex83f > > > > > > > > mpif90 -fPIC -Wall -ffree-line-length-none > -ffree-line-length-0 > > > > > > > > -Wno-lto-type-mismatch -Wno-unused-dummy-argument -g -O0 > > > > > > > > -I/home/balay/petsc/include > > > > > > > > -I/home/balay/petsc/arch-linux-c-debug/include ex83f.F90 > > > > > > > > -Wl,-rpath,/home/balay/petsc/arch-linux-c-debug/lib > > > > > > > > -L/home/balay/petsc/arch-linux-c-debug/lib > > > > > > > > -Wl,-rpath,/home/balay/soft/mpich-4.0.1/lib > > > > > > > > -L/home/balay/soft/mpich-4.0.1/lib > > > > > > > > -Wl,-rpath,/usr/lib/gcc/x86_64-redhat-linux/12 > > > > > > > > -L/usr/lib/gcc/x86_64-redhat-linux/12 -lpetsc -llapack > -lblas -lm > > > > -lX11 > > > > > > > > -lstdc++ -ldl -lmpifort -lmpi -lgfortran -lm -lgfortran -lm > -lgcc_s > > > > > > > > -lquadmath -lstdc++ -ldl -o ex83f > > > > > > > > balay at p1 /home/balay/test > > > > > > > > $ > > > > > > > > <<<<<< > > > > > > > > > > > > > > > > Also when you are adding PETSc to your current project - are > you > > > > using > > > > > > > > source files with .f or .f90 suffix? If so rename them to .F > or > > > > .F90 > > > > > > suffix. 
> > > > > > > > > > > > > > > > If you still have issues send more details - As Barry > indicated - > > > > the > > > > > > > > makefile [with the sources compiled by this makefile] - and > the > > > > > > compile log > > > > > > > > when you attempt to build these sources with this makefile. > > > > > > > > > > > > > > > > Satish > > > > > > > > > > > > > > > > On Thu, 3 Nov 2022, Barry Smith wrote: > > > > > > > > > > > > > > > > > > > > > > > > > > Please send your attempted makefile and we'll see if we > can get > > > > it > > > > > > > > working. > > > > > > > > > > > > > > > > > > I am not sure if we can organize the include files as > Fortran > > > > > > compiler > > > > > > > > include files easily. We've always used the preprocessor > approach. > > > > The > > > > > > > > Intel compiler docs indicate the procedure for finding the > Fortran > > > > > > compiler > > > > > > > > include files > > > > > > > > > > > > > > > > > > > https://www.intel.com/content/www/us/en/develop/documentation/fortran-compiler-oneapi-dev-guide-and-reference/top/program-structure/use-include-files.html > > > > > > > > is the same as for the preprocessor include files so I don't > > > > > > understand how > > > > > > > > the using the Fortran compiler include file approach would > make the > > > > > > > > makefiles any simpler for users? > > > > > > > > > > > > > > > > > > > > > > > > > > > Barry > > > > > > > > > > > > > > > > > > > > > > > > > > > > On Nov 3, 2022, at 8:58 PM, Jianbo Long < > > > > longtuteng249 at gmail.com> > > > > > > > > wrote: > > > > > > > > > > > > > > > > > > > > Hello, > > > > > > > > > > > > > > > > > > > > I'm struggling to make my FORTRAN code work with petsc > as I > > > > cannot > > > > > > > > link the required header files (e.g., petscksp.h) and > compiled > > > > library > > > > > > > > files to my FORTRAN code. > > > > > > > > > > > > > > > > > > > > Compiling petsc was not a problem. 
However, even with the > > > > fortran > > > > > > > > examples (see those on > https://petsc.org/main/docs/manual/fortran/ > > > > ) > > > > > > and > > > > > > > > the guide on using petsc in c++ and fortran codes (see > Section > > > > "Writing > > > > > > > > C/C++ or Fortran Applications" at > > > > > > > > https://petsc.org/main/docs/manual/getting_started/), I > still > > > > cannot > > > > > > make > > > > > > > > my FORTRAN code work. > > > > > > > > > > > > > > > > > > > > The Fortran test code is exactly the example code > ex83f.F90 > > > > (see > > > > > > > > attached files). Aftering following the 2nd method in the > Guide > > > > (see > > > > > > the > > > > > > > > picture below), I still get errors: > > > > > > > > > > > > > > > > > > > > petsc/finclude/petscksp.h: No such file or directory > > > > > > > > > > > > > > > > > > > > Even if I set up the path of the header file correctly > in my > > > > own > > > > > > > > makefile without using environment variables, I still can > only > > > > find the > > > > > > > > file "petscksp.h" for my code. Of course, the trouble is > that all > > > > other > > > > > > > > headers files required by KSP are recursively included in > this > > > > > > petscksp.h > > > > > > > > file, and I have no way to link them together for my Fortran > code. > > > > > > > > > > > > > > > > > > > > So, here are my questions: > > > > > > > > > > 1) in the Guide, how exactly are we supposed to set up > the > > > > > > environment > > > > > > > > variables PETSC_DIR and PETSC_ARCH ? More details and > examples > > > > would > > > > > > be > > > > > > > > extremely helpful ! > > > > > > > > > > 2) Is there a way to get rid of the preprocessor > statement > > > > > > > > > > #include > > > > > > > > > > when using c++/Fortran codes ? 
> > > > > > > > > > > > > > > > > > > > For example, when using MUMPS package in a Fortran code, > we can > > > > > > simply > > > > > > > > use compiler 'include', rather than a preprocessor, to > extract all > > > > > > required > > > > > > > > variables for the user's codes : > > > > > > > > > > INCLUDE 'zmumps_struc.h' > > > > > > > > > > where the header file zmumps_struc.h is already provided > in the > > > > > > > > package. Similarly, I think it's much more portable and > easier when > > > > > > using > > > > > > > > petsc in other codes if we can make it work to use petsc. > > > > > > > > > > > > > > > > > > > > (Note: similar issues were discussed before, see > > > > > > > > > > > > > > > > > > > https://lists.mcs.anl.gov/mailman/htdig/petsc-users/2019-January/037499.html > > > > > > . > > > > > > > > Unfortunately, I have no clue about the solution archived > there > > > > ...) > > > > > > > > > > > > > > > > > > > > Any thoughts and solutions would be much appreciated ! > > > > > > > > > > > > > > > > > > > > Thanks, > > > > > > > > > > Jianbo Long > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From edoardo.alinovi at gmail.com Tue Nov 8 11:04:48 2022 From: edoardo.alinovi at gmail.com (Edoardo alinovi) Date: Tue, 8 Nov 2022 18:04:48 +0100 Subject: [petsc-users] On PCFIELDSPLIT and its implementation Message-ID: Hello Guys, Thanks to your suggestions on the block matrices, my fully coupled solver is proceeding very well! I am now about to take advantage of the block structure of the matrix using PCFIELDSPLIT. 
I have learned a bit from the user manual and followed with interest this discussion in the mailing list: https://lists.mcs.anl.gov/pipermail/petsc-users/2015-February/024154.html which is actually the exact same situation I am in, so I guess most of the command line options will be copy and paste. I would like however to code them in fortran, as I usually provide some default implementation alongside the command line options. While coding some of the options I got an error here in PCFieldSplitSetFields() which looks to be undefined. I am importing petscksp, do I need to import something else maybe? Thank you! -------------- next part -------------- An HTML attachment was scrubbed... URL: From jed at jedbrown.org Tue Nov 8 11:09:40 2022 From: jed at jedbrown.org (Jed Brown) Date: Tue, 08 Nov 2022 10:09:40 -0700 Subject: [petsc-users] TSBEULER vs TSPSEUDO In-Reply-To: References: Message-ID: <87bkph5q5n.fsf@jedbrown.org> First, I believe arc-length continuation is the right approach in this problem domain. I have a branch starting an implementation, but need to revisit it in light of some feedback (and time has been too short lately). My group's nonlinear mechanics solver uses TSBEULER because it's convenient to parametrize loading on T=[0,1]. Unlike arc-length continuation, this can't handle snap-through effects. TSPSEUDO is the usual recommendation if you don't care about time accuracy, though you could register a custom controller for normal TS methods that implements any logic you'd like around automatically extending the time step without using a truncation error estimate. Note that displacement loading (as usually implemented) is really bad (especially for models with plasticity) because increments that are large relative to the mesh size can invert elements or initiate plastic yielding when that would not happen if using smaller increments. Arc-length continuation also helps fix that problem. 
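The "custom controller" idea above — accept and extend the step when the nonlinear solve converges easily, cut back when it struggles, with no truncation-error estimate involved — can be sketched in a few lines. This is illustrative pure Python, not the PETSc TSAdapt API; the function name, thresholds, and growth factors here are all made up:

```python
def choose_dt(dt, snes_its, dt_max, target_its=5, growth=1.5, cutback=0.5):
    """Step controller driven only by nonlinear-solver behavior.

    snes_its < 0 signals a diverged nonlinear solve. Returns the next
    step size and whether the current step is accepted.
    """
    if snes_its < 0:
        return dt * cutback, False              # reject, retry smaller
    if snes_its <= target_its:
        return min(dt * growth, dt_max), True   # accept and extend
    return dt, True                             # accept, hold the step

# Easy convergence grows the step; divergence shrinks and rejects it.
print(choose_dt(0.1, 3, dt_max=1.0))    # step grows toward 0.15
print(choose_dt(0.1, -1, dt_max=1.0))   # step cut back and rejected
```

Hooking logic like this into TS would go through a TSAdapt implementation rather than a hand-rolled loop, but the decision rule itself is this small.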
Note that you can use extrapolation (-ts_theta_initial_guess_extrapolate), though I've found this to be somewhat brittle and only reduce SNES iteration count by about 1 per time step. Francesc Levrero-Florencio writes: > Hi PETSc people, > > We are running highly nonlinear quasi-static (steady-state) mechanical finite element problems with PETSc, currently using TSBEULER and the basic time adapt scheme. > > What we do in order to tackle these nonlinear problems is to parametrize the applied loads with the time in the TS and apply them incrementally. While this usually works well, we have seen instances in which the adaptor would reject the time step according to the calculated truncation errors, even if the SNES converges in a small number of iterations. Another issue that we have recently observed is that in a sequence of converged time steps the adaptor decides to start cutting the time step to smaller and smaller values using the low clip default value of TSAdaptGetClip (again because the truncation errors are high enough). What can we do in order to avoid these issues? The first one is avoided by using TSAdaptSetAlwaysAccept, but the latter remains. We have tried setting the low clip value to its maximum accepted value of 1, but then the time increment does not increase even if the SNES always converges in 3 or 4 iterations. Maybe a solution is to increase the tolerances of the TSAdapt? > > Another potential solution we have recently tried in order to tackle these issues is using TSPSEUDO (and deparametrizing the applied loads), but generally find that it takes a much longer time to reach an acceptable solution compared with TSBEULER. We have mostly used the default KSPONLY option, but we'd like to explore TSPSEUDO with NEWTONLS. A first question would be: what happens if the SNES fails to converge, does the solution get updated somehow in the corresponding time step? 
We have performed a few tests with TSPSEUDO and NEWTONLS, setting the maximum number of SNES iterations to a relatively low number (e.g. 5), and then always setting the SNES as converged in the poststage function, and found that it performs reasonably well, at least better than with the default KSPONLY (does this make any sense?). > > Thanks a lot! > > Regards, > Francesc. From knepley at gmail.com Tue Nov 8 11:15:02 2022 From: knepley at gmail.com (Matthew Knepley) Date: Tue, 8 Nov 2022 12:15:02 -0500 Subject: [petsc-users] On PCFIELDSPLIT and its implementation In-Reply-To: References: Message-ID: On Tue, Nov 8, 2022 at 12:05 PM Edoardo alinovi wrote: > Hello Guys, > > Thanks to your suggestions on the block matrices, my fully coupled solver > is proceeding very well! > > I am now about to take advantage of the block structure of the matrix > using PCFIELDSPLIT. I have learned a bit from the user manual and followed > with interest this discussion in the mailing list: > https://lists.mcs.anl.gov/pipermail/petsc-users/2015-February/024154.html > which is actually the exact same situation I am in, so I guess most of the > command line options will be copy and paste. > > I would like however to code them in fortran, as I usually provide some > default implementation alongside the command line options. > > While coding some of the options I got an error here > in PCFieldSplitSetFields() which looks to be undefined. I am importing > petscksp, do I need to import something else maybe? > Since it uses arrays, we will have to write the Fortran wrapper by hand. I will see if I can do it soon. Thanks, Matt > Thank you! > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From francesc.levrero-florencio at ansys.com Tue Nov 8 11:29:27 2022 From: francesc.levrero-florencio at ansys.com (Francesc Levrero-Florencio) Date: Tue, 8 Nov 2022 17:29:27 +0000 Subject: [petsc-users] TSBEULER vs TSPSEUDO In-Reply-To: <87bkph5q5n.fsf@jedbrown.org> References: <87bkph5q5n.fsf@jedbrown.org> Message-ID: Hi Jed, Thanks for the answer. We do have a monolithic arc-length implementation based on the TS/SNES logic, but we are also exploring having a custom SNESSHELL because the arc-length logic is substantially more complex than that of traditional load-controlled continuation methods. It works quite well, the only "issue" is its initiation; we are currently performing load-control (or displacement loading as you mentioned) in the first time increment. Besides load-control and arc-length control, what other continuation methods would you suggest exploring? The test problem we are dealing with assumes plasticity but with small strains so we will not see any snap-throughs, snap-backs or similar. TSBEULER works quite well for this specific case and converges in a few time steps within around 5-10 SNES iterations per time step. What PETSc functions do you suggest exploring for implementing the TS time step extension control you mentioned? Since you mentioned -ts_theta_initial_guess_extrapolate, is it worth using it in highly nonlinear mechanical problems (such as plasticity)? It sounds quite useful if it consistently reduces SNES iterations by one per time step, as each linear solve is quite expensive for large problems. Regards, Francesc. ________________________________ From: Jed Brown Sent: 08 November 2022 17:09 To: Francesc Levrero-Florencio ; petsc-users at mcs.anl.gov Subject: Re: [petsc-users] TSBEULER vs TSPSEUDO [External Sender] First, I believe arc-length continuation is the right approach in this problem domain. 
I have a branch starting an implementation, but need to revisit it in light of some feedback (and time has been too short lately). My group's nonlinear mechanics solver uses TSBEULER because it's convenient to parametrize loading on T=[0,1]. Unlike arc-length continuation, this can't handle snap-through effects. TSPSEUDO is the usual recommendation if you don't care about time accuracy, though you could register a custom controller for normal TS methods that implements any logic you'd like around automatically extending the time step without using a truncation error estimate. Note that displacement loading (as usually implemented) is really bad (especially for models with plasticity) because increments that are large relative to the mesh size can invert elements or initiate plastic yielding when that would not happen if using smaller increments. Arc-length continuation also helps fix that problem. Note that you can use extrapolation (-ts_theta_initial_guess_extrapolate), though I've found this to be somewhat brittle and only reduce SNES iteration count by about 1 per time step. Francesc Levrero-Florencio writes: > Hi PETSc people, > > We are running highly nonlinear quasi-static (steady-state) mechanical finite element problems with PETSc, currently using TSBEULER and the basic time adapt scheme. > > What we do in order to tackle these nonlinear problems is to parametrize the applied loads with the time in the TS and apply them incrementally. While this usually works well, we have seen instances in which the adaptor would reject the time step according to the calculated truncation errors, even if the SNES converges in a small number of iterations. Another issue that we have recently observed is that in a sequence of converged time steps the adaptor decides to start cutting the time step to smaller and smaller values using the low clip default value of TSAdaptGetClip (again because the truncation errors are high enough). 
What can we do in order to avoid these issues? The first one is avoided by using TSAdaptSetAlwaysAccept, but the latter remains. We have tried setting the low clip value to its maximum accepted value of 1, but then the time increment does not increase even if the SNES always converges in 3 or 4 iterations. Maybe a solution is to increase the tolerances of the TSAdapt? > > Another potential solution we have recently tried in order to tackle these issues is using TSPSEUDO (and deparametrizing the applied loads), but generally find that it takes a much longer time to reach an acceptable solution compared with TSBEULER. We have mostly used the default KSPONLY option, but we'd like to explore TSPSEUDO with NEWTONLS. A first question would be: what happens if the SNES fails to converge, does the solution get updated somehow in the corresponding time step? We have performed a few tests with TSPSEUDO and NEWTONLS, setting the maximum number of SNES iterations to a relatively low number (e.g. 5), and then always setting the SNES as converged in the poststage function, and found that it performs reasonably well, at least better than with the default KSPONLY (does this make any sense?). > > Thanks a lot! > > Regards, > Francesc. -------------- next part -------------- An HTML attachment was scrubbed... URL: From jed at jedbrown.org Tue Nov 8 11:45:28 2022 From: jed at jedbrown.org (Jed Brown) Date: Tue, 08 Nov 2022 10:45:28 -0700 Subject: [petsc-users] TSBEULER vs TSPSEUDO In-Reply-To: References: <87bkph5q5n.fsf@jedbrown.org> Message-ID: <877d055ohz.fsf@jedbrown.org> Francesc Levrero-Florencio writes: > Hi Jed, > > Thanks for the answer. > > We do have a monolithic arc-length implementation based on the TS/SNES logic, but we are also exploring having a custom SNESSHELL because the arc-length logic is substantially more complex than that of traditional load-controlled continuation methods. 
It works quite well, the only "issue" is its initiation; we are currently performing load-control (or displacement loading as you mentioned) in the first time increment. Besides load-control and arc-length control, what other continuation methods would you suggest exploring? Those are the main ones, and they're all simple expressions for the constraint condition that you have to handle for arc-length methods, thus suitable to make extensible. Wriggers' book has a nice discussion and table. I imagine we'll get some more experience with the tradeoffs after I add it to SNES. > The test problem we are dealing with assumes plasticity but with small strains so we will not see any snap-throughs, snap-backs or similar. TSBEULER works quite well for this specific case and converges in a few time steps within around 5-10 SNES iterations per time step. What PETSc functions do you suggest exploring for implementing the TS time step extension control you mentioned? Check out src/ts/adapt/impls/ for the current implementations. > Since you mentioned -ts_theta_initial_guess_extrapolate, is it worth using it in highly nonlinear mechanical problems (such as plasticity)? It sounds quite useful if it consistently reduces SNES iterations by one per time step, as each linear solve is quite expensive for large problems. I found sometimes it overshoots and thus causes problems, so effectiveness was problem-dependent. It's just a run-time flag so check it out. I'm curious if you have experience using BFGS with Jacobian scaling (either a V-cycle or a sparse direct solve) instead of Newton. You can try it using -snes_type qn -snes_qn_scale_type jacobian. This can greatly reduce the number of assemblies and preconditioner setups, and we find it also reduces total number of V-cycles so is effective even with our matrix-free p-MG (which are very fast and have much lower setup costs, https://arxiv.org/abs/2204.01722). 
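To make the constraint condition being discussed concrete, here is a toy scalar sketch of a spherical (Crisfield-type) arc-length step — purely illustrative Python, not the PETSc/SNES implementation, with a softening response chosen so that pure load control would stall at the limit point:

```python
def arc_length_step(fint, dfdu, p, u0, lam0, ds, psi=1.0, tol=1e-10, maxit=50):
    """One spherical (Crisfield-type) arc-length step on a scalar model.

    Solves fint(u) - lam*p = 0 together with the constraint
    (u - u0)**2 + psi**2*(lam - lam0)**2 = ds**2, so the load factor
    lam is an unknown and limit points can be traversed.
    """
    u, lam = u0 + ds, lam0                    # crude forward predictor
    for _ in range(maxit):
        r1 = fint(u) - lam * p
        r2 = (u - u0) ** 2 + psi ** 2 * (lam - lam0) ** 2 - ds ** 2
        if abs(r1) + abs(r2) < tol:
            return u, lam
        # Newton on the augmented 2x2 system (equilibrium + constraint row)
        a, b = dfdu(u), -p
        c, d = 2.0 * (u - u0), 2.0 * psi ** 2 * (lam - lam0)
        det = a * d - b * c
        u -= (d * r1 - b * r2) / det
        lam -= (a * r2 - c * r1) / det
    raise RuntimeError("arc-length step did not converge")

# Softening response with a limit point near u ~ 0.42 (max load ~ 0.38):
fint = lambda u: u * (u - 1.0) * (u - 2.0)
dfdu = lambda u: 3.0 * u * u - 6.0 * u + 2.0
path = [(0.0, 0.0)]
for _ in range(10):
    path.append(arc_length_step(fint, dfdu, 1.0, *path[-1], ds=0.2))
# Every converged point sits on the equilibrium curve, including past
# the limit load, where fixed-load continuation would diverge.
```

The same structure carries over to the vector case: the scalar constraint row and its derivatives are appended to the usual Newton system, which is the "simple expression for the constraint condition" referred to above.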
From alexlindsay239 at gmail.com Tue Nov 8 18:53:02 2022 From: alexlindsay239 at gmail.com (Alexander Lindsay) Date: Tue, 8 Nov 2022 16:53:02 -0800 Subject: [petsc-users] Local columns of A10 do not equal local rows of A00 In-Reply-To: References: Message-ID: This is from our DMCreateFieldDecomposition_Moose routine. The IS size on process 1 (which is the process from which I took the error in the original post) is reported as 4129 which is consistent with the row size of A00. Split '0' has local size 4129 on processor 1 Split '0' has local size 4484 on processor 6 Split '0' has local size 4471 on processor 12 Split '0' has local size 4040 on processor 14 Split '0' has local size 3594 on processor 20 Split '0' has local size 4423 on processor 22 Split '0' has local size 2791 on processor 27 Split '0' has local size 3014 on processor 29 Split '0' has local size 3183 on processor 30 Split '0' has local size 3328 on processor 3 Split '0' has local size 4689 on processor 4 Split '0' has local size 8016 on processor 8 Split '0' has local size 6367 on processor 10 Split '0' has local size 5973 on processor 17 Split '0' has local size 4431 on processor 18 Split '0' has local size 7564 on processor 25 Split '0' has local size 12504 on processor 9 Split '0' has local size 10081 on processor 11 Split '0' has local size 13808 on processor 24 Split '0' has local size 14049 on processor 31 Split '0' has local size 15324 on processor 7 Split '0' has local size 15337 on processor 15 Split '0' has local size 14849 on processor 19 Split '0' has local size 15660 on processor 23 Split '0' has local size 14728 on processor 26 Split '0' has local size 15724 on processor 28 Split '0' has local size 17249 on processor 5 Split '0' has local size 15519 on processor 13 Split '0' has local size 16511 on processor 16 Split '0' has local size 16496 on processor 21 Split '0' has local size 18291 on processor 2 Split '0' has local size 18042 on processor 0 On Mon, Nov 7, 2022 at 6:04 PM Matthew 
Knepley wrote: > On Mon, Nov 7, 2022 at 5:48 PM Alexander Lindsay > wrote: > >> My understanding looking at PCFieldSplitSetDefaults is that our >> implementation of `createfielddecomposition` should get called, we'll set >> `fields` and then (ignoring possible user setting of >> -pc_fieldsplit_%D_fields flag) PCFieldSplitSetIS will get called with >> whatever we did to `fields`. So yea I guess that just looking over that I >> would assume we're not supplying two different index sets for rows and >> columns, or put more precisely we (MOOSE) are not really afforded the >> opportunity to. But my interpretation could very well be wrong. >> > > Oh wait. I read the error message again. It does not say that the whole > selection is rectangular. It says > > Local columns of A10 4137 do not equal local rows of A00 4129 > > So this is a parallel partitioning thing. Since A00 has 4129 local rows, > it should have this many columns as well. > However A10 has 4137 local columns. How big is IS_0, on each process, that > you pass in to PCFIELDSPLIT? > > Thanks, > > Matt > > >> On Mon, Nov 7, 2022 at 12:33 PM Matthew Knepley >> wrote: >> >>> On Mon, Nov 7, 2022 at 2:09 PM Alexander Lindsay < >>> alexlindsay239 at gmail.com> wrote: >>> >>>> The libMesh/MOOSE specific code that identifies dof indices for >>>> ISCreateGeneral is in DMooseGetEmbedding_Private. I can share that function >>>> (it's quite long) or more details if that could be helpful. >>>> >>> >>> Sorry, I should have written more. The puzzling thing for me is that >>> somehow it looks like the row and column index sets are not the same. I did >>> not think >>> PCFIELDSPLIT could do that. The PCFieldSplitSetIS() interface does not >>> allow it. I was wondering how you were setting the ISes. >>> >>> Thanks, >>> >>> Matt >>> >>> >>>> On Mon, Nov 7, 2022 at 10:55 AM Alexander Lindsay < >>>> alexlindsay239 at gmail.com> wrote: >>>> >>>>> I'm not sure exactly what you mean, but I'll try to give more details. 
>>>>> We have our own DM class (DM_Moose) and we set our own field and domain >>>>> decomposition routines: >>>>> >>>>> dm->ops->createfielddecomposition = >>>>> DMCreateFieldDecomposition_Moose; >>>>> >>>>> dm->ops->createdomaindecomposition = >>>>> DMCreateDomainDecomposition_Moose; >>>>> >>>>> >>>>> The field and domain decomposition routines are as follows (can see >>>>> also at >>>>> https://github.com/idaholab/moose/blob/next/framework/src/utils/PetscDMMoose.C >>>>> ): >>>>> >>>>> static PetscErrorCode >>>>> DMCreateFieldDecomposition_Moose( >>>>> DM dm, PetscInt * len, char *** namelist, IS ** islist, DM ** >>>>> dmlist) >>>>> { >>>>> PetscErrorCode ierr; >>>>> DM_Moose * dmm = (DM_Moose *)(dm->data); >>>>> >>>>> PetscFunctionBegin; >>>>> /* Only called after DMSetUp(). */ >>>>> if (!dmm->_splitlocs) >>>>> PetscFunctionReturn(0); >>>>> *len = dmm->_splitlocs->size(); >>>>> if (namelist) >>>>> { >>>>> ierr = PetscMalloc(*len * sizeof(char *), namelist); >>>>> CHKERRQ(ierr); >>>>> } >>>>> if (islist) >>>>> { >>>>> ierr = PetscMalloc(*len * sizeof(IS), islist); >>>>> CHKERRQ(ierr); >>>>> } >>>>> if (dmlist) >>>>> { >>>>> ierr = PetscMalloc(*len * sizeof(DM), dmlist); >>>>> CHKERRQ(ierr); >>>>> } >>>>> for (const auto & dit : *(dmm->_splitlocs)) >>>>> { >>>>> unsigned int d = dit.second; >>>>> std::string dname = dit.first; >>>>> DM_Moose::SplitInfo & dinfo = (*dmm->_splits)[dname]; >>>>> if (!dinfo._dm) >>>>> { >>>>> ierr = DMCreateMoose(((PetscObject)dm)->comm, *dmm->_nl, >>>>> &dinfo._dm); >>>>> CHKERRQ(ierr); >>>>> ierr = PetscObjectSetOptionsPrefix((PetscObject)dinfo._dm, >>>>> ((PetscObject)dm)->prefix); >>>>> CHKERRQ(ierr); >>>>> std::string suffix = std::string("fieldsplit_") + dname + "_"; >>>>> ierr = PetscObjectAppendOptionsPrefix((PetscObject)dinfo._dm, >>>>> suffix.c_str()); >>>>> CHKERRQ(ierr); >>>>> } >>>>> ierr = DMSetFromOptions(dinfo._dm); >>>>> CHKERRQ(ierr); >>>>> ierr = DMSetUp(dinfo._dm); >>>>> CHKERRQ(ierr); >>>>> if (namelist) >>>>> { 
>>>>> ierr = PetscStrallocpy(dname.c_str(), (*namelist) + d); >>>>> CHKERRQ(ierr); >>>>> } >>>>> if (islist) >>>>> { >>>>> if (!dinfo._rembedding) >>>>> { >>>>> IS dembedding, lembedding; >>>>> ierr = DMMooseGetEmbedding_Private(dinfo._dm, &dembedding); >>>>> CHKERRQ(ierr); >>>>> if (dmm->_embedding) >>>>> { >>>>> // Create a relative embedding into the parent's index space. >>>>> ierr = ISEmbed(dembedding, dmm->_embedding, PETSC_TRUE, >>>>> &lembedding); >>>>> CHKERRQ(ierr); >>>>> const PetscInt * lindices; >>>>> PetscInt len, dlen, llen, *rindices, off, i; >>>>> ierr = ISGetLocalSize(dembedding, &dlen); >>>>> CHKERRQ(ierr); >>>>> ierr = ISGetLocalSize(lembedding, &llen); >>>>> CHKERRQ(ierr); >>>>> if (llen != dlen) >>>>> SETERRQ1(((PetscObject)dm)->comm, PETSC_ERR_PLIB, "Failed >>>>> to embed split %D", d); >>>>> ierr = ISDestroy(&dembedding); >>>>> CHKERRQ(ierr); >>>>> // Convert local embedding to global (but still relative) >>>>> embedding >>>>> ierr = PetscMalloc(llen * sizeof(PetscInt), &rindices); >>>>> CHKERRQ(ierr); >>>>> ierr = ISGetIndices(lembedding, &lindices); >>>>> CHKERRQ(ierr); >>>>> ierr = PetscMemcpy(rindices, lindices, llen * >>>>> sizeof(PetscInt)); >>>>> CHKERRQ(ierr); >>>>> ierr = ISDestroy(&lembedding); >>>>> CHKERRQ(ierr); >>>>> // We could get the index offset from a corresponding global >>>>> vector, but subDMs don't yet >>>>> // have global vectors >>>>> ierr = ISGetLocalSize(dmm->_embedding, &len); >>>>> CHKERRQ(ierr); >>>>> >>>>> ierr = MPI_Scan(&len, >>>>> &off, >>>>> 1, >>>>> #ifdef PETSC_USE_64BIT_INDICES >>>>> MPI_LONG_LONG_INT, >>>>> #else >>>>> MPI_INT, >>>>> #endif >>>>> MPI_SUM, >>>>> ((PetscObject)dm)->comm); >>>>> CHKERRQ(ierr); >>>>> >>>>> off -= len; >>>>> for (i = 0; i < llen; ++i) >>>>> rindices[i] += off; >>>>> ierr = ISCreateGeneral( >>>>> ((PetscObject)dm)->comm, llen, rindices, >>>>> PETSC_OWN_POINTER, &(dinfo._rembedding)); >>>>> CHKERRQ(ierr); >>>>> } >>>>> else >>>>> { >>>>> dinfo._rembedding = dembedding; >>>>> } 
>>>>> } >>>>> ierr = PetscObjectReference((PetscObject)(dinfo._rembedding)); >>>>> CHKERRQ(ierr); >>>>> (*islist)[d] = dinfo._rembedding; >>>>> } >>>>> if (dmlist) >>>>> { >>>>> ierr = PetscObjectReference((PetscObject)dinfo._dm); >>>>> CHKERRQ(ierr); >>>>> (*dmlist)[d] = dinfo._dm; >>>>> } >>>>> } >>>>> PetscFunctionReturn(0); >>>>> } >>>>> >>>>> static PetscErrorCode >>>>> DMCreateDomainDecomposition_Moose( >>>>> DM dm, PetscInt * len, char *** namelist, IS ** innerislist, IS ** >>>>> outerislist, DM ** dmlist) >>>>> { >>>>> PetscErrorCode ierr; >>>>> >>>>> PetscFunctionBegin; >>>>> /* Use DMCreateFieldDecomposition_Moose() to obtain everything but >>>>> outerislist, which is currently >>>>> * PETSC_NULL. */ >>>>> if (outerislist) >>>>> *outerislist = PETSC_NULL; /* FIX: allow mesh-based overlap. */ >>>>> ierr = DMCreateFieldDecomposition_Moose(dm, len, namelist, >>>>> innerislist, dmlist); >>>>> CHKERRQ(ierr); >>>>> PetscFunctionReturn(0); >>>>> } >>>>> >>>>> >>>>> >>>>> On Thu, Nov 3, 2022 at 5:19 PM Matthew Knepley >>>>> wrote: >>>>> >>>>>> On Thu, Nov 3, 2022 at 7:52 PM Alexander Lindsay < >>>>>> alexlindsay239 at gmail.com> wrote: >>>>>> >>>>>>> I have errors on quite a few (but not all) processes of the like >>>>>>> >>>>>>> [1]PETSC ERROR: --------------------- Error Message >>>>>>> -------------------------------------------------------------- >>>>>>> [1]PETSC ERROR: Nonconforming object sizes >>>>>>> [1]PETSC ERROR: Local columns of A10 4137 do not equal local rows of >>>>>>> A00 4129 >>>>>>> >>>>>>> when performing field splits. We (MOOSE) have some code for >>>>>>> identifying the index sets for each split. However, the code was written by >>>>>>> some authors who are no longer with us. Normally I would chase this down in >>>>>>> a debugger, but this error only seems to crop up for pretty complex and >>>>>>> large meshes. If anyone has an idea for what we might be doing wrong, that >>>>>>> might help me chase this down faster. 
I guess intuitively I'm pretty >>>>>>> perplexed that we could get ourselves into this pickle as it almost appears >>>>>>> that we have two different local dof index counts for a given block (0 in >>>>>>> this case). More background, if helpful, can be found in >>>>>>> https://github.com/idaholab/moose/issues/22359 as well as >>>>>>> https://github.com/idaholab/moose/discussions/22468. >>>>>>> >>>>>> >>>>>> How are you specifying the blocks? I would not have thought this was >>>>>> possible. >>>>>> >>>>>> Thanks, >>>>>> >>>>>> Matt >>>>>> >>>>>> >>>>>>> I should note that we are currently running with 3.16.6 as our PETSc >>>>>>> submodule hash (we are talking about updating to 3.18 soon). >>>>>>> >>>>>> >>>>>> >>>>>> -- >>>>>> What most experimenters take for granted before they begin their >>>>>> experiments is infinitely more interesting than any results to which their >>>>>> experiments lead. >>>>>> -- Norbert Wiener >>>>>> >>>>>> https://www.cse.buffalo.edu/~knepley/ >>>>>> >>>>>> >>>>> >>> >>> -- >>> What most experimenters take for granted before they begin their >>> experiments is infinitely more interesting than any results to which their >>> experiments lead. >>> -- Norbert Wiener >>> >>> https://www.cse.buffalo.edu/~knepley/ >>> >>> >> > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > > https://www.cse.buffalo.edu/~knepley/ > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From bourdin at mcmaster.ca Tue Nov 8 20:13:56 2022 From: bourdin at mcmaster.ca (Blaise Bourdin) Date: Wed, 9 Nov 2022 02:13:56 +0000 Subject: [petsc-users] Reference element in DMPlexComputeCellGeometryAffineFEM Message-ID: Hi, What reference simplex is DMPlexComputeCellGeometryAffineFEM using in 2 and 3D? 
I am used to computing my shape functions on the unit simplex (vertices at the origin and each e_i), but it does not look to be the reference simplex in this function: In 3D, for the unit simplex with vertices at (0,0,0) (1,0,0) (0,1,0) (0,0,1) (in this order), I get J = 1 / 2 . [[-1,-1,-1],[1,0,0],[0,0,1]] and v0 = [0,0,1] In 2D, for the unit simplex with vertices at (0,0), (1,0), and (0,1), I get J = 1 / 2. I and v0 = [0,0], which does not make any sense to me (I was assuming that the 2D reference simplex had vertices at (-1,-1), (1, -1) and (-1,1), but if this were the case, v0 would not be 0). I can build a simple example with meshes consisting only of the unit simplex in 2D and 3D if that would help. Regards, Blaise ? Canada Research Chair in Mathematical and Computational Aspects of Solid Mechanics (Tier 1) Professor, Department of Mathematics & Statistics Hamilton Hall room 409A, McMaster University 1280 Main Street West, Hamilton, Ontario L8S 4K1, Canada https://www.math.mcmaster.ca/bourdin | +1 (905) 525 9140 ext. 27243 From edoardo.alinovi at gmail.com Wed Nov 9 01:20:06 2022 From: edoardo.alinovi at gmail.com (Edoardo alinovi) Date: Wed, 9 Nov 2022 08:20:06 +0100 Subject: [petsc-users] On PCFIELDSPLIT and its implementation In-Reply-To: References: Message-ID: Hello guys, I am getting this error while using fieldsplit: [3]PETSC ERROR: --------------------- Error Message -------------------------------------------------------------- *[3]PETSC ERROR: Nonconforming object sizes[3]PETSC ERROR: Local column sizes 6132 do not add up to total number of columns 9200* [3]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting. 
[3]PETSC ERROR: Petsc Development GIT revision: v3.18.1-191-g32ed6ae2ff2 GIT Date: 2022-11-08 12:22:17 -0500
[3]PETSC ERROR: flubio_coupled on a gnu named alienware by edo Wed Nov 9 08:16:29 2022
[3]PETSC ERROR: Configure options PETSC_ARCH=gnu FOPTFLAGS=-O3 COPTFLAGS=-O3 CXXOPTFLAGS=-O3 -with-debugging=no -download-fblaslapack=1 -download-superlu_dist -download-mumps -download-hypre -download-metis -download-parmetis -download-scalapack -download-ml -download-slepc -download-hpddm -download-cmake -with-mpi-dir=/home/edo/software/openmpi-4.1.1/build/
[3]PETSC ERROR: #1 MatCreateSubMatrix_MPIBAIJ_Private() at /home/edo/software/petsc/src/mat/impls/baij/mpi/mpibaij.c:1987
[3]PETSC ERROR: #2 MatCreateSubMatrix_MPIBAIJ() at /home/edo/software/petsc/src/mat/impls/baij/mpi/mpibaij.c:1911
[3]PETSC ERROR: #3 MatCreateSubMatrix() at /home/edo/software/petsc/src/mat/interface/matrix.c:8340
[3]PETSC ERROR: #4 PCSetUp_FieldSplit() at /home/edo/software/petsc/src/ksp/pc/impls/fieldsplit/fieldsplit.c:657
[3]PETSC ERROR: #5 PCSetUp() at /home/edo/software/petsc/src/ksp/pc/interface/precon.c:994
[3]PETSC ERROR: #6 KSPSetUp() at /home/edo/software/petsc/src/ksp/ksp/interface/itfunc.c:406
[3]PETSC ERROR: #7 KSPSolve_Private() at /home/edo/software/petsc/src/ksp/ksp/interface/itfunc.c:825
[3]PETSC ERROR: #8 KSPSolve() at /home/edo/software/petsc/src/ksp/ksp/interface/itfunc.c:1071

Do you have any ideas? Probably something missing in my brief implementation here:

    call PCSetType(mypc, PCFIELDSPLIT, ierr)
    call PCFieldSplitSetBlockSize(mypc, 4-bdim, ierr)

    ! 2D, 3x3 block
    if(bdim==1) then
       ufields(1) = 0
       ufields(2) = 1
       pfields(1) = 2
       call PCFieldSplitSetFields(mypc, "u", 2, ufields, ufields, ierr)
       call PCFieldSplitSetFields(mypc, "p", 1, pfields, pfields, ierr)
    ! 3D 4x4 block
    else
       ufields(1) = 0
       ufields(2) = 1
       ufields(3) = 2
       pfields(1) = 3
       call PCFieldSplitSetFields(mypc, "u", 3, ufields, ufields, ierr)
       call PCFieldSplitSetFields(mypc, "p", 1, pfields, pfields, ierr)
    endif !
Field split type ADDITIVE, MULTIPLICATIVE (default), SYMMETRIC_MULTIPLICATIVE, SPECIAL, SCHUR call PCFieldSplitSetType(mypc, PC_COMPOSITE_SCHUR, ierr)* Thanks for the help! -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Wed Nov 9 06:52:33 2022 From: knepley at gmail.com (Matthew Knepley) Date: Wed, 9 Nov 2022 07:52:33 -0500 Subject: [petsc-users] On PCFIELDSPLIT and its implementation In-Reply-To: References: Message-ID: Fields are numbered from 0. Thanks, Matt On Wed, Nov 9, 2022 at 2:20 AM Edoardo alinovi wrote: > Hello guys, > > I am getting this error while using fieldsplit: > > [3]PETSC ERROR: --------------------- Error Message > -------------------------------------------------------------- > > *[3]PETSC ERROR: Nonconforming object sizes[3]PETSC ERROR: Local column > sizes 6132 do not add up to total number of columns 9200* > [3]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting. > [3]PETSC ERROR: Petsc Development GIT revision: v3.18.1-191-g32ed6ae2ff2 > GIT Date: 2022-11-08 12:22:17 -0500 > [3]PETSC ERROR: flubio_coupled on a gnu named alienware by edo Wed Nov 9 > 08:16:29 2022 > [3]PETSC ERROR: Configure options PETSC_ARCH=gnu FOPTFLAGS=-O3 > COPTFLAGS=-O3 CXXOPTFLAGS=-O3 -with-debugging=no -download-fblaslapack=1 > -download-superlu_dist -download-mumps -download-hypre -download-metis > -download-parmetis -download-scalapack -download-ml -download-slepc > -download-hpddm -download-cmake > -with-mpi-dir=/home/edo/software/openmpi-4.1.1/build/ > [3]PETSC ERROR: #1 MatCreateSubMatrix_MPIBAIJ_Private() at > /home/edo/software/petsc/src/mat/impls/baij/mpi/mpibaij.c:1987 > [3]PETSC ERROR: #2 MatCreateSubMatrix_MPIBAIJ() at > /home/edo/software/petsc/src/mat/impls/baij/mpi/mpibaij.c:1911 > [3]PETSC ERROR: #3 MatCreateSubMatrix() at > /home/edo/software/petsc/src/mat/interface/matrix.c:8340 > [3]PETSC ERROR: #4 PCSetUp_FieldSplit() at > 
/home/edo/software/petsc/src/ksp/pc/impls/fieldsplit/fieldsplit.c:657 > [3]PETSC ERROR: #5 PCSetUp() at > /home/edo/software/petsc/src/ksp/pc/interface/precon.c:994 > [3]PETSC ERROR: #6 KSPSetUp() at > /home/edo/software/petsc/src/ksp/ksp/interface/itfunc.c:406 > [3]PETSC ERROR: #7 KSPSolve_Private() at > /home/edo/software/petsc/src/ksp/ksp/interface/itfunc.c:825 > [3]PETSC ERROR: #8 KSPSolve() at > /home/edo/software/petsc/src/ksp/ksp/interface/itfunc.c:1071 > > Do you have any ideas? Probably something missing in my brief > implementation here: > > > > > * call PCSetType(mypc, PCFIELDSPLIT, ierr) call > PCFieldSplitSetBlockSize(mypc, 4-bdim, ierr) * > > > > > > > > > > > > > > > > > > > > > > > * !2D, 3x3 block if(bdim==1) then > ufields(1) = 0 ufields(2) = 1 pfields(1) = 2 > call PCFieldSplitSetFields(mypc, "u", 2, ufields, ufields, > ierr) call PCFieldSplitSetFields(mypc, "p", 1, pfields, > pfields, ierr) ! 3D 4x4 block else > ufields(1) = 0 ufields(2) = 1 ufields(3) = 2 > pfields(1) = 3 call > PCFieldSplitSetFields(mypc, "u", 3, ufields, ufields, ierr) > call PCFieldSplitSetFields(mypc, "p", 1, pfields, pfields, ierr) > endif ! Field split type ADDITIVE, MULTIPLICATIVE > (default), SYMMETRIC_MULTIPLICATIVE, SPECIAL, SCHUR call > PCFieldSplitSetType(mypc, PC_COMPOSITE_SCHUR, ierr)* > > Thanks for the help! > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From edoardo.alinovi at gmail.com Wed Nov 9 06:54:55 2022 From: edoardo.alinovi at gmail.com (Edoardo alinovi) Date: Wed, 9 Nov 2022 13:54:55 +0100 Subject: [petsc-users] On PCFIELDSPLIT and its implementation In-Reply-To: References: Message-ID: Even in the fortran interface? Il Mer 9 Nov 2022, 13:52 Matthew Knepley ha scritto: > Fields are numbered from 0. 
> > Thanks, > > Matt > > On Wed, Nov 9, 2022 at 2:20 AM Edoardo alinovi > wrote: > >> Hello guys, >> >> I am getting this error while using fieldsplit: >> >> [3]PETSC ERROR: --------------------- Error Message >> -------------------------------------------------------------- >> >> *[3]PETSC ERROR: Nonconforming object sizes[3]PETSC ERROR: Local column >> sizes 6132 do not add up to total number of columns 9200* >> [3]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting. >> [3]PETSC ERROR: Petsc Development GIT revision: v3.18.1-191-g32ed6ae2ff2 >> GIT Date: 2022-11-08 12:22:17 -0500 >> [3]PETSC ERROR: flubio_coupled on a gnu named alienware by edo Wed Nov 9 >> 08:16:29 2022 >> [3]PETSC ERROR: Configure options PETSC_ARCH=gnu FOPTFLAGS=-O3 >> COPTFLAGS=-O3 CXXOPTFLAGS=-O3 -with-debugging=no -download-fblaslapack=1 >> -download-superlu_dist -download-mumps -download-hypre -download-metis >> -download-parmetis -download-scalapack -download-ml -download-slepc >> -download-hpddm -download-cmake >> -with-mpi-dir=/home/edo/software/openmpi-4.1.1/build/ >> [3]PETSC ERROR: #1 MatCreateSubMatrix_MPIBAIJ_Private() at >> /home/edo/software/petsc/src/mat/impls/baij/mpi/mpibaij.c:1987 >> [3]PETSC ERROR: #2 MatCreateSubMatrix_MPIBAIJ() at >> /home/edo/software/petsc/src/mat/impls/baij/mpi/mpibaij.c:1911 >> [3]PETSC ERROR: #3 MatCreateSubMatrix() at >> /home/edo/software/petsc/src/mat/interface/matrix.c:8340 >> [3]PETSC ERROR: #4 PCSetUp_FieldSplit() at >> /home/edo/software/petsc/src/ksp/pc/impls/fieldsplit/fieldsplit.c:657 >> [3]PETSC ERROR: #5 PCSetUp() at >> /home/edo/software/petsc/src/ksp/pc/interface/precon.c:994 >> [3]PETSC ERROR: #6 KSPSetUp() at >> /home/edo/software/petsc/src/ksp/ksp/interface/itfunc.c:406 >> [3]PETSC ERROR: #7 KSPSolve_Private() at >> /home/edo/software/petsc/src/ksp/ksp/interface/itfunc.c:825 >> [3]PETSC ERROR: #8 KSPSolve() at >> /home/edo/software/petsc/src/ksp/ksp/interface/itfunc.c:1071 >> >> Do you have any ideas? 
Probably something missing in my brief >> implementation here: >> >> >> >> >> * call PCSetType(mypc, PCFIELDSPLIT, ierr) call >> PCFieldSplitSetBlockSize(mypc, 4-bdim, ierr) * >> >> >> >> >> >> >> >> >> >> >> >> >> >> >> >> >> >> >> >> >> >> >> * !2D, 3x3 block if(bdim==1) then >> ufields(1) = 0 ufields(2) = 1 pfields(1) = 2 >> call PCFieldSplitSetFields(mypc, "u", 2, ufields, ufields, >> ierr) call PCFieldSplitSetFields(mypc, "p", 1, pfields, >> pfields, ierr) ! 3D 4x4 block else >> ufields(1) = 0 ufields(2) = 1 ufields(3) = 2 >> pfields(1) = 3 call >> PCFieldSplitSetFields(mypc, "u", 3, ufields, ufields, ierr) >> call PCFieldSplitSetFields(mypc, "p", 1, pfields, pfields, ierr) >> endif ! Field split type ADDITIVE, MULTIPLICATIVE >> (default), SYMMETRIC_MULTIPLICATIVE, SPECIAL, SCHUR call >> PCFieldSplitSetType(mypc, PC_COMPOSITE_SCHUR, ierr)* >> >> Thanks for the help! >> > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > > https://www.cse.buffalo.edu/~knepley/ > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From edoardo.alinovi at gmail.com Wed Nov 9 06:57:17 2022 From: edoardo.alinovi at gmail.com (Edoardo alinovi) Date: Wed, 9 Nov 2022 13:57:17 +0100 Subject: [petsc-users] On PCFIELDSPLIT and its implementation In-Reply-To: References: Message-ID: To be clear, You are suggesting to use ufields(0)=0, ufields(1)=1 and so on? Il Mer 9 Nov 2022, 13:54 Edoardo alinovi ha scritto: > Even in the fortran interface? > > Il Mer 9 Nov 2022, 13:52 Matthew Knepley ha scritto: > >> Fields are numbered from 0. 
>> >> Thanks, >> >> Matt >> >> On Wed, Nov 9, 2022 at 2:20 AM Edoardo alinovi >> wrote: >> >>> Hello guys, >>> >>> I am getting this error while using fieldsplit: >>> >>> [3]PETSC ERROR: --------------------- Error Message >>> -------------------------------------------------------------- >>> >>> *[3]PETSC ERROR: Nonconforming object sizes[3]PETSC ERROR: Local column >>> sizes 6132 do not add up to total number of columns 9200* >>> [3]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting. >>> [3]PETSC ERROR: Petsc Development GIT revision: v3.18.1-191-g32ed6ae2ff2 >>> GIT Date: 2022-11-08 12:22:17 -0500 >>> [3]PETSC ERROR: flubio_coupled on a gnu named alienware by edo Wed Nov >>> 9 08:16:29 2022 >>> [3]PETSC ERROR: Configure options PETSC_ARCH=gnu FOPTFLAGS=-O3 >>> COPTFLAGS=-O3 CXXOPTFLAGS=-O3 -with-debugging=no -download-fblaslapack=1 >>> -download-superlu_dist -download-mumps -download-hypre -download-metis >>> -download-parmetis -download-scalapack -download-ml -download-slepc >>> -download-hpddm -download-cmake >>> -with-mpi-dir=/home/edo/software/openmpi-4.1.1/build/ >>> [3]PETSC ERROR: #1 MatCreateSubMatrix_MPIBAIJ_Private() at >>> /home/edo/software/petsc/src/mat/impls/baij/mpi/mpibaij.c:1987 >>> [3]PETSC ERROR: #2 MatCreateSubMatrix_MPIBAIJ() at >>> /home/edo/software/petsc/src/mat/impls/baij/mpi/mpibaij.c:1911 >>> [3]PETSC ERROR: #3 MatCreateSubMatrix() at >>> /home/edo/software/petsc/src/mat/interface/matrix.c:8340 >>> [3]PETSC ERROR: #4 PCSetUp_FieldSplit() at >>> /home/edo/software/petsc/src/ksp/pc/impls/fieldsplit/fieldsplit.c:657 >>> [3]PETSC ERROR: #5 PCSetUp() at >>> /home/edo/software/petsc/src/ksp/pc/interface/precon.c:994 >>> [3]PETSC ERROR: #6 KSPSetUp() at >>> /home/edo/software/petsc/src/ksp/ksp/interface/itfunc.c:406 >>> [3]PETSC ERROR: #7 KSPSolve_Private() at >>> /home/edo/software/petsc/src/ksp/ksp/interface/itfunc.c:825 >>> [3]PETSC ERROR: #8 KSPSolve() at >>> /home/edo/software/petsc/src/ksp/ksp/interface/itfunc.c:1071 
>>> >>> Do you have any ideas? Probably something missing in my brief >>> implementation here: >>> >>> >>> >>> >>> * call PCSetType(mypc, PCFIELDSPLIT, ierr) call >>> PCFieldSplitSetBlockSize(mypc, 4-bdim, ierr) * >>> >>> >>> >>> >>> >>> >>> >>> >>> >>> >>> >>> >>> >>> >>> >>> >>> >>> >>> >>> >>> >>> >>> * !2D, 3x3 block if(bdim==1) then >>> ufields(1) = 0 ufields(2) = 1 pfields(1) = 2 >>> call PCFieldSplitSetFields(mypc, "u", 2, ufields, ufields, >>> ierr) call PCFieldSplitSetFields(mypc, "p", 1, pfields, >>> pfields, ierr) ! 3D 4x4 block else >>> ufields(1) = 0 ufields(2) = 1 ufields(3) = 2 >>> pfields(1) = 3 call >>> PCFieldSplitSetFields(mypc, "u", 3, ufields, ufields, ierr) >>> call PCFieldSplitSetFields(mypc, "p", 1, pfields, pfields, ierr) >>> endif ! Field split type ADDITIVE, MULTIPLICATIVE >>> (default), SYMMETRIC_MULTIPLICATIVE, SPECIAL, SCHUR call >>> PCFieldSplitSetType(mypc, PC_COMPOSITE_SCHUR, ierr)* >>> >>> Thanks for the help! >>> >> >> >> -- >> What most experimenters take for granted before they begin their >> experiments is infinitely more interesting than any results to which their >> experiments lead. >> -- Norbert Wiener >> >> https://www.cse.buffalo.edu/~knepley/ >> >> > -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Wed Nov 9 07:07:14 2022 From: knepley at gmail.com (Matthew Knepley) Date: Wed, 9 Nov 2022 08:07:14 -0500 Subject: [petsc-users] On PCFIELDSPLIT and its implementation In-Reply-To: References: Message-ID: On Wed, Nov 9, 2022 at 7:57 AM Edoardo alinovi wrote: > To be clear, > > You are suggesting to use ufields(0)=0, ufields(1)=1 and so on? > I think you are right. Those should start from 1. However, your ISes do not seem to cover the whole matrix. Can you start with a very small problem so that you can print them to the screen? Thanks, Matt > Il Mer 9 Nov 2022, 13:54 Edoardo alinovi ha > scritto: > >> Even in the fortran interface? 
>> >> Il Mer 9 Nov 2022, 13:52 Matthew Knepley ha scritto: >> >>> Fields are numbered from 0. >>> >>> Thanks, >>> >>> Matt >>> >>> On Wed, Nov 9, 2022 at 2:20 AM Edoardo alinovi < >>> edoardo.alinovi at gmail.com> wrote: >>> >>>> Hello guys, >>>> >>>> I am getting this error while using fieldsplit: >>>> >>>> [3]PETSC ERROR: --------------------- Error Message >>>> -------------------------------------------------------------- >>>> >>>> *[3]PETSC ERROR: Nonconforming object sizes[3]PETSC ERROR: Local column >>>> sizes 6132 do not add up to total number of columns 9200* >>>> [3]PETSC ERROR: See https://petsc.org/release/faq/ for trouble >>>> shooting. >>>> [3]PETSC ERROR: Petsc Development GIT revision: >>>> v3.18.1-191-g32ed6ae2ff2 GIT Date: 2022-11-08 12:22:17 -0500 >>>> [3]PETSC ERROR: flubio_coupled on a gnu named alienware by edo Wed Nov >>>> 9 08:16:29 2022 >>>> [3]PETSC ERROR: Configure options PETSC_ARCH=gnu FOPTFLAGS=-O3 >>>> COPTFLAGS=-O3 CXXOPTFLAGS=-O3 -with-debugging=no -download-fblaslapack=1 >>>> -download-superlu_dist -download-mumps -download-hypre -download-metis >>>> -download-parmetis -download-scalapack -download-ml -download-slepc >>>> -download-hpddm -download-cmake >>>> -with-mpi-dir=/home/edo/software/openmpi-4.1.1/build/ >>>> [3]PETSC ERROR: #1 MatCreateSubMatrix_MPIBAIJ_Private() at >>>> /home/edo/software/petsc/src/mat/impls/baij/mpi/mpibaij.c:1987 >>>> [3]PETSC ERROR: #2 MatCreateSubMatrix_MPIBAIJ() at >>>> /home/edo/software/petsc/src/mat/impls/baij/mpi/mpibaij.c:1911 >>>> [3]PETSC ERROR: #3 MatCreateSubMatrix() at >>>> /home/edo/software/petsc/src/mat/interface/matrix.c:8340 >>>> [3]PETSC ERROR: #4 PCSetUp_FieldSplit() at >>>> /home/edo/software/petsc/src/ksp/pc/impls/fieldsplit/fieldsplit.c:657 >>>> [3]PETSC ERROR: #5 PCSetUp() at >>>> /home/edo/software/petsc/src/ksp/pc/interface/precon.c:994 >>>> [3]PETSC ERROR: #6 KSPSetUp() at >>>> /home/edo/software/petsc/src/ksp/ksp/interface/itfunc.c:406 >>>> [3]PETSC ERROR: #7 
KSPSolve_Private() at >>>> /home/edo/software/petsc/src/ksp/ksp/interface/itfunc.c:825 >>>> [3]PETSC ERROR: #8 KSPSolve() at >>>> /home/edo/software/petsc/src/ksp/ksp/interface/itfunc.c:1071 >>>> >>>> Do you have any ideas? Probably something missing in my brief >>>> implementation here: >>>> >>>> >>>> >>>> >>>> * call PCSetType(mypc, PCFIELDSPLIT, ierr) call >>>> PCFieldSplitSetBlockSize(mypc, 4-bdim, ierr) * >>>> >>>> >>>> >>>> >>>> >>>> >>>> >>>> >>>> >>>> >>>> >>>> >>>> >>>> >>>> >>>> >>>> >>>> >>>> >>>> >>>> >>>> >>>> * !2D, 3x3 block if(bdim==1) then >>>> ufields(1) = 0 ufields(2) = 1 pfields(1) = 2 >>>> call PCFieldSplitSetFields(mypc, "u", 2, ufields, ufields, >>>> ierr) call PCFieldSplitSetFields(mypc, "p", 1, pfields, >>>> pfields, ierr) ! 3D 4x4 block else >>>> ufields(1) = 0 ufields(2) = 1 ufields(3) = 2 >>>> pfields(1) = 3 call >>>> PCFieldSplitSetFields(mypc, "u", 3, ufields, ufields, ierr) >>>> call PCFieldSplitSetFields(mypc, "p", 1, pfields, pfields, ierr) >>>> endif ! Field split type ADDITIVE, MULTIPLICATIVE >>>> (default), SYMMETRIC_MULTIPLICATIVE, SPECIAL, SCHUR call >>>> PCFieldSplitSetType(mypc, PC_COMPOSITE_SCHUR, ierr)* >>>> >>>> Thanks for the help! >>>> >>> >>> >>> -- >>> What most experimenters take for granted before they begin their >>> experiments is infinitely more interesting than any results to which their >>> experiments lead. >>> -- Norbert Wiener >>> >>> https://www.cse.buffalo.edu/~knepley/ >>> >>> >> -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From edoardo.alinovi at gmail.com Wed Nov 9 07:09:15 2022 From: edoardo.alinovi at gmail.com (Edoardo alinovi) Date: Wed, 9 Nov 2022 14:09:15 +0100 Subject: [petsc-users] On PCFIELDSPLIT and its implementation In-Reply-To: References: Message-ID: Sure, I'll try on a 3x3 cavity. How can I print the ISs? Il Mer 9 Nov 2022, 14:07 Matthew Knepley ha scritto: > On Wed, Nov 9, 2022 at 7:57 AM Edoardo alinovi > wrote: > >> To be clear, >> >> You are suggesting to use ufields(0)=0, ufields(1)=1 and so on? >> > > I think you are right. Those should start from 1. However, your ISes do > not seem to cover > the whole matrix. Can you start with a very small problem so that you can > print them to > the screen? > > Thanks, > > Matt > > >> Il Mer 9 Nov 2022, 13:54 Edoardo alinovi ha >> scritto: >> >>> Even in the fortran interface? >>> >>> Il Mer 9 Nov 2022, 13:52 Matthew Knepley ha scritto: >>> >>>> Fields are numbered from 0. >>>> >>>> Thanks, >>>> >>>> Matt >>>> >>>> On Wed, Nov 9, 2022 at 2:20 AM Edoardo alinovi < >>>> edoardo.alinovi at gmail.com> wrote: >>>> >>>>> Hello guys, >>>>> >>>>> I am getting this error while using fieldsplit: >>>>> >>>>> [3]PETSC ERROR: --------------------- Error Message >>>>> -------------------------------------------------------------- >>>>> >>>>> *[3]PETSC ERROR: Nonconforming object sizes[3]PETSC ERROR: Local >>>>> column sizes 6132 do not add up to total number of columns 9200* >>>>> [3]PETSC ERROR: See https://petsc.org/release/faq/ for trouble >>>>> shooting. 
>>>>> [3]PETSC ERROR: Petsc Development GIT revision: >>>>> v3.18.1-191-g32ed6ae2ff2 GIT Date: 2022-11-08 12:22:17 -0500 >>>>> [3]PETSC ERROR: flubio_coupled on a gnu named alienware by edo Wed Nov >>>>> 9 08:16:29 2022 >>>>> [3]PETSC ERROR: Configure options PETSC_ARCH=gnu FOPTFLAGS=-O3 >>>>> COPTFLAGS=-O3 CXXOPTFLAGS=-O3 -with-debugging=no -download-fblaslapack=1 >>>>> -download-superlu_dist -download-mumps -download-hypre -download-metis >>>>> -download-parmetis -download-scalapack -download-ml -download-slepc >>>>> -download-hpddm -download-cmake >>>>> -with-mpi-dir=/home/edo/software/openmpi-4.1.1/build/ >>>>> [3]PETSC ERROR: #1 MatCreateSubMatrix_MPIBAIJ_Private() at >>>>> /home/edo/software/petsc/src/mat/impls/baij/mpi/mpibaij.c:1987 >>>>> [3]PETSC ERROR: #2 MatCreateSubMatrix_MPIBAIJ() at >>>>> /home/edo/software/petsc/src/mat/impls/baij/mpi/mpibaij.c:1911 >>>>> [3]PETSC ERROR: #3 MatCreateSubMatrix() at >>>>> /home/edo/software/petsc/src/mat/interface/matrix.c:8340 >>>>> [3]PETSC ERROR: #4 PCSetUp_FieldSplit() at >>>>> /home/edo/software/petsc/src/ksp/pc/impls/fieldsplit/fieldsplit.c:657 >>>>> [3]PETSC ERROR: #5 PCSetUp() at >>>>> /home/edo/software/petsc/src/ksp/pc/interface/precon.c:994 >>>>> [3]PETSC ERROR: #6 KSPSetUp() at >>>>> /home/edo/software/petsc/src/ksp/ksp/interface/itfunc.c:406 >>>>> [3]PETSC ERROR: #7 KSPSolve_Private() at >>>>> /home/edo/software/petsc/src/ksp/ksp/interface/itfunc.c:825 >>>>> [3]PETSC ERROR: #8 KSPSolve() at >>>>> /home/edo/software/petsc/src/ksp/ksp/interface/itfunc.c:1071 >>>>> >>>>> Do you have any ideas? 
Probably something missing in my brief >>>>> implementation here: >>>>> >>>>> >>>>> >>>>> >>>>> * call PCSetType(mypc, PCFIELDSPLIT, ierr) >>>>> call PCFieldSplitSetBlockSize(mypc, 4-bdim, ierr) * >>>>> >>>>> >>>>> >>>>> >>>>> >>>>> >>>>> >>>>> >>>>> >>>>> >>>>> >>>>> >>>>> >>>>> >>>>> >>>>> >>>>> >>>>> >>>>> >>>>> >>>>> >>>>> >>>>> * !2D, 3x3 block if(bdim==1) then >>>>> ufields(1) = 0 ufields(2) = 1 pfields(1) = >>>>> 2 call PCFieldSplitSetFields(mypc, "u", 2, ufields, ufields, >>>>> ierr) call PCFieldSplitSetFields(mypc, "p", 1, pfields, >>>>> pfields, ierr) ! 3D 4x4 block else >>>>> ufields(1) = 0 ufields(2) = 1 ufields(3) = 2 >>>>> pfields(1) = 3 call >>>>> PCFieldSplitSetFields(mypc, "u", 3, ufields, ufields, ierr) >>>>> call PCFieldSplitSetFields(mypc, "p", 1, pfields, pfields, ierr) >>>>> endif ! Field split type ADDITIVE, MULTIPLICATIVE >>>>> (default), SYMMETRIC_MULTIPLICATIVE, SPECIAL, SCHUR call >>>>> PCFieldSplitSetType(mypc, PC_COMPOSITE_SCHUR, ierr)* >>>>> >>>>> Thanks for the help! >>>>> >>>> >>>> >>>> -- >>>> What most experimenters take for granted before they begin their >>>> experiments is infinitely more interesting than any results to which their >>>> experiments lead. >>>> -- Norbert Wiener >>>> >>>> https://www.cse.buffalo.edu/~knepley/ >>>> >>>> >>> > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > > https://www.cse.buffalo.edu/~knepley/ > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Wed Nov 9 07:12:22 2022 From: knepley at gmail.com (Matthew Knepley) Date: Wed, 9 Nov 2022 08:12:22 -0500 Subject: [petsc-users] On PCFIELDSPLIT and its implementation In-Reply-To: References: Message-ID: On Wed, Nov 9, 2022 at 8:09 AM Edoardo alinovi wrote: > Sure, > > I'll try on a 3x3 cavity. How can I print the ISs? 
> ISView() or PetscObjectViewFromOptions() Thanks, Matt > Il Mer 9 Nov 2022, 14:07 Matthew Knepley ha scritto: > >> On Wed, Nov 9, 2022 at 7:57 AM Edoardo alinovi >> wrote: >> >>> To be clear, >>> >>> You are suggesting to use ufields(0)=0, ufields(1)=1 and so on? >>> >> >> I think you are right. Those should start from 1. However, your ISes do >> not seem to cover >> the whole matrix. Can you start with a very small problem so that you can >> print them to >> the screen? >> >> Thanks, >> >> Matt >> >> >>> Il Mer 9 Nov 2022, 13:54 Edoardo alinovi ha >>> scritto: >>> >>>> Even in the fortran interface? >>>> >>>> Il Mer 9 Nov 2022, 13:52 Matthew Knepley ha >>>> scritto: >>>> >>>>> Fields are numbered from 0. >>>>> >>>>> Thanks, >>>>> >>>>> Matt >>>>> >>>>> On Wed, Nov 9, 2022 at 2:20 AM Edoardo alinovi < >>>>> edoardo.alinovi at gmail.com> wrote: >>>>> >>>>>> Hello guys, >>>>>> >>>>>> I am getting this error while using fieldsplit: >>>>>> >>>>>> [3]PETSC ERROR: --------------------- Error Message >>>>>> -------------------------------------------------------------- >>>>>> >>>>>> *[3]PETSC ERROR: Nonconforming object sizes[3]PETSC ERROR: Local >>>>>> column sizes 6132 do not add up to total number of columns 9200* >>>>>> [3]PETSC ERROR: See https://petsc.org/release/faq/ for trouble >>>>>> shooting. 
>>>>>> [3]PETSC ERROR: Petsc Development GIT revision: >>>>>> v3.18.1-191-g32ed6ae2ff2 GIT Date: 2022-11-08 12:22:17 -0500 >>>>>> [3]PETSC ERROR: flubio_coupled on a gnu named alienware by edo Wed >>>>>> Nov 9 08:16:29 2022 >>>>>> [3]PETSC ERROR: Configure options PETSC_ARCH=gnu FOPTFLAGS=-O3 >>>>>> COPTFLAGS=-O3 CXXOPTFLAGS=-O3 -with-debugging=no -download-fblaslapack=1 >>>>>> -download-superlu_dist -download-mumps -download-hypre -download-metis >>>>>> -download-parmetis -download-scalapack -download-ml -download-slepc >>>>>> -download-hpddm -download-cmake >>>>>> -with-mpi-dir=/home/edo/software/openmpi-4.1.1/build/ >>>>>> [3]PETSC ERROR: #1 MatCreateSubMatrix_MPIBAIJ_Private() at >>>>>> /home/edo/software/petsc/src/mat/impls/baij/mpi/mpibaij.c:1987 >>>>>> [3]PETSC ERROR: #2 MatCreateSubMatrix_MPIBAIJ() at >>>>>> /home/edo/software/petsc/src/mat/impls/baij/mpi/mpibaij.c:1911 >>>>>> [3]PETSC ERROR: #3 MatCreateSubMatrix() at >>>>>> /home/edo/software/petsc/src/mat/interface/matrix.c:8340 >>>>>> [3]PETSC ERROR: #4 PCSetUp_FieldSplit() at >>>>>> /home/edo/software/petsc/src/ksp/pc/impls/fieldsplit/fieldsplit.c:657 >>>>>> [3]PETSC ERROR: #5 PCSetUp() at >>>>>> /home/edo/software/petsc/src/ksp/pc/interface/precon.c:994 >>>>>> [3]PETSC ERROR: #6 KSPSetUp() at >>>>>> /home/edo/software/petsc/src/ksp/ksp/interface/itfunc.c:406 >>>>>> [3]PETSC ERROR: #7 KSPSolve_Private() at >>>>>> /home/edo/software/petsc/src/ksp/ksp/interface/itfunc.c:825 >>>>>> [3]PETSC ERROR: #8 KSPSolve() at >>>>>> /home/edo/software/petsc/src/ksp/ksp/interface/itfunc.c:1071 >>>>>> >>>>>> Do you have any ideas? 
Probably something missing in my brief >>>>>> implementation here: >>>>>> >>>>>> >>>>>> >>>>>> >>>>>> * call PCSetType(mypc, PCFIELDSPLIT, ierr) >>>>>> call PCFieldSplitSetBlockSize(mypc, 4-bdim, ierr) * >>>>>> >>>>>> >>>>>> >>>>>> >>>>>> >>>>>> >>>>>> >>>>>> >>>>>> >>>>>> >>>>>> >>>>>> >>>>>> >>>>>> >>>>>> >>>>>> >>>>>> >>>>>> >>>>>> >>>>>> >>>>>> >>>>>> >>>>>> * !2D, 3x3 block if(bdim==1) then >>>>>> ufields(1) = 0 ufields(2) = 1 pfields(1) = >>>>>> 2 call PCFieldSplitSetFields(mypc, "u", 2, ufields, ufields, >>>>>> ierr) call PCFieldSplitSetFields(mypc, "p", 1, pfields, >>>>>> pfields, ierr) ! 3D 4x4 block else >>>>>> ufields(1) = 0 ufields(2) = 1 ufields(3) = 2 >>>>>> pfields(1) = 3 call >>>>>> PCFieldSplitSetFields(mypc, "u", 3, ufields, ufields, ierr) >>>>>> call PCFieldSplitSetFields(mypc, "p", 1, pfields, pfields, ierr) >>>>>> endif ! Field split type ADDITIVE, MULTIPLICATIVE >>>>>> (default), SYMMETRIC_MULTIPLICATIVE, SPECIAL, SCHUR call >>>>>> PCFieldSplitSetType(mypc, PC_COMPOSITE_SCHUR, ierr)* >>>>>> >>>>>> Thanks for the help! >>>>>> >>>>> >>>>> >>>>> -- >>>>> What most experimenters take for granted before they begin their >>>>> experiments is infinitely more interesting than any results to which their >>>>> experiments lead. >>>>> -- Norbert Wiener >>>>> >>>>> https://www.cse.buffalo.edu/~knepley/ >>>>> >>>>> >>>> >> >> -- >> What most experimenters take for granted before they begin their >> experiments is infinitely more interesting than any results to which their >> experiments lead. >> -- Norbert Wiener >> >> https://www.cse.buffalo.edu/~knepley/ >> >> > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From edoardo.alinovi at gmail.com Wed Nov 9 08:04:51 2022 From: edoardo.alinovi at gmail.com (Edoardo alinovi) Date: Wed, 9 Nov 2022 15:04:51 +0100 Subject: [petsc-users] On PCFIELDSPLIT and its implementation In-Reply-To: References: Message-ID: So my cavity has 3x3=9 cells, each cell has a 3x3 block. I get the same error: [0]PETSC ERROR: Local column sizes 6 do not add up to total number of columns 9 However I do not define any IS, I just pass an array to PCFieldSplitSetFields() and thus I do not know how to plot them...
From knepley at gmail.com Wed Nov 9 08:13:48 2022 From: knepley at gmail.com (Matthew Knepley) Date: Wed, 9 Nov 2022 09:13:48 -0500 Subject: [petsc-users] On PCFIELDSPLIT and its implementation In-Reply-To: References: Message-ID: On Wed, Nov 9, 2022 at 9:05 AM Edoardo alinovi wrote: > So my cavity has 3x3=9 cells, each cell has a 3x3 block. I get the same > error: [0]PETSC ERROR: Local column sizes 6 do not add up to total number > of columns 9 > > However I do not define any IS, I just pass an array > to PCFieldSplitSetFields() and thus I do not know how to plot them... > Okay, just send in the small example and I will figure out why you are getting this. Thanks, Matt -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/
From edoardo.alinovi at gmail.com Wed Nov 9 08:16:53 2022 From: edoardo.alinovi at gmail.com (Edoardo alinovi) Date: Wed, 9 Nov 2022 15:16:53 +0100 Subject: [petsc-users] On PCFIELDSPLIT and its implementation In-Reply-To: References: Message-ID: Thanks, the stuff I am doing is within my code, so I am not sure you can reproduce it.
I am just doing this: call PCSetType(mypc, PCFIELDSPLIT, ierr) call PCFieldSplitSetBlockSize(mypc, 4-bdim, ierr) ufields(1) = 0 ufields(2) = 1 pfields(1) = 2 call PCFieldSplitSetFields(mypc, "u", 2, ufields, ufields, ierr) call PCFieldSplitSetFields(mypc, "p", 1, pfields, pfields, ierr) On an MPIBAIJ matrix with bs = 3. -------------- next part -------------- An HTML attachment was scrubbed... URL: From edoardo.alinovi at gmail.com Wed Nov 9 08:19:36 2022 From: edoardo.alinovi at gmail.com (Edoardo alinovi) Date: Wed, 9 Nov 2022 15:19:36 +0100 Subject: [petsc-users] On PCFIELDSPLIT and its implementation In-Reply-To: References: Message-ID: I am copying this example: https://petsc.org/release/src/ksp/ksp/tutorials/ex42.c.html lines 2040'2042 -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Wed Nov 9 08:19:45 2022 From: knepley at gmail.com (Matthew Knepley) Date: Wed, 9 Nov 2022 09:19:45 -0500 Subject: [petsc-users] On PCFIELDSPLIT and its implementation In-Reply-To: References: Message-ID: On Wed, Nov 9, 2022 at 9:17 AM Edoardo alinovi wrote: > Thanks, > > the stuff I am doing is within my code, so I am not sure you can reproduce > it. > How about just making a small code that fills those nonzeros with 1s. We just want to figure out why your sparsity pattern is not working. We have lots of working PCFIELDSPLIT examples, but I don't think it would be useful to show you those. Thanks, Matt > I am just doing this: > > call PCSetType(mypc, PCFIELDSPLIT, ierr) > > call PCFieldSplitSetBlockSize(mypc, 4-bdim, ierr) > > ufields(1) = 0 > ufields(2) = 1 > pfields(1) = 2 > call PCFieldSplitSetFields(mypc, "u", 2, ufields, ufields, > ierr) > call PCFieldSplitSetFields(mypc, "p", 1, pfields, pfields, > ierr) > > On an MPIBAIJ matrix with bs = 3. > > > > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. 
-- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Wed Nov 9 08:20:27 2022 From: knepley at gmail.com (Matthew Knepley) Date: Wed, 9 Nov 2022 09:20:27 -0500 Subject: [petsc-users] On PCFIELDSPLIT and its implementation In-Reply-To: References: Message-ID: That would be fine. Thanks, Matt On Wed, Nov 9, 2022 at 9:19 AM Edoardo alinovi wrote: > I am copying this example: > https://petsc.org/release/src/ksp/ksp/tutorials/ex42.c.html > lines > 2040-2042 > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From edoardo.alinovi at gmail.com Wed Nov 9 08:24:35 2022 From: edoardo.alinovi at gmail.com (Edoardo alinovi) Date: Wed, 9 Nov 2022 15:24:35 +0100 Subject: [petsc-users] On PCFIELDSPLIT and its implementation In-Reply-To: References: Message-ID: The fact it is telling me 6 instead of 9, makes me think it is getting just the first split for "u" and not the second one for "p" that would lead to 9. -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Wed Nov 9 09:04:12 2022 From: knepley at gmail.com (Matthew Knepley) Date: Wed, 9 Nov 2022 10:04:12 -0500 Subject: [petsc-users] Reference element in DMPlexComputeCellGeometryAffineFEM In-Reply-To: References: Message-ID: On Tue, Nov 8, 2022 at 9:14 PM Blaise Bourdin wrote: > Hi, > > What reference simplex is DMPlexComputeCellGeometryAffineFEM using in 2 
> I am used to computing my shape functions on the unit simplex (vertices at > the origin and each e_i), but it does not look to be the reference simplex > in this function: > > In 3D, for the unit simplex with vertices at (0,0,0) (1,0,0) (0,1,0) > (0,0,1) (in this order), I get J = 1 / 2 . [[-1,-1,-1],[1,0,0],[0,0,1]] and > v0 = [0,0,1] > > In 2D, for the unit simplex with vertices at (0,0), (1,0), and (0,1), I > get J = 1 / 2. I and v0 = [0,0], which does not make any sense to me (I was > assuming that the 2D reference simplex had vertices at (-1,-1), (1, -1) and > (-1,1), but if this were the case, v0 would not be 0). > > I can build a simple example with meshes consisting only of the unit > simplex in 2D and 3D if that would help. > I need to rewrite the documentation on geometry, but I was waiting until I rewrite the geometry calculations to fit into libCEED. Toby found a nice way to express them in BLAS form which I need to push through everything. I always think of operating on the cell with the first vertex at the origin (I think it is easier), so I have a xi0 that translates the first vertex of the reference to the origin, and a v0 that translates the first vertex of the real cell to the origin. You can see this here https://gitlab.com/petsc/petsc/-/blob/main/include/petsc/private/petscfeimpl.h#L251 This explains the 2D result. I cannot understand your 3D result, unless the vertices are in another order. Thanks, Matt > Regards, > Blaise > > > > ? > Canada Research Chair in Mathematical and Computational Aspects of Solid > Mechanics (Tier 1) > Professor, Department of Mathematics & Statistics > Hamilton Hall room 409A, McMaster University > 1280 Main Street West, Hamilton, Ontario L8S 4K1, Canada > https://www.math.mcmaster.ca/bourdin | +1 (905) 525 9140 ext. 27243 > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. 
-- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From bourdin at mcmaster.ca Wed Nov 9 09:46:55 2022 From: bourdin at mcmaster.ca (Blaise Bourdin) Date: Wed, 9 Nov 2022 15:46:55 +0000 Subject: [petsc-users] Reference element in DMPlexComputeCellGeometryAffineFEM In-Reply-To: References: Message-ID: <1223C850-C305-4475-BBF0-F907C8739C1C@mcmaster.ca> An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: TestDMPlexComputeCellGeometryAffineFEM.c Type: application/octet-stream Size: 1600 bytes Desc: TestDMPlexComputeCellGeometryAffineFEM.c URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: 1Tet.gen Type: application/octet-stream Size: 12375 bytes Desc: 1Tet.gen URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: 1Tri.gen Type: application/octet-stream Size: 11851 bytes Desc: 1Tri.gen URL: From edoardo.alinovi at gmail.com Wed Nov 9 11:00:21 2022 From: edoardo.alinovi at gmail.com (Edoardo alinovi) Date: Wed, 9 Nov 2022 18:00:21 +0100 Subject: [petsc-users] On PCFIELDSPLIT and its implementation In-Reply-To: References: Message-ID: Hi Matt, it took a bit more than 1s, but I can reproduce the error in the attached file. To compile: *mpifort -L$PETSC_DIR/$PETSC_ARCH/lib -lpetsc -fdefault-real-8 -o test test.F90 -I$PETSC_DIR/include -I$PETSC_DIR/$PETSC_ARCH/include* Please run it in serial as I have hardcoded some dimensions to code this up faster. Thank you! -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... 
Name: test.F90 Type: application/octet-stream Size: 2773 bytes Desc: not available URL: From alexlindsay239 at gmail.com Wed Nov 9 12:45:38 2022 From: alexlindsay239 at gmail.com (Alexander Lindsay) Date: Wed, 9 Nov 2022 10:45:38 -0800 Subject: [petsc-users] Local columns of A10 do not equal local rows of A00 In-Reply-To: References: Message-ID: Ok, I've figured out that we are definitely messing something up in our split index set generation. For process 1 our Jac/PMat local size is 14307, but split 0 IS local size is 4129 and split 1 IS local size is 10170, so that leaves us 8 dofs short. I know now where I need to dig in our field decomposition. Thanks Matt for helping me process through this stuff! On Tue, Nov 8, 2022 at 4:53 PM Alexander Lindsay wrote: > This is from our DMCreateFieldDecomposition_Moose routine. The IS size on > process 1 (which is the process from which I took the error in the original > post) is reported as 4129 which is consistent with the row size of A00. > > Split '0' has local size 4129 on processor 1 > Split '0' has local size 4484 on processor 6 > Split '0' has local size 4471 on processor 12 > Split '0' has local size 4040 on processor 14 > Split '0' has local size 3594 on processor 20 > Split '0' has local size 4423 on processor 22 > Split '0' has local size 2791 on processor 27 > Split '0' has local size 3014 on processor 29 > Split '0' has local size 3183 on processor 30 > Split '0' has local size 3328 on processor 3 > Split '0' has local size 4689 on processor 4 > Split '0' has local size 8016 on processor 8 > Split '0' has local size 6367 on processor 10 > Split '0' has local size 5973 on processor 17 > Split '0' has local size 4431 on processor 18 > Split '0' has local size 7564 on processor 25 > Split '0' has local size 12504 on processor 9 > Split '0' has local size 10081 on processor 11 > Split '0' has local size 13808 on processor 24 > Split '0' has local size 14049 on processor 31 > Split '0' has local size 15324 on 
processor 7 > Split '0' has local size 15337 on processor 15 > Split '0' has local size 14849 on processor 19 > Split '0' has local size 15660 on processor 23 > Split '0' has local size 14728 on processor 26 > Split '0' has local size 15724 on processor 28 > Split '0' has local size 17249 on processor 5 > Split '0' has local size 15519 on processor 13 > Split '0' has local size 16511 on processor 16 > Split '0' has local size 16496 on processor 21 > Split '0' has local size 18291 on processor 2 > Split '0' has local size 18042 on processor 0 > > On Mon, Nov 7, 2022 at 6:04 PM Matthew Knepley wrote: > >> On Mon, Nov 7, 2022 at 5:48 PM Alexander Lindsay < >> alexlindsay239 at gmail.com> wrote: >> >>> My understanding looking at PCFieldSplitSetDefaults is that our >>> implementation of `createfielddecomposition` should get called, we'll set >>> `fields` and then (ignoring possible user setting of >>> -pc_fieldsplit_%D_fields flag) PCFieldSplitSetIS will get called with >>> whatever we did to `fields`. So yea I guess that just looking over that I >>> would assume we're not supplying two different index sets for rows and >>> columns, or put more precisely we (MOOSE) are not really afforded the >>> opportunity to. But my interpretation could very well be wrong. >>> >> >> Oh wait. I read the error message again. It does not say that the whole >> selection is rectangular. It says >> >> Local columns of A10 4137 do not equal local rows of A00 4129 >> >> So this is a parallel partitioning thing. Since A00 has 4129 local rows, >> it should have this many columns as well. >> However A10 has 4137 local columns. How big is IS_0, on each process, >> that you pass in to PCFIELDSPLIT? 
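[The parallel partitioning condition in play here, that on each process the split index sets together must account for every local row of the matrix, can be written as a one-line check. This is illustrative Python, not the PETSc API; the sizes are the ones reported in this thread.]

```python
# Illustrative consistency check (plain Python, not PETSc code): the local
# sizes of all split index sets must sum to the local row count of the
# Jacobian on every process. Any shortfall means some local dofs were not
# assigned to a split, which is what the A10/A00 size mismatch reflects.
def split_shortfall(local_matrix_rows, split_local_sizes):
    """Number of local dofs not covered by any split (0 means consistent)."""
    return local_matrix_rows - sum(split_local_sizes)

# Sizes reported in this thread: Jac/PMat local size 14307 on process 1,
# split 0 IS of local size 4129 and split 1 IS of local size 10170.
print(split_shortfall(14307, [4129, 10170]))  # 8 dofs short
```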
>> >> Thanks, >> >> Matt >> >> >>> On Mon, Nov 7, 2022 at 12:33 PM Matthew Knepley >>> wrote: >>> >>>> On Mon, Nov 7, 2022 at 2:09 PM Alexander Lindsay < >>>> alexlindsay239 at gmail.com> wrote: >>>> >>>>> The libMesh/MOOSE specific code that identifies dof indices for >>>>> ISCreateGeneral is in DMooseGetEmbedding_Private. I can share that function >>>>> (it's quite long) or more details if that could be helpful. >>>>> >>>> >>>> Sorry, I should have written more. The puzzling thing for me is that >>>> somehow it looks like the row and column index sets are not the same. I did >>>> not think >>>> PCFIELDSPLIT could do that. The PCFieldSplitSetIS() interface does not >>>> allow it. I was wondering how you were setting the ISes. >>>> >>>> Thanks, >>>> >>>> Matt >>>> >>>> >>>>> On Mon, Nov 7, 2022 at 10:55 AM Alexander Lindsay < >>>>> alexlindsay239 at gmail.com> wrote: >>>>> >>>>>> I'm not sure exactly what you mean, but I'll try to give more >>>>>> details. We have our own DM class (DM_Moose) and we set our own field and >>>>>> domain decomposition routines: >>>>>> >>>>>> dm->ops->createfielddecomposition = >>>>>> DMCreateFieldDecomposition_Moose; >>>>>> >>>>>> dm->ops->createdomaindecomposition = >>>>>> DMCreateDomainDecomposition_Moose; >>>>>> >>>>>> >>>>>> The field and domain decomposition routines are as follows (can see >>>>>> also at >>>>>> https://github.com/idaholab/moose/blob/next/framework/src/utils/PetscDMMoose.C >>>>>> ): >>>>>> >>>>>> static PetscErrorCode >>>>>> DMCreateFieldDecomposition_Moose( >>>>>> DM dm, PetscInt * len, char *** namelist, IS ** islist, DM ** >>>>>> dmlist) >>>>>> { >>>>>> PetscErrorCode ierr; >>>>>> DM_Moose * dmm = (DM_Moose *)(dm->data); >>>>>> >>>>>> PetscFunctionBegin; >>>>>> /* Only called after DMSetUp(). 
*/ >>>>>> if (!dmm->_splitlocs) >>>>>> PetscFunctionReturn(0); >>>>>> *len = dmm->_splitlocs->size(); >>>>>> if (namelist) >>>>>> { >>>>>> ierr = PetscMalloc(*len * sizeof(char *), namelist); >>>>>> CHKERRQ(ierr); >>>>>> } >>>>>> if (islist) >>>>>> { >>>>>> ierr = PetscMalloc(*len * sizeof(IS), islist); >>>>>> CHKERRQ(ierr); >>>>>> } >>>>>> if (dmlist) >>>>>> { >>>>>> ierr = PetscMalloc(*len * sizeof(DM), dmlist); >>>>>> CHKERRQ(ierr); >>>>>> } >>>>>> for (const auto & dit : *(dmm->_splitlocs)) >>>>>> { >>>>>> unsigned int d = dit.second; >>>>>> std::string dname = dit.first; >>>>>> DM_Moose::SplitInfo & dinfo = (*dmm->_splits)[dname]; >>>>>> if (!dinfo._dm) >>>>>> { >>>>>> ierr = DMCreateMoose(((PetscObject)dm)->comm, *dmm->_nl, >>>>>> &dinfo._dm); >>>>>> CHKERRQ(ierr); >>>>>> ierr = PetscObjectSetOptionsPrefix((PetscObject)dinfo._dm, >>>>>> ((PetscObject)dm)->prefix); >>>>>> CHKERRQ(ierr); >>>>>> std::string suffix = std::string("fieldsplit_") + dname + "_"; >>>>>> ierr = PetscObjectAppendOptionsPrefix((PetscObject)dinfo._dm, >>>>>> suffix.c_str()); >>>>>> CHKERRQ(ierr); >>>>>> } >>>>>> ierr = DMSetFromOptions(dinfo._dm); >>>>>> CHKERRQ(ierr); >>>>>> ierr = DMSetUp(dinfo._dm); >>>>>> CHKERRQ(ierr); >>>>>> if (namelist) >>>>>> { >>>>>> ierr = PetscStrallocpy(dname.c_str(), (*namelist) + d); >>>>>> CHKERRQ(ierr); >>>>>> } >>>>>> if (islist) >>>>>> { >>>>>> if (!dinfo._rembedding) >>>>>> { >>>>>> IS dembedding, lembedding; >>>>>> ierr = DMMooseGetEmbedding_Private(dinfo._dm, &dembedding); >>>>>> CHKERRQ(ierr); >>>>>> if (dmm->_embedding) >>>>>> { >>>>>> // Create a relative embedding into the parent's index >>>>>> space. 
>>>>>> ierr = ISEmbed(dembedding, dmm->_embedding, PETSC_TRUE, >>>>>> &lembedding); >>>>>> CHKERRQ(ierr); >>>>>> const PetscInt * lindices; >>>>>> PetscInt len, dlen, llen, *rindices, off, i; >>>>>> ierr = ISGetLocalSize(dembedding, &dlen); >>>>>> CHKERRQ(ierr); >>>>>> ierr = ISGetLocalSize(lembedding, &llen); >>>>>> CHKERRQ(ierr); >>>>>> if (llen != dlen) >>>>>> SETERRQ1(((PetscObject)dm)->comm, PETSC_ERR_PLIB, "Failed >>>>>> to embed split %D", d); >>>>>> ierr = ISDestroy(&dembedding); >>>>>> CHKERRQ(ierr); >>>>>> // Convert local embedding to global (but still relative) >>>>>> embedding >>>>>> ierr = PetscMalloc(llen * sizeof(PetscInt), &rindices); >>>>>> CHKERRQ(ierr); >>>>>> ierr = ISGetIndices(lembedding, &lindices); >>>>>> CHKERRQ(ierr); >>>>>> ierr = PetscMemcpy(rindices, lindices, llen * >>>>>> sizeof(PetscInt)); >>>>>> CHKERRQ(ierr); >>>>>> ierr = ISDestroy(&lembedding); >>>>>> CHKERRQ(ierr); >>>>>> // We could get the index offset from a corresponding >>>>>> global vector, but subDMs don't yet >>>>>> // have global vectors >>>>>> ierr = ISGetLocalSize(dmm->_embedding, &len); >>>>>> CHKERRQ(ierr); >>>>>> >>>>>> ierr = MPI_Scan(&len, >>>>>> &off, >>>>>> 1, >>>>>> #ifdef PETSC_USE_64BIT_INDICES >>>>>> MPI_LONG_LONG_INT, >>>>>> #else >>>>>> MPI_INT, >>>>>> #endif >>>>>> MPI_SUM, >>>>>> ((PetscObject)dm)->comm); >>>>>> CHKERRQ(ierr); >>>>>> >>>>>> off -= len; >>>>>> for (i = 0; i < llen; ++i) >>>>>> rindices[i] += off; >>>>>> ierr = ISCreateGeneral( >>>>>> ((PetscObject)dm)->comm, llen, rindices, >>>>>> PETSC_OWN_POINTER, &(dinfo._rembedding)); >>>>>> CHKERRQ(ierr); >>>>>> } >>>>>> else >>>>>> { >>>>>> dinfo._rembedding = dembedding; >>>>>> } >>>>>> } >>>>>> ierr = PetscObjectReference((PetscObject)(dinfo._rembedding)); >>>>>> CHKERRQ(ierr); >>>>>> (*islist)[d] = dinfo._rembedding; >>>>>> } >>>>>> if (dmlist) >>>>>> { >>>>>> ierr = PetscObjectReference((PetscObject)dinfo._dm); >>>>>> CHKERRQ(ierr); >>>>>> (*dmlist)[d] = dinfo._dm; >>>>>> } >>>>>> } >>>>>> 
PetscFunctionReturn(0); >>>>>> } >>>>>> >>>>>> static PetscErrorCode >>>>>> DMCreateDomainDecomposition_Moose( >>>>>> DM dm, PetscInt * len, char *** namelist, IS ** innerislist, IS >>>>>> ** outerislist, DM ** dmlist) >>>>>> { >>>>>> PetscErrorCode ierr; >>>>>> >>>>>> PetscFunctionBegin; >>>>>> /* Use DMCreateFieldDecomposition_Moose() to obtain everything but >>>>>> outerislist, which is currently >>>>>> * PETSC_NULL. */ >>>>>> if (outerislist) >>>>>> *outerislist = PETSC_NULL; /* FIX: allow mesh-based overlap. */ >>>>>> ierr = DMCreateFieldDecomposition_Moose(dm, len, namelist, >>>>>> innerislist, dmlist); >>>>>> CHKERRQ(ierr); >>>>>> PetscFunctionReturn(0); >>>>>> } >>>>>> >>>>>> >>>>>> >>>>>> On Thu, Nov 3, 2022 at 5:19 PM Matthew Knepley >>>>>> wrote: >>>>>> >>>>>>> On Thu, Nov 3, 2022 at 7:52 PM Alexander Lindsay < >>>>>>> alexlindsay239 at gmail.com> wrote: >>>>>>> >>>>>>>> I have errors on quite a few (but not all) processes of the like >>>>>>>> >>>>>>>> [1]PETSC ERROR: --------------------- Error Message >>>>>>>> -------------------------------------------------------------- >>>>>>>> [1]PETSC ERROR: Nonconforming object sizes >>>>>>>> [1]PETSC ERROR: Local columns of A10 4137 do not equal local rows >>>>>>>> of A00 4129 >>>>>>>> >>>>>>>> when performing field splits. We (MOOSE) have some code for >>>>>>>> identifying the index sets for each split. However, the code was written by >>>>>>>> some authors who are no longer with us. Normally I would chase this down in >>>>>>>> a debugger, but this error only seems to crop up for pretty complex and >>>>>>>> large meshes. If anyone has an idea for what we might be doing wrong, that >>>>>>>> might help me chase this down faster. I guess intuitively I'm pretty >>>>>>>> perplexed that we could get ourselves into this pickle as it almost appears >>>>>>>> that we have two different local dof index counts for a given block (0 in >>>>>>>> this case). 
More background, if helpful, can be found in >>>>>>>> https://github.com/idaholab/moose/issues/22359 as well as >>>>>>>> https://github.com/idaholab/moose/discussions/22468. >>>>>>>> >>>>>>> >>>>>>> How are you specifying the blocks? I would not have thought this was >>>>>>> possible. >>>>>>> >>>>>>> Thanks, >>>>>>> >>>>>>> Matt >>>>>>> >>>>>>> >>>>>>>> I should note that we are currently running with 3.16.6 as our >>>>>>>> PETSc submodule hash (we are talking about updating to 3.18 soon). >>>>>>>> >>>>>>> >>>>>>> >>>>>>> -- >>>>>>> What most experimenters take for granted before they begin their >>>>>>> experiments is infinitely more interesting than any results to which their >>>>>>> experiments lead. >>>>>>> -- Norbert Wiener >>>>>>> >>>>>>> https://www.cse.buffalo.edu/~knepley/ >>>>>>> >>>>>>> >>>>>> >>>> >>>> -- >>>> What most experimenters take for granted before they begin their >>>> experiments is infinitely more interesting than any results to which their >>>> experiments lead. >>>> -- Norbert Wiener >>>> >>>> https://www.cse.buffalo.edu/~knepley/ >>>> >>>> >>> >> >> -- >> What most experimenters take for granted before they begin their >> experiments is infinitely more interesting than any results to which their >> experiments lead. >> -- Norbert Wiener >> >> https://www.cse.buffalo.edu/~knepley/ >> >> > -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Wed Nov 9 14:45:05 2022 From: knepley at gmail.com (Matthew Knepley) Date: Wed, 9 Nov 2022 15:45:05 -0500 Subject: [petsc-users] Local columns of A10 do not equal local rows of A00 In-Reply-To: References: Message-ID: On Wed, Nov 9, 2022 at 1:45 PM Alexander Lindsay wrote: > Ok, I've figured out that we are definitely messing something up in our > split index set generation. For process 1 our Jac/PMat local size is 14307, > but split 0 IS local size is 4129 and split 1 IS local size is 10170, so > that leaves us 8 dofs short. 
> > I know now where I need to dig in our field decomposition. Thanks Matt for > helping me process through this stuff! > Cool. The one piece of advice I have is to make the problem ruthlessly small. Even if it seems hard, it is worth the time to get it down to the size you can print to the screen. Thanks, Matt > On Tue, Nov 8, 2022 at 4:53 PM Alexander Lindsay > wrote: > >> This is from our DMCreateFieldDecomposition_Moose routine. The IS size on >> process 1 (which is the process from which I took the error in the original >> post) is reported as 4129 which is consistent with the row size of A00. >> >> Split '0' has local size 4129 on processor 1 >> Split '0' has local size 4484 on processor 6 >> Split '0' has local size 4471 on processor 12 >> Split '0' has local size 4040 on processor 14 >> Split '0' has local size 3594 on processor 20 >> Split '0' has local size 4423 on processor 22 >> Split '0' has local size 2791 on processor 27 >> Split '0' has local size 3014 on processor 29 >> Split '0' has local size 3183 on processor 30 >> Split '0' has local size 3328 on processor 3 >> Split '0' has local size 4689 on processor 4 >> Split '0' has local size 8016 on processor 8 >> Split '0' has local size 6367 on processor 10 >> Split '0' has local size 5973 on processor 17 >> Split '0' has local size 4431 on processor 18 >> Split '0' has local size 7564 on processor 25 >> Split '0' has local size 12504 on processor 9 >> Split '0' has local size 10081 on processor 11 >> Split '0' has local size 13808 on processor 24 >> Split '0' has local size 14049 on processor 31 >> Split '0' has local size 15324 on processor 7 >> Split '0' has local size 15337 on processor 15 >> Split '0' has local size 14849 on processor 19 >> Split '0' has local size 15660 on processor 23 >> Split '0' has local size 14728 on processor 26 >> Split '0' has local size 15724 on processor 28 >> Split '0' has local size 17249 on processor 5 >> Split '0' has local size 15519 on processor 13 >> Split '0' 
has local size 16511 on processor 16 >> Split '0' has local size 16496 on processor 21 >> Split '0' has local size 18291 on processor 2 >> Split '0' has local size 18042 on processor 0 >> >> On Mon, Nov 7, 2022 at 6:04 PM Matthew Knepley wrote: >> >>> On Mon, Nov 7, 2022 at 5:48 PM Alexander Lindsay < >>> alexlindsay239 at gmail.com> wrote: >>> >>>> My understanding looking at PCFieldSplitSetDefaults is that our >>>> implementation of `createfielddecomposition` should get called, we'll set >>>> `fields` and then (ignoring possible user setting of >>>> -pc_fieldsplit_%D_fields flag) PCFieldSplitSetIS will get called with >>>> whatever we did to `fields`. So yea I guess that just looking over that I >>>> would assume we're not supplying two different index sets for rows and >>>> columns, or put more precisely we (MOOSE) are not really afforded the >>>> opportunity to. But my interpretation could very well be wrong. >>>> >>> >>> Oh wait. I read the error message again. It does not say that the whole >>> selection is rectangular. It says >>> >>> Local columns of A10 4137 do not equal local rows of A00 4129 >>> >>> So this is a parallel partitioning thing. Since A00 has 4129 local rows, >>> it should have this many columns as well. >>> However A10 has 4137 local columns. How big is IS_0, on each process, >>> that you pass in to PCFIELDSPLIT? >>> >>> Thanks, >>> >>> Matt >>> >>> >>>> On Mon, Nov 7, 2022 at 12:33 PM Matthew Knepley >>>> wrote: >>>> >>>>> On Mon, Nov 7, 2022 at 2:09 PM Alexander Lindsay < >>>>> alexlindsay239 at gmail.com> wrote: >>>>> >>>>>> The libMesh/MOOSE specific code that identifies dof indices for >>>>>> ISCreateGeneral is in DMooseGetEmbedding_Private. I can share that function >>>>>> (it's quite long) or more details if that could be helpful. >>>>>> >>>>> >>>>> Sorry, I should have written more. The puzzling thing for me is that >>>>> somehow it looks like the row and column index sets are not the same. 
I did >>>>> not think >>>>> PCFIELDSPLIT could do that. The PCFieldSplitSetIS() interface does not >>>>> allow it. I was wondering how you were setting the ISes. >>>>> >>>>> Thanks, >>>>> >>>>> Matt >>>>> >>>>> >>>>>> On Mon, Nov 7, 2022 at 10:55 AM Alexander Lindsay < >>>>>> alexlindsay239 at gmail.com> wrote: >>>>>> >>>>>>> I'm not sure exactly what you mean, but I'll try to give more >>>>>>> details. We have our own DM class (DM_Moose) and we set our own field and >>>>>>> domain decomposition routines: >>>>>>> >>>>>>> dm->ops->createfielddecomposition = >>>>>>> DMCreateFieldDecomposition_Moose; >>>>>>> >>>>>>> dm->ops->createdomaindecomposition = >>>>>>> DMCreateDomainDecomposition_Moose; >>>>>>> >>>>>>> >>>>>>> The field and domain decomposition routines are as follows (can see >>>>>>> also at >>>>>>> https://github.com/idaholab/moose/blob/next/framework/src/utils/PetscDMMoose.C >>>>>>> ): >>>>>>> >>>>>>> static PetscErrorCode >>>>>>> DMCreateFieldDecomposition_Moose( >>>>>>> DM dm, PetscInt * len, char *** namelist, IS ** islist, DM ** >>>>>>> dmlist) >>>>>>> { >>>>>>> PetscErrorCode ierr; >>>>>>> DM_Moose * dmm = (DM_Moose *)(dm->data); >>>>>>> >>>>>>> PetscFunctionBegin; >>>>>>> /* Only called after DMSetUp(). 
*/ >>>>>>> if (!dmm->_splitlocs) >>>>>>> PetscFunctionReturn(0); >>>>>>> *len = dmm->_splitlocs->size(); >>>>>>> if (namelist) >>>>>>> { >>>>>>> ierr = PetscMalloc(*len * sizeof(char *), namelist); >>>>>>> CHKERRQ(ierr); >>>>>>> } >>>>>>> if (islist) >>>>>>> { >>>>>>> ierr = PetscMalloc(*len * sizeof(IS), islist); >>>>>>> CHKERRQ(ierr); >>>>>>> } >>>>>>> if (dmlist) >>>>>>> { >>>>>>> ierr = PetscMalloc(*len * sizeof(DM), dmlist); >>>>>>> CHKERRQ(ierr); >>>>>>> } >>>>>>> for (const auto & dit : *(dmm->_splitlocs)) >>>>>>> { >>>>>>> unsigned int d = dit.second; >>>>>>> std::string dname = dit.first; >>>>>>> DM_Moose::SplitInfo & dinfo = (*dmm->_splits)[dname]; >>>>>>> if (!dinfo._dm) >>>>>>> { >>>>>>> ierr = DMCreateMoose(((PetscObject)dm)->comm, *dmm->_nl, >>>>>>> &dinfo._dm); >>>>>>> CHKERRQ(ierr); >>>>>>> ierr = PetscObjectSetOptionsPrefix((PetscObject)dinfo._dm, >>>>>>> ((PetscObject)dm)->prefix); >>>>>>> CHKERRQ(ierr); >>>>>>> std::string suffix = std::string("fieldsplit_") + dname + "_"; >>>>>>> ierr = PetscObjectAppendOptionsPrefix((PetscObject)dinfo._dm, >>>>>>> suffix.c_str()); >>>>>>> CHKERRQ(ierr); >>>>>>> } >>>>>>> ierr = DMSetFromOptions(dinfo._dm); >>>>>>> CHKERRQ(ierr); >>>>>>> ierr = DMSetUp(dinfo._dm); >>>>>>> CHKERRQ(ierr); >>>>>>> if (namelist) >>>>>>> { >>>>>>> ierr = PetscStrallocpy(dname.c_str(), (*namelist) + d); >>>>>>> CHKERRQ(ierr); >>>>>>> } >>>>>>> if (islist) >>>>>>> { >>>>>>> if (!dinfo._rembedding) >>>>>>> { >>>>>>> IS dembedding, lembedding; >>>>>>> ierr = DMMooseGetEmbedding_Private(dinfo._dm, &dembedding); >>>>>>> CHKERRQ(ierr); >>>>>>> if (dmm->_embedding) >>>>>>> { >>>>>>> // Create a relative embedding into the parent's index >>>>>>> space. 
>>>>>>> ierr = ISEmbed(dembedding, dmm->_embedding, PETSC_TRUE, >>>>>>> &lembedding); >>>>>>> CHKERRQ(ierr); >>>>>>> const PetscInt * lindices; >>>>>>> PetscInt len, dlen, llen, *rindices, off, i; >>>>>>> ierr = ISGetLocalSize(dembedding, &dlen); >>>>>>> CHKERRQ(ierr); >>>>>>> ierr = ISGetLocalSize(lembedding, &llen); >>>>>>> CHKERRQ(ierr); >>>>>>> if (llen != dlen) >>>>>>> SETERRQ1(((PetscObject)dm)->comm, PETSC_ERR_PLIB, >>>>>>> "Failed to embed split %D", d); >>>>>>> ierr = ISDestroy(&dembedding); >>>>>>> CHKERRQ(ierr); >>>>>>> // Convert local embedding to global (but still relative) >>>>>>> embedding >>>>>>> ierr = PetscMalloc(llen * sizeof(PetscInt), &rindices); >>>>>>> CHKERRQ(ierr); >>>>>>> ierr = ISGetIndices(lembedding, &lindices); >>>>>>> CHKERRQ(ierr); >>>>>>> ierr = PetscMemcpy(rindices, lindices, llen * >>>>>>> sizeof(PetscInt)); >>>>>>> CHKERRQ(ierr); >>>>>>> ierr = ISDestroy(&lembedding); >>>>>>> CHKERRQ(ierr); >>>>>>> // We could get the index offset from a corresponding >>>>>>> global vector, but subDMs don't yet >>>>>>> // have global vectors >>>>>>> ierr = ISGetLocalSize(dmm->_embedding, &len); >>>>>>> CHKERRQ(ierr); >>>>>>> >>>>>>> ierr = MPI_Scan(&len, >>>>>>> &off, >>>>>>> 1, >>>>>>> #ifdef PETSC_USE_64BIT_INDICES >>>>>>> MPI_LONG_LONG_INT, >>>>>>> #else >>>>>>> MPI_INT, >>>>>>> #endif >>>>>>> MPI_SUM, >>>>>>> ((PetscObject)dm)->comm); >>>>>>> CHKERRQ(ierr); >>>>>>> >>>>>>> off -= len; >>>>>>> for (i = 0; i < llen; ++i) >>>>>>> rindices[i] += off; >>>>>>> ierr = ISCreateGeneral( >>>>>>> ((PetscObject)dm)->comm, llen, rindices, >>>>>>> PETSC_OWN_POINTER, &(dinfo._rembedding)); >>>>>>> CHKERRQ(ierr); >>>>>>> } >>>>>>> else >>>>>>> { >>>>>>> dinfo._rembedding = dembedding; >>>>>>> } >>>>>>> } >>>>>>> ierr = PetscObjectReference((PetscObject)(dinfo._rembedding)); >>>>>>> CHKERRQ(ierr); >>>>>>> (*islist)[d] = dinfo._rembedding; >>>>>>> } >>>>>>> if (dmlist) >>>>>>> { >>>>>>> ierr = PetscObjectReference((PetscObject)dinfo._dm); >>>>>>> 
CHKERRQ(ierr); >>>>>>> (*dmlist)[d] = dinfo._dm; >>>>>>> } >>>>>>> } >>>>>>> PetscFunctionReturn(0); >>>>>>> } >>>>>>> >>>>>>> static PetscErrorCode >>>>>>> DMCreateDomainDecomposition_Moose( >>>>>>> DM dm, PetscInt * len, char *** namelist, IS ** innerislist, IS >>>>>>> ** outerislist, DM ** dmlist) >>>>>>> { >>>>>>> PetscErrorCode ierr; >>>>>>> >>>>>>> PetscFunctionBegin; >>>>>>> /* Use DMCreateFieldDecomposition_Moose() to obtain everything but >>>>>>> outerislist, which is currently >>>>>>> * PETSC_NULL. */ >>>>>>> if (outerislist) >>>>>>> *outerislist = PETSC_NULL; /* FIX: allow mesh-based overlap. */ >>>>>>> ierr = DMCreateFieldDecomposition_Moose(dm, len, namelist, >>>>>>> innerislist, dmlist); >>>>>>> CHKERRQ(ierr); >>>>>>> PetscFunctionReturn(0); >>>>>>> } >>>>>>> >>>>>>> >>>>>>> >>>>>>> On Thu, Nov 3, 2022 at 5:19 PM Matthew Knepley >>>>>>> wrote: >>>>>>> >>>>>>>> On Thu, Nov 3, 2022 at 7:52 PM Alexander Lindsay < >>>>>>>> alexlindsay239 at gmail.com> wrote: >>>>>>>> >>>>>>>>> I have errors on quite a few (but not all) processes of the like >>>>>>>>> >>>>>>>>> [1]PETSC ERROR: --------------------- Error Message >>>>>>>>> -------------------------------------------------------------- >>>>>>>>> [1]PETSC ERROR: Nonconforming object sizes >>>>>>>>> [1]PETSC ERROR: Local columns of A10 4137 do not equal local rows >>>>>>>>> of A00 4129 >>>>>>>>> >>>>>>>>> when performing field splits. We (MOOSE) have some code for >>>>>>>>> identifying the index sets for each split. However, the code was written by >>>>>>>>> some authors who are no longer with us. Normally I would chase this down in >>>>>>>>> a debugger, but this error only seems to crop up for pretty complex and >>>>>>>>> large meshes. If anyone has an idea for what we might be doing wrong, that >>>>>>>>> might help me chase this down faster. 
I guess intuitively I'm pretty >>>>>>>>> perplexed that we could get ourselves into this pickle as it almost appears >>>>>>>>> that we have two different local dof index counts for a given block (0 in >>>>>>>>> this case). More background, if helpful, can be found in >>>>>>>>> https://github.com/idaholab/moose/issues/22359 as well as >>>>>>>>> https://github.com/idaholab/moose/discussions/22468. >>>>>>>>> >>>>>>>> >>>>>>>> How are you specifying the blocks? I would not have thought this >>>>>>>> was possible. >>>>>>>> >>>>>>>> Thanks, >>>>>>>> >>>>>>>> Matt >>>>>>>> >>>>>>>> >>>>>>>>> I should note that we are currently running with 3.16.6 as our >>>>>>>>> PETSc submodule hash (we are talking about updating to 3.18 soon). >>>>>>>>> >>>>>>>> >>>>>>>> >>>>>>>> -- >>>>>>>> What most experimenters take for granted before they begin their >>>>>>>> experiments is infinitely more interesting than any results to which their >>>>>>>> experiments lead. >>>>>>>> -- Norbert Wiener >>>>>>>> >>>>>>>> https://www.cse.buffalo.edu/~knepley/ >>>>>>>> >>>>>>>> >>>>>>> >>>>> >>>>> -- >>>>> What most experimenters take for granted before they begin their >>>>> experiments is infinitely more interesting than any results to which their >>>>> experiments lead. >>>>> -- Norbert Wiener >>>>> >>>>> https://www.cse.buffalo.edu/~knepley/ >>>>> >>>>> >>>> >>> >>> -- >>> What most experimenters take for granted before they begin their >>> experiments is infinitely more interesting than any results to which their >>> experiments lead. >>> -- Norbert Wiener >>> >>> https://www.cse.buffalo.edu/~knepley/ >>> >>> >> -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From alexlindsay239 at gmail.com Wed Nov 9 17:18:11 2022 From: alexlindsay239 at gmail.com (Alexander Lindsay) Date: Wed, 9 Nov 2022 15:18:11 -0800 Subject: [petsc-users] Local columns of A10 do not equal local rows of A00 In-Reply-To: References: Message-ID: I was able to get it worked out, once I knew the issue, doing a detailed read through our split IS generation. Working great (at least on this test problem) now! On Wed, Nov 9, 2022 at 12:45 PM Matthew Knepley wrote: > On Wed, Nov 9, 2022 at 1:45 PM Alexander Lindsay > wrote: > >> Ok, I've figured out that we are definitely messing something up in our >> split index set generation. For process 1 our Jac/PMat local size is 14307, >> but split 0 IS local size is 4129 and split 1 IS local size is 10170, so >> that leaves us 8 dofs short. >> >> I know now where I need to dig in our field decomposition. Thanks Matt >> for helping me process through this stuff! >> > > Cool. The one piece of advice I have is to make the problem ruthlessly > small. Even if it seems hard, it is worth the time > to get it down to the size you can print to the screen. > > Thanks, > > Matt > > >> On Tue, Nov 8, 2022 at 4:53 PM Alexander Lindsay < >> alexlindsay239 at gmail.com> wrote: >> >>> This is from our DMCreateFieldDecomposition_Moose routine. The IS size >>> on process 1 (which is the process from which I took the error in the >>> original post) is reported as 4129 which is consistent with the row size of >>> A00. 
>>> >>> Split '0' has local size 4129 on processor 1 >>> Split '0' has local size 4484 on processor 6 >>> Split '0' has local size 4471 on processor 12 >>> Split '0' has local size 4040 on processor 14 >>> Split '0' has local size 3594 on processor 20 >>> Split '0' has local size 4423 on processor 22 >>> Split '0' has local size 2791 on processor 27 >>> Split '0' has local size 3014 on processor 29 >>> Split '0' has local size 3183 on processor 30 >>> Split '0' has local size 3328 on processor 3 >>> Split '0' has local size 4689 on processor 4 >>> Split '0' has local size 8016 on processor 8 >>> Split '0' has local size 6367 on processor 10 >>> Split '0' has local size 5973 on processor 17 >>> Split '0' has local size 4431 on processor 18 >>> Split '0' has local size 7564 on processor 25 >>> Split '0' has local size 12504 on processor 9 >>> Split '0' has local size 10081 on processor 11 >>> Split '0' has local size 13808 on processor 24 >>> Split '0' has local size 14049 on processor 31 >>> Split '0' has local size 15324 on processor 7 >>> Split '0' has local size 15337 on processor 15 >>> Split '0' has local size 14849 on processor 19 >>> Split '0' has local size 15660 on processor 23 >>> Split '0' has local size 14728 on processor 26 >>> Split '0' has local size 15724 on processor 28 >>> Split '0' has local size 17249 on processor 5 >>> Split '0' has local size 15519 on processor 13 >>> Split '0' has local size 16511 on processor 16 >>> Split '0' has local size 16496 on processor 21 >>> Split '0' has local size 18291 on processor 2 >>> Split '0' has local size 18042 on processor 0 >>> >>> On Mon, Nov 7, 2022 at 6:04 PM Matthew Knepley >>> wrote: >>> >>>> On Mon, Nov 7, 2022 at 5:48 PM Alexander Lindsay < >>>> alexlindsay239 at gmail.com> wrote: >>>> >>>>> My understanding looking at PCFieldSplitSetDefaults is that our >>>>> implementation of `createfielddecomposition` should get called, we'll set >>>>> `fields` and then (ignoring possible user setting of >>>>> 
-pc_fieldsplit_%D_fields flag) PCFieldSplitSetIS will get called with >>>>> whatever we did to `fields`. So yea I guess that just looking over that I >>>>> would assume we're not supplying two different index sets for rows and >>>>> columns, or put more precisely we (MOOSE) are not really afforded the >>>>> opportunity to. But my interpretation could very well be wrong. >>>>> >>>> >>>> Oh wait. I read the error message again. It does not say that the whole >>>> selection is rectangular. It says >>>> >>>> Local columns of A10 4137 do not equal local rows of A00 4129 >>>> >>>> So this is a parallel partitioning thing. Since A00 has 4129 local >>>> rows, it should have this many columns as well. >>>> However A10 has 4137 local columns. How big is IS_0, on each process, >>>> that you pass in to PCFIELDSPLIT? >>>> >>>> Thanks, >>>> >>>> Matt >>>> >>>> >>>>> On Mon, Nov 7, 2022 at 12:33 PM Matthew Knepley >>>>> wrote: >>>>> >>>>>> On Mon, Nov 7, 2022 at 2:09 PM Alexander Lindsay < >>>>>> alexlindsay239 at gmail.com> wrote: >>>>>> >>>>>>> The libMesh/MOOSE specific code that identifies dof indices for >>>>>>> ISCreateGeneral is in DMooseGetEmbedding_Private. I can share that function >>>>>>> (it's quite long) or more details if that could be helpful. >>>>>>> >>>>>> >>>>>> Sorry, I should have written more. The puzzling thing for me is that >>>>>> somehow it looks like the row and column index sets are not the same. I did >>>>>> not think >>>>>> PCFIELDSPLIT could do that. The PCFieldSplitSetIS() interface does >>>>>> not allow it. I was wondering how you were setting the ISes. >>>>>> >>>>>> Thanks, >>>>>> >>>>>> Matt >>>>>> >>>>>> >>>>>>> On Mon, Nov 7, 2022 at 10:55 AM Alexander Lindsay < >>>>>>> alexlindsay239 at gmail.com> wrote: >>>>>>> >>>>>>>> I'm not sure exactly what you mean, but I'll try to give more >>>>>>>> details. 
We have our own DM class (DM_Moose) and we set our own field and >>>>>>>> domain decomposition routines: >>>>>>>> >>>>>>>> dm->ops->createfielddecomposition = >>>>>>>> DMCreateFieldDecomposition_Moose; >>>>>>>> >>>>>>>> dm->ops->createdomaindecomposition = >>>>>>>> DMCreateDomainDecomposition_Moose; >>>>>>>> >>>>>>>> >>>>>>>> The field and domain decomposition routines are as follows (can see >>>>>>>> also at >>>>>>>> https://github.com/idaholab/moose/blob/next/framework/src/utils/PetscDMMoose.C >>>>>>>> ): >>>>>>>> >>>>>>>> static PetscErrorCode >>>>>>>> DMCreateFieldDecomposition_Moose( >>>>>>>> DM dm, PetscInt * len, char *** namelist, IS ** islist, DM ** >>>>>>>> dmlist) >>>>>>>> { >>>>>>>> PetscErrorCode ierr; >>>>>>>> DM_Moose * dmm = (DM_Moose *)(dm->data); >>>>>>>> >>>>>>>> PetscFunctionBegin; >>>>>>>> /* Only called after DMSetUp(). */ >>>>>>>> if (!dmm->_splitlocs) >>>>>>>> PetscFunctionReturn(0); >>>>>>>> *len = dmm->_splitlocs->size(); >>>>>>>> if (namelist) >>>>>>>> { >>>>>>>> ierr = PetscMalloc(*len * sizeof(char *), namelist); >>>>>>>> CHKERRQ(ierr); >>>>>>>> } >>>>>>>> if (islist) >>>>>>>> { >>>>>>>> ierr = PetscMalloc(*len * sizeof(IS), islist); >>>>>>>> CHKERRQ(ierr); >>>>>>>> } >>>>>>>> if (dmlist) >>>>>>>> { >>>>>>>> ierr = PetscMalloc(*len * sizeof(DM), dmlist); >>>>>>>> CHKERRQ(ierr); >>>>>>>> } >>>>>>>> for (const auto & dit : *(dmm->_splitlocs)) >>>>>>>> { >>>>>>>> unsigned int d = dit.second; >>>>>>>> std::string dname = dit.first; >>>>>>>> DM_Moose::SplitInfo & dinfo = (*dmm->_splits)[dname]; >>>>>>>> if (!dinfo._dm) >>>>>>>> { >>>>>>>> ierr = DMCreateMoose(((PetscObject)dm)->comm, *dmm->_nl, >>>>>>>> &dinfo._dm); >>>>>>>> CHKERRQ(ierr); >>>>>>>> ierr = PetscObjectSetOptionsPrefix((PetscObject)dinfo._dm, >>>>>>>> ((PetscObject)dm)->prefix); >>>>>>>> CHKERRQ(ierr); >>>>>>>> std::string suffix = std::string("fieldsplit_") + dname + "_"; >>>>>>>> ierr = PetscObjectAppendOptionsPrefix((PetscObject)dinfo._dm, >>>>>>>> suffix.c_str()); >>>>>>>> 
CHKERRQ(ierr); >>>>>>>> } >>>>>>>> ierr = DMSetFromOptions(dinfo._dm); >>>>>>>> CHKERRQ(ierr); >>>>>>>> ierr = DMSetUp(dinfo._dm); >>>>>>>> CHKERRQ(ierr); >>>>>>>> if (namelist) >>>>>>>> { >>>>>>>> ierr = PetscStrallocpy(dname.c_str(), (*namelist) + d); >>>>>>>> CHKERRQ(ierr); >>>>>>>> } >>>>>>>> if (islist) >>>>>>>> { >>>>>>>> if (!dinfo._rembedding) >>>>>>>> { >>>>>>>> IS dembedding, lembedding; >>>>>>>> ierr = DMMooseGetEmbedding_Private(dinfo._dm, &dembedding); >>>>>>>> CHKERRQ(ierr); >>>>>>>> if (dmm->_embedding) >>>>>>>> { >>>>>>>> // Create a relative embedding into the parent's index >>>>>>>> space. >>>>>>>> ierr = ISEmbed(dembedding, dmm->_embedding, PETSC_TRUE, >>>>>>>> &lembedding); >>>>>>>> CHKERRQ(ierr); >>>>>>>> const PetscInt * lindices; >>>>>>>> PetscInt len, dlen, llen, *rindices, off, i; >>>>>>>> ierr = ISGetLocalSize(dembedding, &dlen); >>>>>>>> CHKERRQ(ierr); >>>>>>>> ierr = ISGetLocalSize(lembedding, &llen); >>>>>>>> CHKERRQ(ierr); >>>>>>>> if (llen != dlen) >>>>>>>> SETERRQ1(((PetscObject)dm)->comm, PETSC_ERR_PLIB, >>>>>>>> "Failed to embed split %D", d); >>>>>>>> ierr = ISDestroy(&dembedding); >>>>>>>> CHKERRQ(ierr); >>>>>>>> // Convert local embedding to global (but still relative) >>>>>>>> embedding >>>>>>>> ierr = PetscMalloc(llen * sizeof(PetscInt), &rindices); >>>>>>>> CHKERRQ(ierr); >>>>>>>> ierr = ISGetIndices(lembedding, &lindices); >>>>>>>> CHKERRQ(ierr); >>>>>>>> ierr = PetscMemcpy(rindices, lindices, llen * >>>>>>>> sizeof(PetscInt)); >>>>>>>> CHKERRQ(ierr); >>>>>>>> ierr = ISDestroy(&lembedding); >>>>>>>> CHKERRQ(ierr); >>>>>>>> // We could get the index offset from a corresponding >>>>>>>> global vector, but subDMs don't yet >>>>>>>> // have global vectors >>>>>>>> ierr = ISGetLocalSize(dmm->_embedding, &len); >>>>>>>> CHKERRQ(ierr); >>>>>>>> >>>>>>>> ierr = MPI_Scan(&len, >>>>>>>> &off, >>>>>>>> 1, >>>>>>>> #ifdef PETSC_USE_64BIT_INDICES >>>>>>>> MPI_LONG_LONG_INT, >>>>>>>> #else >>>>>>>> MPI_INT, >>>>>>>> #endif >>>>>>>> 
MPI_SUM, >>>>>>>> ((PetscObject)dm)->comm); >>>>>>>> CHKERRQ(ierr); >>>>>>>> >>>>>>>> off -= len; >>>>>>>> for (i = 0; i < llen; ++i) >>>>>>>> rindices[i] += off; >>>>>>>> ierr = ISCreateGeneral( >>>>>>>> ((PetscObject)dm)->comm, llen, rindices, >>>>>>>> PETSC_OWN_POINTER, &(dinfo._rembedding)); >>>>>>>> CHKERRQ(ierr); >>>>>>>> } >>>>>>>> else >>>>>>>> { >>>>>>>> dinfo._rembedding = dembedding; >>>>>>>> } >>>>>>>> } >>>>>>>> ierr = PetscObjectReference((PetscObject)(dinfo._rembedding)); >>>>>>>> CHKERRQ(ierr); >>>>>>>> (*islist)[d] = dinfo._rembedding; >>>>>>>> } >>>>>>>> if (dmlist) >>>>>>>> { >>>>>>>> ierr = PetscObjectReference((PetscObject)dinfo._dm); >>>>>>>> CHKERRQ(ierr); >>>>>>>> (*dmlist)[d] = dinfo._dm; >>>>>>>> } >>>>>>>> } >>>>>>>> PetscFunctionReturn(0); >>>>>>>> } >>>>>>>> >>>>>>>> static PetscErrorCode >>>>>>>> DMCreateDomainDecomposition_Moose( >>>>>>>> DM dm, PetscInt * len, char *** namelist, IS ** innerislist, IS >>>>>>>> ** outerislist, DM ** dmlist) >>>>>>>> { >>>>>>>> PetscErrorCode ierr; >>>>>>>> >>>>>>>> PetscFunctionBegin; >>>>>>>> /* Use DMCreateFieldDecomposition_Moose() to obtain everything >>>>>>>> but outerislist, which is currently >>>>>>>> * PETSC_NULL. */ >>>>>>>> if (outerislist) >>>>>>>> *outerislist = PETSC_NULL; /* FIX: allow mesh-based overlap. 
*/ >>>>>>>> ierr = DMCreateFieldDecomposition_Moose(dm, len, namelist, >>>>>>>> innerislist, dmlist); >>>>>>>> CHKERRQ(ierr); >>>>>>>> PetscFunctionReturn(0); >>>>>>>> } >>>>>>>> >>>>>>>> >>>>>>>> >>>>>>>> On Thu, Nov 3, 2022 at 5:19 PM Matthew Knepley >>>>>>>> wrote: >>>>>>>> >>>>>>>>> On Thu, Nov 3, 2022 at 7:52 PM Alexander Lindsay < >>>>>>>>> alexlindsay239 at gmail.com> wrote: >>>>>>>>> >>>>>>>>>> I have errors on quite a few (but not all) processes of the like >>>>>>>>>> >>>>>>>>>> [1]PETSC ERROR: --------------------- Error Message >>>>>>>>>> -------------------------------------------------------------- >>>>>>>>>> [1]PETSC ERROR: Nonconforming object sizes >>>>>>>>>> [1]PETSC ERROR: Local columns of A10 4137 do not equal local rows >>>>>>>>>> of A00 4129 >>>>>>>>>> >>>>>>>>>> when performing field splits. We (MOOSE) have some code for >>>>>>>>>> identifying the index sets for each split. However, the code was written by >>>>>>>>>> some authors who are no longer with us. Normally I would chase this down in >>>>>>>>>> a debugger, but this error only seems to crop up for pretty complex and >>>>>>>>>> large meshes. If anyone has an idea for what we might be doing wrong, that >>>>>>>>>> might help me chase this down faster. I guess intuitively I'm pretty >>>>>>>>>> perplexed that we could get ourselves into this pickle as it almost appears >>>>>>>>>> that we have two different local dof index counts for a given block (0 in >>>>>>>>>> this case). More background, if helpful, can be found in >>>>>>>>>> https://github.com/idaholab/moose/issues/22359 as well as >>>>>>>>>> https://github.com/idaholab/moose/discussions/22468. >>>>>>>>>> >>>>>>>>> >>>>>>>>> How are you specifying the blocks? I would not have thought this >>>>>>>>> was possible. >>>>>>>>> >>>>>>>>> Thanks, >>>>>>>>> >>>>>>>>> Matt >>>>>>>>> >>>>>>>>> >>>>>>>>>> I should note that we are currently running with 3.16.6 as our >>>>>>>>>> PETSc submodule hash (we are talking about updating to 3.18 soon). 
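For reference, the invariant being violated in the "Local columns of A10 4137 do not equal local rows of A00 4129" error is simple to state: on every rank, the local sizes of the split index sets must sum to the local row count of the Jacobian/preconditioning matrix. A minimal sketch of that bookkeeping in plain Python (this stands in for the PETSc calls and is illustrative only; the numbers are the ones reported for rank 1 earlier in this thread):

```python
# Check that the split index sets partition the matrix's local rows.
# Plain-Python bookkeeping, not PETSc API; sizes are those reported for
# rank 1 in this thread (split 0: 4129 dofs, split 1: 10170, Jacobian: 14307).
def split_deficit(split_local_sizes, mat_local_rows):
    """Dofs in the local row space that no split claims (0 when consistent)."""
    return mat_local_rows - sum(split_local_sizes)

deficit = split_deficit([4129, 10170], 14307)
print(deficit)  # -> 8, the "8 dofs short" reported in the thread
```

This is essentially the consistency that PCFIELDSPLIT's size check enforces when it compares the local columns of A10 against the local rows of A00.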
>>>>>>>>>> -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Wed Nov 9 17:56:28 2022 From: knepley at gmail.com (Matthew Knepley) Date: Wed, 9 Nov 2022 18:56:28 -0500 Subject: [petsc-users] Reference element in DMPlexComputeCellGeometryAffineFEM In-Reply-To: <1223C850-C305-4475-BBF0-F907C8739C1C@mcmaster.ca> References: <1223C850-C305-4475-BBF0-F907C8739C1C@mcmaster.ca> Message-ID: On Wed, Nov 9, 2022 at 10:46 AM Blaise Bourdin wrote: > > On Nov 9, 2022, at 10:04 AM, Matthew Knepley wrote: > On Tue, Nov 8, 2022 at 9:14 PM Blaise Bourdin wrote: > Hi, > What reference simplex is DMPlexComputeCellGeometryAffineFEM using in 2 > and 3D? 
> I am used to computing my shape functions on the unit simplex (vertices at > the origin and each e_i), but it does not look to be the reference simplex > in this function: > > In 3D, for the unit simplex with vertices at (0,0,0) (1,0,0) (0,1,0) > (0,0,1) (in this order), I get J = 1 / 2 . [[-1,-1,-1],[1,0,0],[0,0,1]] and > v0 = [0,0,1] > > In 2D, for the unit simplex with vertices at (0,0), (1,0), and (0,1), I > get J = 1 / 2. I and v0 = [0,0], which does not make any sense to me (I was > assuming that the 2D reference simplex had vertices at (-1,-1), (1, -1) and > (-1,1), but if this were the case, v0 would not be 0). > > I can build a simple example with meshes consisting only of the unit > simplex in 2D and 3D if that would help. > > > I need to rewrite the documentation on geometry, but I was waiting until I > rewrite the geometry calculations to fit into libCEED. Toby found a nice > way to express them in BLAS form which I need to push through everything. > > I always think of operating on the cell with the first vertex at the > origin (I think it is easier), so I have a xi0 that translates the first > vertex > of the reference to the origin, and a v0 that translates the first vertex > of the real cell to the origin. You can see this here > > > https://gitlab.com/petsc/petsc/-/blob/main/include/petsc/private/petscfeimpl.h#L251 > > This explains the 2D result. I cannot understand your 3D result, unless > the vertices are in another order. > > > That makes two of us, then? I am attaching a small example and test meshes > (one cell being the unit simplex starting with the origin and numbered in > direct order when looking from (1,1,1) > Oh, it is probably inverted. All faces are oriented for outward normals. It is in the Orientation chapter in the book :) Thanks, Matt > filename ../TestMeshes/1Tri.gen > > Vec Object: coordinates 1 MPI process > > type: seq > > 0. > > 0. > > 1. > > 0. > > 0. > > 1. 
> > v0 > > 0: 0.0000e+00 0.0000e+00 > > J > > 0: 5.0000e-01 0.0000e+00 > > 0: 0.0000e+00 5.0000e-01 > > invJ > > 0: 2.0000e+00 -0.0000e+00 > > 0: -0.0000e+00 2.0000e+00 > > detJ : 0.25 > > And > > filename ../TestMeshes/1Tet.gen > > Vec Object: coordinates 1 MPI process > > type: seq > > 0. > > 0. > > 0. > > 1. > > 0. > > 0. > > 0. > > 1. > > 0. > > 0. > > 0. > > 1. > > v0 > > 0: 1.0000e+00 0.0000e+00 0.0000e+00 > > J > > 0: -5.0000e-01 -5.0000e-01 -5.0000e-01 > > 0: 5.0000e-01 0.0000e+00 0.0000e+00 > > 0: 0.0000e+00 0.0000e+00 5.0000e-01 > > invJ > > 0: 0.0000e+00 2.0000e+00 0.0000e+00 > > 0: -2.0000e+00 -2.0000e+00 -2.0000e+00 > > 0: 0.0000e+00 0.0000e+00 2.0000e+00 > > detJ : 0.125 > > I don?t understand why v0=(0,0) in 2D and (1,0,0) in 3D (but don?t really > care) since I only want J. J makes no sense to me in 3D. In particular, one > does not seem to have X~ = invJ.X + v0 (X = J.(X~-v0) as stated in > CoordinatesRefToReal (it works in 2D if V0 = (1,1), which is consistent > with a reference simplex with vertices at (-1,-1), (1,-1) and (-1,1)). > > What am I missing? > > Blaise > > / > > > > Thanks, > > Matt > > > Regards, > Blaise > > > > ? > Canada Research Chair in Mathematical and Computational Aspects of Solid > Mechanics (Tier 1) > Professor, Department of Mathematics & Statistics > Hamilton Hall room 409A, McMaster University > 1280 Main Street West, Hamilton, Ontario L8S 4K1, Canada > https://www.math.mcmaster.ca/bourdin | +1 (905) 525 9140 ext. 27243 > > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > > https://www.cse.buffalo.edu/~knepley/ > > > > ? 
> Canada Research Chair in Mathematical and Computational Aspects of Solid > Mechanics (Tier 1) > Professor, Department of Mathematics & Statistics > Hamilton Hall room 409A, McMaster University > 1280 Main Street West, Hamilton, Ontario L8S 4K1, Canada > https://www.math.mcmaster.ca/bourdin | +1 (905) 525 9140 ext. 27243 > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From edoardo.alinovi at gmail.com Thu Nov 10 02:12:03 2022 From: edoardo.alinovi at gmail.com (Edoardo alinovi) Date: Thu, 10 Nov 2022 09:12:03 +0100 Subject: [petsc-users] On PCFIELDSPLIT and its implementation In-Reply-To: References: Message-ID: Hello, I have tried a different way to create the splitting: ui(1) = 0 ui(2) = 1 pi(1) = 2 call ISCreateGeneral(PETSC_COMM_WORLD, 2, ui, PETSC_COPY_VALUES, isu, ierr) call ISCreateGeneral(PETSC_COMM_WORLD, 1, pi, PETSC_COPY_VALUES, isp, ierr) call PCFieldSplitSetIS(mypc, "0", isu, ierr) call PCFieldSplitSetIS(mypc, "1", isp, ierr) However, I get even worse errors! *[0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------* *[0]PETSC ERROR: Nonconforming object sizes* *[0]PETSC ERROR: Local column sizes 0 do not add up to total number of columns 1* *[0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.* The ISes look like: IS Object: 1 MPI process *---> isu* type: general Number of indices in set 2 0 0 1 1 IS Object: 1 MPI process * ---> isp* type: general Number of indices in set 1 0 2 I really have no idea on how to deal with this thing... I guess I am not setting the split indices correctly, but I cannot really figure out what I am doing wrong. 
Any help is much appreciated :) -------------- next part -------------- An HTML attachment was scrubbed... URL: From tangqi at msu.edu Thu Nov 10 02:18:32 2022 From: tangqi at msu.edu (Tang, Qi) Date: Thu, 10 Nov 2022 08:18:32 +0000 Subject: [petsc-users] Get solution and rhs in the ts monitor Message-ID: Hi, How could I get rhs and solution in a ksp solve of ts? I am testing a linear problem (TS_Linear) using a bdf integrator. I tried to get the operator, rhs, and solution in the ts monitor through TSGetKSP and KSPGet***. But r = Ax-b is much larger than the ksp norm. I know the solver works fine. Did I misunderstand something about how TS works here? Perhaps one of the vectors is changed after the ksp solve? If so, is there a simple way to get rhs and solution that ksp of ts solved? Thanks, Qi -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Thu Nov 10 05:15:36 2022 From: knepley at gmail.com (Matthew Knepley) Date: Thu, 10 Nov 2022 06:15:36 -0500 Subject: [petsc-users] Get solution and rhs in the ts monitor In-Reply-To: References: Message-ID: On Thu, Nov 10, 2022 at 3:18 AM Tang, Qi wrote: > Hi, > > How could I get rhs and solution in a ksp solve of ts? > > I am testing a linear problem (TS_Linear) using a bdf integrator. I tried > to get the operator, rhs, and solution in the ts monitor through TSGetKSP > and KSPGet***. But r = Ax-b is much larger than the ksp norm. I know the > solver works fine. > Ax - b is the _unpreconditioned_ norm. By default we are printing the preconditioned norm. You can see the difference by running with -ksp_monitor_true_residual Thanks, Matt > Did I misunderstand something about how TS works here? Perhaps one of the > vectors is changed after the ksp solve? If so, is there a simple way to get > rhs and solution that ksp of ts solved? 
> > Thanks, > Qi > > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From tangqi at msu.edu Thu Nov 10 10:00:53 2022 From: tangqi at msu.edu (Tang, Qi) Date: Thu, 10 Nov 2022 16:00:53 +0000 Subject: [petsc-users] Get solution and rhs in the ts monitor In-Reply-To: References: Message-ID: Yes, but I need to get A, x and b out, so that I can test them in pyamg for other preconditioner options. I can get A, x, and b through what I described, but I do not think x or b is the original one in the linear system. Is there a simple way to get x and b (I just need once) of TS? Thanks. Qi ________________________________ From: Matthew Knepley Sent: Thursday, November 10, 2022 6:15 AM To: Tang, Qi Cc: petsc-users at mcs.anl.gov Subject: Re: [petsc-users] Get solution and rhs in the ts monitor On Thu, Nov 10, 2022 at 3:18 AM Tang, Qi > wrote: Hi, How could I get rhs and solution in a ksp solve of ts? I am testing a linear problem (TS_Linear) using a bdf integrator. I tried to get the operator, rhs, and solution in the ts monitor through TSGetKSP and KSPGet***. But r = Ax-b is much larger than the ksp norm. I know the solver works fine. Ax - b is the _unpreconditioned_ norm. By default we are printing the preconditioned norm. You can see the difference by running with -ksp_monitor_true_residual Thanks, Matt Did I misunderstand something about how TS works here? Perhaps one of the vectors is changed after the ksp solve? If so, is there a simple way to get rhs and solution that ksp of ts solved? Thanks, Qi -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. 
-- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From stefano.zampini at gmail.com Thu Nov 10 10:15:05 2022 From: stefano.zampini at gmail.com (Stefano Zampini) Date: Thu, 10 Nov 2022 17:15:05 +0100 Subject: [petsc-users] Get solution and rhs in the ts monitor In-Reply-To: References: Message-ID: -ksp_view_mat -ksp_view_rhs -ksp_view_solution > On Nov 10, 2022, at 5:00 PM, Tang, Qi wrote: > > Yes, but I need to get A, x and b out, so that I can test them in pyamg for other preconditioner options. I can get A, x, and b through what I described, but I do not think x or b is the original one in the linear system. > > Is there a simple way to get x and b (I just need once) of TS? Thanks. > > Qi > From: Matthew Knepley > Sent: Thursday, November 10, 2022 6:15 AM > To: Tang, Qi > Cc: petsc-users at mcs.anl.gov > Subject: Re: [petsc-users] Get solution and rhs in the ts monitor > > On Thu, Nov 10, 2022 at 3:18 AM Tang, Qi > wrote: > Hi, > > How could I get rhs and solution in a ksp solve of ts? > > I am testing a linear problem (TS_Linear) using a bdf integrator. I tried to get the operator, rhs, and solution in the ts monitor through TSGetKSP and KSPGet***. But r = Ax-b is much larger than the ksp norm. I know the solver works fine. > > Ax - b is the _unpreconditioned_ norm. By default we are printing the preconditioned norm. You can see the difference by running with > > -ksp_monitor_true_residual > > Thanks, > > Matt > > Did I misunderstand something about how TS works here? Perhaps one of the vectors is changed after the ksp solve? If so, is there a simple way to get rhs and solution that ksp of ts solved? > > Thanks, > Qi > > > > > -- > What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. 
> -- Norbert Wiener > > https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From tangqi at msu.edu Thu Nov 10 12:00:58 2022 From: tangqi at msu.edu (Tang, Qi) Date: Thu, 10 Nov 2022 18:00:58 +0000 Subject: [petsc-users] Get solution and rhs in the ts monitor In-Reply-To: References: Message-ID: <69F9BF2E-6222-4430-9456-B13E4FAF7E24@msu.edu> Thanks a lot, that works. This confirms that I can reproduce the rhs I need. On Nov 10, 2022, at 9:15 AM, Stefano Zampini wrote: ? -ksp_view_mat -ksp_view_rhs -ksp_view_solution On Nov 10, 2022, at 5:00 PM, Tang, Qi > wrote: Yes, but I need to get A, x and b out, so that I can test them in pyamg for other preconditioner options. I can get A, x, and b through what I described, but I do not think x or b is the original one in the linear system. Is there a simple way to get x and b (I just need once) of TS? Thanks. Qi ________________________________ From: Matthew Knepley > Sent: Thursday, November 10, 2022 6:15 AM To: Tang, Qi > Cc: petsc-users at mcs.anl.gov > Subject: Re: [petsc-users] Get solution and rhs in the ts monitor On Thu, Nov 10, 2022 at 3:18 AM Tang, Qi > wrote: Hi, How could I get rhs and solution in a ksp solve of ts? I am testing a linear problem (TS_Linear) using a bdf integrator. I tried to get the operator, rhs, and solution in the ts monitor through TSGetKSP and KSPGet***. But r = Ax-b is much larger than the ksp norm. I know the solver works fine. Ax - b is the _unpreconditioned_ norm. By default we are printing the preconditioned norm. You can see the difference by running with -ksp_monitor_true_residual Thanks, Matt Did I misunderstand something about how TS works here? Perhaps one of the vectors is changed after the ksp solve? If so, is there a simple way to get rhs and solution that ksp of ts solved? 
Thanks, Qi -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From bsmith at petsc.dev Thu Nov 10 13:14:22 2022 From: bsmith at petsc.dev (Barry Smith) Date: Thu, 10 Nov 2022 14:14:22 -0500 Subject: [petsc-users] On PCFIELDSPLIT and its implementation In-Reply-To: References: Message-ID: These beasts should be PetscInt, not real real :: ufields(2), pfields(1) Side note. We do not recommend using options like -fdefault-real-8 because the compiler may change values in surprising ways. You can use PetscReal to declare real numbers and this will automatically match with single or double precision based on the PETSc configure options. What version of PETSc are you using? We don't have Fortran stubs for the calls to PCFieldSplitSetFields in the latest release. I should add them. > On Nov 9, 2022, at 12:00 PM, Edoardo alinovi wrote: > > Hi Matt, > > it took a bit more than 1s, but I can reproduce the error in the attached file. > > To compile: > mpifort -L$PETSC_DIR/$PETSC_ARCH/lib -lpetsc -fdefault-real-8 -o test test.F90 -I$PETSC_DIR/include -I$PETSC_DIR/$PETSC_ARCH/include > > Please run it in serial as I have hardcoded some dimensions to code this up faster. > > Thank you! > -------------- next part -------------- An HTML attachment was scrubbed... URL: From edoardo.alinovi at gmail.com Thu Nov 10 14:06:58 2022 From: edoardo.alinovi at gmail.com (Edoardo alinovi) Date: Thu, 10 Nov 2022 21:06:58 +0100 Subject: [petsc-users] On PCFIELDSPLIT and its implementation In-Reply-To: References: Message-ID: Hi Barry, Thanks a lot for getting back to me, I am quite stuck at the moment! Matt kindly added them in a dev branch I am using right now to test this pc. You are right I am declaring them badly, I am an idiot! 
My small test works now, but I'm still in trouble with the main code unfortunately. There the indices are integers, but I am invoking the field splitting before assembling the matrix (thus doing MatAssemblyBegin/End). Does the matrix need to be fully assembled before calling PCFieldSplitSetFields? Il Gio 10 Nov 2022, 20:14 Barry Smith ha scritto: > > These beasts should be PetscInt, not real > > real :: ufields(2), pfields(1) > > > Side note. We do not recommend using options like *-fdefault-real-8 *because > the compiler may change values in surprising ways. You can use PetscReal > to declare real numbers and this will automatically match with single or > double precision based on the PETSc configure options. > > What version of PETSc are you using? We don't have Fortran stubs for > the calls to PCFieldSplitSetFields in the latest release. I should add them. > > > > On Nov 9, 2022, at 12:00 PM, Edoardo alinovi > wrote: > > Hi Matt, > > it took a bit more than 1s, but I can reproduce the error in the > attached file. > > To compile: > *mpifort -L$PETSC_DIR/$PETSC_ARCH/lib -lpetsc -fdefault-real-8 -o test > test.F90 -I$PETSC_DIR/include -I$PETSC_DIR/$PETSC_ARCH/include* > > Please run it in serial as I have hardcoded some dimensions to code this > up faster. > > Thank you! > > > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From bourdin at mcmaster.ca Thu Nov 10 14:46:20 2022 From: bourdin at mcmaster.ca (Blaise Bourdin) Date: Thu, 10 Nov 2022 20:46:20 +0000 Subject: [petsc-users] Reference element in DMPlexComputeCellGeometryAffineFEM In-Reply-To: References: <1223C850-C305-4475-BBF0-F907C8739C1C@mcmaster.ca> Message-ID: <26D68FF6-8BE4-4A8C-B0C3-BC8FE84A43FC@mcmaster.ca> An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... 
Name: TetCubit2.gen Type: application/octet-stream Size: 3200 bytes Desc: TetCubit2.gen URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: PastedGraphic-1.tiff Type: image/tiff Size: 28792 bytes Desc: PastedGraphic-1.tiff URL: From edoardo.alinovi at gmail.com Thu Nov 10 14:48:27 2022 From: edoardo.alinovi at gmail.com (Edoardo alinovi) Date: Thu, 10 Nov 2022 21:48:27 +0100 Subject: [petsc-users] On PCFIELDSPLIT and its implementation In-Reply-To: References: Message-ID: I am sorry Barry, I told you it works, but it does not. I changed the indices to integer, but I am still getting this: [0]PETSC ERROR: --------------------- Error Message -------------------------------------------------------------- [0]PETSC ERROR: Nonconforming object sizes [0]PETSC ERROR: Local column sizes 6 do not add up to total number of columns 9 I am a bit tired and I was running linux's test instead of my program -.- This is the branch Matt did if you wish to try... -> https://gitlab.com/petsc/petsc/-/commits/knepley/fix-fieldsplit-fortran Cheers -------------- next part -------------- An HTML attachment was scrubbed... URL: From edoardo.alinovi at gmail.com Thu Nov 10 14:52:38 2022 From: edoardo.alinovi at gmail.com (Edoardo alinovi) Date: Thu, 10 Nov 2022 21:52:38 +0100 Subject: [petsc-users] On PCFIELDSPLIT and its implementation In-Reply-To: References: Message-ID: The thing is, even if I pass the following options: -UPeqn_pc_type fieldsplit -UPeqn_pc_fieldsplit_0_fields 0,1 -UPeqn_pc_fieldsplit_1_fields 2 -UPeqn_pc_fieldsplit_type SCHUR -UPeqn_pc_fieldsplit_block_size 3 I am getting the same issue, so there must be something fundamental in the way I am splitting :/ -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From bsmith at petsc.dev Thu Nov 10 16:15:21 2022 From: bsmith at petsc.dev (Barry Smith) Date: Thu, 10 Nov 2022 17:15:21 -0500 Subject: [petsc-users] On PCFIELDSPLIT and its implementation In-Reply-To: References: Message-ID: Hmm, that branch does not appear to exist. > On Nov 10, 2022, at 3:48 PM, Edoardo alinovi wrote: > > I am sorry Barry, > > I told you it works, but it is not. I changed to index to integer, but I am still getting this: > > [0]PETSC ERROR: --------------------- Error Message -------------------------------------------------------------- > [0]PETSC ERROR: Nonconforming object sizes > [0]PETSC ERROR: Local column sizes 6 do not add up to total number of columns 9 > > I am a bit tired and I was running linux's test instead of my program -.- > > This is the branch Matt did if you whish to try... -> https://gitlab.com/petsc/petsc/-/commits/knepley/fix-fieldsplit-fortran > > Cheers -------------- next part -------------- An HTML attachment was scrubbed... URL: From bsmith at petsc.dev Thu Nov 10 16:16:08 2022 From: bsmith at petsc.dev (Barry Smith) Date: Thu, 10 Nov 2022 17:16:08 -0500 Subject: [petsc-users] On PCFIELDSPLIT and its implementation In-Reply-To: References: Message-ID: Can you share the code that produces the problem below? > On Nov 10, 2022, at 3:52 PM, Edoardo alinovi wrote: > > The thing is, even I pass the following options: > > -UPeqn_pc_type fieldsplit -UPeqn_pc_fieldsplit_0_fields 0,1 -UPeqn_pc_fieldsplit_1_fields 2 -UPeqn_pc_fieldsplit_type SCHUR -UPeqn_pc_fieldsplit_block_size 3 > > I am getting the same issue, so there must be something fundamental in the way I am splitting :/ > > From edoardo.alinovi at gmail.com Thu Nov 10 16:24:04 2022 From: edoardo.alinovi at gmail.com (Edoardo alinovi) Date: Thu, 10 Nov 2022 23:24:04 +0100 Subject: [petsc-users] On PCFIELDSPLIT and its implementation In-Reply-To: References: Message-ID: True, Maybe somebody merged it already? I have attached my silly example. 
To compile: mpifort -L$PETSC_DIR/$PETSC_ARCH/lib -lpetsc -fdefault-real-8 -o test test.F90 -I$PETSC_DIR/include -I$PETSC_DIR/$PETSC_ARCH/include

Do you need the PETSc code Matt did?

-------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: test.F90 Type: application/octet-stream Size: 3048 bytes Desc: not available URL:

From edoardo.alinovi at gmail.com Thu Nov 10 16:28:04 2022 From: edoardo.alinovi at gmail.com (Edoardo alinovi) Date: Thu, 10 Nov 2022 23:28:04 +0100 Subject: [petsc-users] On PCFIELDSPLIT and its implementation In-Reply-To: References: Message-ID:

Ah I see you have already added the missing interfaces for Fortran enthusiasts :) So you likely do not need Matt's hack!

[image: image.png]

-------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: image.png Type: image/png Size: 34126 bytes Desc: not available URL:

From knepley at gmail.com Thu Nov 10 17:42:48 2022 From: knepley at gmail.com (Matthew Knepley) Date: Thu, 10 Nov 2022 18:42:48 -0500 Subject: [petsc-users] Reference element in DMPlexComputeCellGeometryAffineFEM In-Reply-To: <26D68FF6-8BE4-4A8C-B0C3-BC8FE84A43FC@mcmaster.ca> References: <1223C850-C305-4475-BBF0-F907C8739C1C@mcmaster.ca> <26D68FF6-8BE4-4A8C-B0C3-BC8FE84A43FC@mcmaster.ca> Message-ID:

On Thu, Nov 10, 2022 at 3:46 PM Blaise Bourdin wrote:

> I am not sure I am buying this? If the tet was inverted, detJ would be negative, but it is always 1/8, as expected.
>
> The attached mesh is a perfectly valid tet generated by Cubit, with orientation matching the Exodus documentation (ignore the mid-edge dof since this is a tet4).
> Here is what I get out of the code I attached in my previous email:

Yes, I use the opposite convention from ExodusII.
In my opinion, orienting face (1, 2, 3) to have an inward normal is sacrilegious. Thanks, Matt > *SiMini*:Tests (dmplex)$ ./TestDMPlexComputeCellGeometryAffineFEM -i > ../TestMeshes/TetCubit.gen > > filename ../TestMeshes/TetCubit.gen > > Vec Object: coordinates 1 MPI process > > type: seq > > 0. > > 0. > > 0. > > 1. > > 0. > > 0. > > 0. > > 1. > > 0. > > 0. > > 0. > > 1. > > v0 > > 0: 1.0000e+00 0.0000e+00 0.0000e+00 > > J > > 0: -5.0000e-01 -5.0000e-01 -5.0000e-01 > > 0: 5.0000e-01 0.0000e+00 0.0000e+00 > > 0: 0.0000e+00 0.0000e+00 5.0000e-01 > > invJ > > 0: 0.0000e+00 2.0000e+00 0.0000e+00 > > 0: -2.0000e+00 -2.0000e+00 -2.0000e+00 > > 0: 0.0000e+00 0.0000e+00 2.0000e+00 > > detJ : 0.125 > > From J, invJ, and v0, I still can?t reconstruct a reasonable reference tet > which I was naively assuming was either the unit simplex, or the simplex > with vertices at (-1,-1,-1), (-1,0,-1), (0, -1, -1), and (-1,-1,1) not > necessarily in this order. In order to build my FE basis functions on the > reference element, I really need to know what this element is? > > Blaise > > > > > On Nov 9, 2022, at 6:56 PM, Matthew Knepley wrote: > > On Wed, Nov 9, 2022 at 10:46 AM Blaise Bourdin > wrote: > > > > On Nov 9, 2022, at 10:04 AM, Matthew Knepley wrote: > > On Tue, Nov 8, 2022 at 9:14 PM Blaise Bourdin wrote: > > Hi, > > What reference simplex is DMPlexComputeCellGeometryAffineFEM using in 2 > and 3D? > I am used to computing my shape functions on the unit simplex (vertices at > the origin and each e_i), but it does not look to be the reference simplex > in this function: > > In 3D, for the unit simplex with vertices at (0,0,0) (1,0,0) (0,1,0) > (0,0,1) (in this order), I get J = 1 / 2 . [[-1,-1,-1],[1,0,0],[0,0,1]] and > v0 = [0,0,1] > > In 2D, for the unit simplex with vertices at (0,0), (1,0), and (0,1), I > get J = 1 / 2. 
I and v0 = [0,0], which does not make any sense to me (I was > assuming that the 2D reference simplex had vertices at (-1,-1), (1, -1) and > (-1,1), but if this were the case, v0 would not be 0). > > I can build a simple example with meshes consisting only of the unit > simplex in 2D and 3D if that would help. > > > I need to rewrite the documentation on geometry, but I was waiting until I > rewrite the geometry calculations to fit into libCEED. Toby found a nice > way to express them in BLAS form which I need to push through everything. > > I always think of operating on the cell with the first vertex at the > origin (I think it is easier), so I have a xi0 that translates the first > vertex > of the reference to the origin, and a v0 that translates the first vertex > of the real cell to the origin. You can see this here > > > https://gitlab.com/petsc/petsc/-/blob/main/include/petsc/private/petscfeimpl.h#L251 > > This explains the 2D result. I cannot understand your 3D result, unless > the vertices are in another order. > > > That makes two of us, then? I am attaching a small example and test meshes > (one cell being the unit simplex starting with the origin and numbered in > direct order when looking from (1,1,1) > > > Oh, it is probably inverted. All faces are oriented for outward normals. > It is in the Orientation chapter in the book :) > > Thanks, > > Matt > > > filename ../TestMeshes/1Tri.gen > Vec Object: coordinates 1 MPI process > type: seq > 0. > 0. > 1. > 0. > 0. > 1. > v0 > 0: 0.0000e+00 0.0000e+00 > J > 0: 5.0000e-01 0.0000e+00 > 0: 0.0000e+00 5.0000e-01 > invJ > 0: 2.0000e+00 -0.0000e+00 > 0: -0.0000e+00 2.0000e+00 > detJ : 0.25 > > And > filename ../TestMeshes/1Tet.gen > Vec Object: coordinates 1 MPI process > type: seq > 0. > 0. > 0. > 1. > 0. > > 0. > 0. > 1. > 0. > 0. > 0. > 1. 
> v0 > 0: 1.0000e+00 0.0000e+00 0.0000e+00 > J > 0: -5.0000e-01 -5.0000e-01 -5.0000e-01 > 0: 5.0000e-01 0.0000e+00 0.0000e+00 > 0: 0.0000e+00 0.0000e+00 5.0000e-01 > invJ > 0: 0.0000e+00 2.0000e+00 0.0000e+00 > 0: -2.0000e+00 -2.0000e+00 -2.0000e+00 > 0: 0.0000e+00 0.0000e+00 2.0000e+00 > detJ : 0.125 > > I don?t understand why v0=(0,0) in 2D and (1,0,0) in 3D (but don?t really > care) since I only want J. J makes no sense to me in 3D. In particular, one > does not seem to have X~ = invJ.X + v0 (X = J.(X~-v0) as stated in > CoordinatesRefToReal (it works in 2D if V0 = (1,1), which is consistent > with a reference simplex with vertices at (-1,-1), (1,-1) and (-1,1)). > > What am I missing? > > Blaise > > / > > > > Thanks, > > Matt > > > Regards, > Blaise > > > > ? > Canada Research Chair in Mathematical and Computational Aspects of Solid > Mechanics (Tier 1) > Professor, Department of Mathematics & Statistics > Hamilton Hall room 409A, McMaster University > 1280 Main Street West, Hamilton, Ontario L8S 4K1, Canada > https://www.math.mcmaster.ca/bourdin | +1 (905) 525 9140 ext. 27243 > > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > > https://www.cse.buffalo.edu/~knepley/ > > > > ? > Canada Research Chair in Mathematical and Computational Aspects of Solid > Mechanics (Tier 1) > Professor, Department of Mathematics & Statistics > Hamilton Hall room 409A, McMaster University > 1280 Main Street West, Hamilton, Ontario L8S 4K1, Canada > https://www.math.mcmaster.ca/bourdin | +1 (905) 525 9140 ext. 27243 > > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > > https://www.cse.buffalo.edu/~knepley/ > > > > ? 
> Canada Research Chair in Mathematical and Computational Aspects of Solid > Mechanics (Tier 1) > Professor, Department of Mathematics & Statistics > Hamilton Hall room 409A, McMaster University > 1280 Main Street West, Hamilton, Ontario L8S 4K1, Canada > https://www.math.mcmaster.ca/bourdin | +1 (905) 525 9140 ext. 27243 > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From bsmith at petsc.dev Fri Nov 11 13:04:13 2022 From: bsmith at petsc.dev (Barry Smith) Date: Fri, 11 Nov 2022 14:04:13 -0500 Subject: [petsc-users] Report Bug TaoALMM class In-Reply-To: <892a51c2-17f7-ac1f-f55d-05981978a4f4@math.tu-freiberg.de> References: <4eec06f9-d534-7a02-9abe-6d1415f663f0@math.tu-freiberg.de> <14f2cdd6-9cbe-20a6-0c7d-3006b2ee4dc1@math.tu-freiberg.de> <5E53FE56-5C68-4F06-8A48-54ACBDC800C7@petsc.dev> <892a51c2-17f7-ac1f-f55d-05981978a4f4@math.tu-freiberg.de> Message-ID: > On Nov 4, 2022, at 7:43 AM, Stephan K?hler wrote: > > Barry, > > this is a nonartificial code. This is a problem in the ALMM subsolver. I want to solve a problem with a TaoALMM solver what then happens is: > > TaoSolve(tao) /* TaoALMM solver */ > | > | > |--------> This calls the TaoALMM subsolver routine > > TaoSolve(subsolver) > | > | > |-----------> The subsolver does not correctly work, at least with an Armijo line search, since the solution is overwritten within the line search. > In my case, the subsolver does not make any progress although it is possible. > > To get to my real problem you can simply change line 268 to if(0) (from if(1) -----> if(0)) and line 317 from // ierr = TaoSolve(tao); CHKERRQ(ierr); -------> ierr = TaoSolve(tao); CHKERRQ(ierr); > What you can see is that the solver does not make any progress, but it should make progress. 
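The overwriting described above can be modeled in a few lines of Python (a hypothetical sketch with made-up names, not the actual Tao code). When the evaluation buffer aliases the stored solution, a rejected full step shifts the base point, so the backtracked trial becomes ((xk + dxk) + alpha * dxk) instead of (xk + alpha * dxk):

```python
# Hypothetical model of the aliasing problem (not actual Tao code):
# the stored "solution" and the line-search trial point share one buffer.

def trial_points(xk, dxk, alphas, aliased):
    """Points a backtracking line search actually evaluates."""
    base = xk
    evaluated = []
    for alpha in alphas:
        trial = base + alpha * dxk
        evaluated.append(trial)
        if aliased:
            base = trial  # evaluation overwrites the stored solution
    return evaluated

xk, dxk = 1.0, -0.5
correct = trial_points(xk, dxk, [1.0, 0.5], aliased=False)  # [0.5, 0.75]
buggy = trial_points(xk, dxk, [1.0, 0.5], aliased=True)     # [0.5, 0.25]
```

With aliasing, the second trial lands at 0.25 = (xk + dxk) + 0.5 * dxk rather than the intended 0.75 = xk + 0.5 * dxk, so the "shorter" step actually moves further from the base point.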
> > To be honest, I do not really know why the option -tao_almm_subsolver_tao_ls_monitor has know effect if the ALMM solver is called and not the subsolver. I also do not know why -tao_almm_subsolver_tao_view prints as termination reason for the subsolver > > Solution converged: ||g(X)|| <= gatol > > This is obviously not the case. I set the tolerance > -tao_almm_subsolver_tao_gatol 1e-8 \ > -tao_almm_subsolver_tao_grtol 1e-8 \ This is because TaoSolve_ALMM adaptively sets the tolerances for the subsolver /* update subsolver tolerance */ PetscCall(PetscInfo(tao, "Subsolver tolerance: ||G|| <= %e\n", (double)auglag->gtol)); PetscCall(TaoSetTolerances(auglag->subsolver, auglag->gtol, 0.0, 0.0)); So any values one set initially are ignored. Unfortunately, given the organization of TaoSetFromOptions() as a general tool, there is no way to have ALMM not accept the command line tolerances, producing a message that the end that they have been ignored. Hence the user thinks they have been set and gets confused that they seem to be ignored. I don't see any way to prevent this confusion cleanly. I am still digging through all the nesting here. > > I encountered this and then I looked into the ALMM class and therefore I tried to call the subsolver (previous example). > > I attach the updated programm and also the options. > > Stephan > > > > > > > On 03.11.22 22:15, Barry Smith wrote: >> >> Thanks for your response and the code. I understand the potential problem and how your code demonstrates a bug if the TaoALMMSubsolverObjective() is used in the manner you use in the example where you directly call TaoComputeObjective() multiple times line a line search code might. >> >> What I don't have or understand is how to reproduce the problem in a real code that uses Tao. That is where the Tao Armijo line search code has a problem when it is used (somehow) in a Tao solver with ALMM. 
You suggest "If you have an example for your own, you can switch the Armijo line search by the option -tao_ls_type armijo. The thing is that it will cause no problems if the line search accepts the steps with step length one." I don't see how to do this if I use -tao_type almm I cannot use -tao_ls_type armijo; that is the option -tao_ls_type doesn't seem to me to be usable in the context of almm (since almm internally does directly its own trust region approach for globalization). If we remove the if (1) code from your example, is there some Tao options I can use to get the bug to appear inside the Tao solve? >> >> I'll try to explain again, I agree that the fact that the Tao solution is aliased (within the ALMM solver) is a problem with repeated calls to TaoComputeObjective() but I cannot see how these repeated calls could ever happen in the use of TaoSolve() with the ALMM solver. That is when is this "design problem" a true problem as opposed to just a potential problem that can be demonstrated in artificial code? >> >> The reason I need to understand the non-artificial situation it breaks things is to come up with an appropriate correction for the current code. >> >> Barry >> >> >> >> >> >> >> >>> On Nov 3, 2022, at 12:46 PM, Stephan K?hler wrote: >>> >>> Barry, >>> >>> so far, I have not experimented with trust-region methods, but I can imagine that this "design feature" causes no problem for trust-region methods, if the old point is saved and after the trust-region check fails the old point is copied to the actual point. But the implementation of the Armijo line search method does not work that way. Here, the actual point will always be overwritten. Only if the line search fails, then the old point is restored, but then the TaoSolve method ends with a line search failure. >>> >>> If you have an example for your own, you can switch the Armijo line search by the option -tao_ls_type armijo. 
The thing is that it will cause no problems if the line search accepts the steps with step length one. >>> It is also possible that, by luck, it will cause no problems, if the "excessive" step brings a reduction of the objective >>> >>> Otherwise, I attach my example, which is not minimal, but here you can see that it causes problems. You need to set the paths to the PETSc library in the makefile. You find the options for this problem in the run_test_tao_neohooke.sh script. >>> The import part begins at line 292 in test_tao_neohooke.cpp >>> >>> Stephan >>> >>> On 02.11.22 19:04, Barry Smith wrote: >>>> Stephan, >>>> >>>> I have located the troublesome line in TaoSetUp_ALMM() it has the line >>>> >>>> auglag->Px = tao->solution; >>>> >>>> and in alma.h it has >>>> >>>> Vec Px, LgradX, Ce, Ci, G; /* aliased vectors (do not destroy!) */ >>>> >>>> Now auglag->P in some situations alias auglag->P and in some cases auglag->Px serves to hold a portion of auglag->P. So then in TaoALMMSubsolverObjective_Private() >>>> the lines >>>> >>>> PetscCall(VecCopy(P, auglag->P)); >>>> PetscCall((*auglag->sub_obj)(auglag->parent)); >>>> >>>> causes, just as you said, tao->solution to be overwritten by the P at which the objective function is being computed. In other words, the solution of the outer Tao is aliased with the solution of the inner Tao, by design. >>>> >>>> You are definitely correct, the use of TaoALMMSubsolverObjective_Private and TaoALMMSubsolverObjectiveAndGradient_Private in a line search would be problematic. >>>> >>>> I am not an expert at these methods or their implementations. Could you point to an actual use case within Tao that triggers the problem. Is there a set of command line options or code calls to Tao that fail due to this "design feature". Within the standard use of ALMM I do not see how the objective function would be used within a line search. 
The TaoSolve_ALMM() code is self-correcting in that if a trust region check fails it automatically rolls back the solution. >>>> >>>> Barry >>>> >>>> >>>> >>>> >>>>> On Oct 28, 2022, at 4:27 AM, Stephan K?hler wrote: >>>>> >>>>> Dear PETSc/Tao team, >>>>> >>>>> it seems to be that there is a bug in the TaoALMM class: >>>>> >>>>> In the methods TaoALMMSubsolverObjective_Private and TaoALMMSubsolverObjectiveAndGradient_Private the vector where the function value for the augmented Lagrangian is evaluate >>>>> is copied into the current solution, see, e.g., https://petsc.org/release/src/tao/constrained/impls/almm/almm.c.html line 672 or 682. This causes subsolver routine to not converge if the line search for the subsolver rejects the step length 1. for some >>>>> update. In detail: >>>>> >>>>> Suppose the current iterate is xk and the current update is dxk. The line search evaluates the augmented Lagrangian now at (xk + dxk). This causes that the value (xk + dxk) is copied in the current solution. If the point (xk + dxk) is rejected, the line search should >>>>> try the point (xk + alpha * dxk), where alpha < 1. But due to the copying, what happens is that the point ((xk + dxk) + alpha * dxk) is evaluated, see, e.g., https://petsc.org/release/src/tao/linesearch/impls/armijo/armijo.c.html line 191. 
>>>>> >>>>> Best regards >>>>> Stephan K?hler >>>>> >>>>> -- >>>>> Stephan K?hler >>>>> TU Bergakademie Freiberg >>>>> Institut f?r numerische Mathematik und Optimierung >>>>> >>>>> Akademiestra?e 6 >>>>> 09599 Freiberg >>>>> Geb?udeteil Mittelbau, Zimmer 2.07 >>>>> >>>>> Telefon: +49 (0)3731 39-3173 (B?ro) >>>>> >>>>> >>> >>> -- >>> Stephan K?hler >>> TU Bergakademie Freiberg >>> Institut f?r numerische Mathematik und Optimierung >>> >>> Akademiestra?e 6 >>> 09599 Freiberg >>> Geb?udeteil Mittelbau, Zimmer 2.07 >>> >>> Telefon: +49 (0)3731 39-3173 (B?ro) >>> >> > > -- > Stephan K?hler > TU Bergakademie Freiberg > Institut f?r numerische Mathematik und Optimierung > > Akademiestra?e 6 > 09599 Freiberg > Geb?udeteil Mittelbau, Zimmer 2.07 > > Telefon: +49 (0)3731 39-3173 (B?ro) > -------------- next part -------------- An HTML attachment was scrubbed... URL: From edoardo.alinovi at gmail.com Fri Nov 11 13:12:55 2022 From: edoardo.alinovi at gmail.com (Edoardo alinovi) Date: Fri, 11 Nov 2022 20:12:55 +0100 Subject: [petsc-users] On PCFIELDSPLIT and its implementation In-Reply-To: References: Message-ID: Hi guys, anyone with any suggestion to make this thing work? ?? Il Ven 11 Nov 2022, 10:24 Edoardo alinovi ha scritto: > Hi Barry, > > FYI, in test.F90 I noted that "ui" starts for 1 and not from 0. I fixed it > but the situation does not change much. I attached the new file in this > email. > > Thanks for the support, I am really walking in the dark! > > Il giorno gio 10 nov 2022 alle ore 23:28 Edoardo alinovi < > edoardo.alinovi at gmail.com> ha scritto: > >> Ah I see you have already added the missing interfaces for fortran >> enthusiasts :) So you likely do not need Matt's hack! >> >> [image: image.png] >> > -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... 
Name: image.png Type: image/png Size: 34126 bytes Desc: not available URL:

From alexlindsay239 at gmail.com Fri Nov 11 16:56:59 2022 From: alexlindsay239 at gmail.com (Alexander Lindsay) Date: Fri, 11 Nov 2022 14:56:59 -0800 Subject: [petsc-users] Use of LSC preconditioning Message-ID:

Under what conditions can I use LSC preconditioning for field split problems with Schur? Let's imagine that all I've done is called SNESSetJacobian with global A and P and provided the index sets for 0 and 1. Based on the documentation on the man page https://petsc.org/release/docs/manualpages/PC/PCLSC/ it seems like I'd have to do something more programmatically, e.g.

PetscObjectCompose((PetscObject)Sp,"LSC_L",(PetscObject)L);
PetscObjectCompose((PetscObject)Sp,"LSC_Lp",(PetscObject)Lp);

If I try to naively get LSC preconditioning from the command line without doing something like the above, then I get SUBPC_ERROR. Is this to be expected?

 0 Nonlinear |R| = 4.164062e-02
      0 Linear |R| = 4.164062e-02
Linear solve did not converge due to DIVERGED_PC_FAILED iterations 0
               PC failed due to SUBPC_ERROR

-------------- next part -------------- An HTML attachment was scrubbed... URL:

From knepley at gmail.com Fri Nov 11 16:59:13 2022 From: knepley at gmail.com (Matthew Knepley) Date: Fri, 11 Nov 2022 17:59:13 -0500 Subject: [petsc-users] Use of LSC preconditioning In-Reply-To: References: Message-ID:

On Fri, Nov 11, 2022 at 5:57 PM Alexander Lindsay wrote:

> Under what conditions can I use LSC preconditioning for field split problems with Schur? Let's imagine that all I've done is called SNESSetJacobian with global A and P and provided the index sets for 0 and 1. Based on the documentation on the man page https://petsc.org/release/docs/manualpages/PC/PCLSC/ it seems like I'd have to do something more programmatically, e.g.
> > PetscObjectCompose((PetscObject)Sp,"LSC_L",(PetscObject)L); > PetscObjectCompose((PetscObject)Sp,"LSC_Lp",(PetscObject)Lp); > > If I try to naively get LSC preconditioning from the command line without > doing something like the above, then I get SUBPC_ERROR. Is this to be > expected? > Yes. PETSc does not have information about the operators or discretizations, and thus it cannot make auxiliary operators like the pressure Laplacian. Thanks, Matt > 0 Nonlinear |R| = 4.164062e-02 > 0 Linear |R| = 4.164062e-02 > Linear solve did not converge due to DIVERGED_PC_FAILED iterations 0 > PC failed due to SUBPC_ERROR > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From alexlindsay239 at gmail.com Fri Nov 11 17:06:42 2022 From: alexlindsay239 at gmail.com (Alexander Lindsay) Date: Fri, 11 Nov 2022 15:06:42 -0800 Subject: [petsc-users] Use of LSC preconditioning In-Reply-To: References: Message-ID: That makes sense. Thanks for the quick reply! On Fri, Nov 11, 2022 at 2:59 PM Matthew Knepley wrote: > On Fri, Nov 11, 2022 at 5:57 PM Alexander Lindsay < > alexlindsay239 at gmail.com> wrote: > >> Under what conditions can I use LSC preconditioning for field split >> problems with Schur? Let's imagine that all I've done is called >> SNESetJacobian with global A and P and provided the index sets for 0 and 1. >> Based off of the documentation on the man page >> https://petsc.org/release/docs/manualpages/PC/PCLSC/ it seems like I'd >> have to do something more programmatically, e.g. 
>> >> PetscObjectCompose((PetscObject)Sp,"LSC_L",(PetscObject)L); >> PetscObjectCompose((PetscObject)Sp,"LSC_Lp",(PetscObject)Lp); >> >> If I try to naively get LSC preconditioning from the command line without >> doing something like the above, then I get SUBPC_ERROR. Is this to be >> expected? >> > > Yes. PETSc does not have information about the operators or > discretizations, and thus it cannot make auxiliary operators like the > pressure Laplacian. > > Thanks, > > Matt > > >> 0 Nonlinear |R| = 4.164062e-02 >> 0 Linear |R| = 4.164062e-02 >> Linear solve did not converge due to DIVERGED_PC_FAILED iterations 0 >> PC failed due to SUBPC_ERROR >> > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > > https://www.cse.buffalo.edu/~knepley/ > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From bsmith at petsc.dev Fri Nov 11 22:44:13 2022 From: bsmith at petsc.dev (Barry Smith) Date: Fri, 11 Nov 2022 23:44:13 -0500 Subject: [petsc-users] Report Bug TaoALMM class In-Reply-To: <892a51c2-17f7-ac1f-f55d-05981978a4f4@math.tu-freiberg.de> References: <4eec06f9-d534-7a02-9abe-6d1415f663f0@math.tu-freiberg.de> <14f2cdd6-9cbe-20a6-0c7d-3006b2ee4dc1@math.tu-freiberg.de> <5E53FE56-5C68-4F06-8A48-54ACBDC800C7@petsc.dev> <892a51c2-17f7-ac1f-f55d-05981978a4f4@math.tu-freiberg.de> Message-ID: I am still working to understand. I have a PETSc branch barry/2022-11-11/fixes-for-tao/release where I have made a few fix/improvements to help me run and debug with your code. I made a tiny change to your code, passing Hessian twice, and ran with ./test_tao_neohooke -tao_monitor -tao_view -tao_max_it 500 -tao_converged_reason -tao_lmvm_recycle -tao_type nls -tao_ls_monitor and got 18 TAO, Function value: -0.0383888, Residual: 7.46748e-11 TAO solve converged due to CONVERGED_GATOL iterations 18 Is this what you expect? 
Also works with ntr If I run with ./test_tao_neohooke -tao_monitor -tao_view -tao_max_it 10000 -tao_converged_reason -tao_type lmvm -tao_ls_monitor I get 2753 TAO, Function value: -0.0161685, Residual: 0.120782 0 LS Function value: -0.0161685, Step length: 0. 1 LS Function value: 4.49423e+307, Step length: 1. stx: 0., fx: -0.0161685, dgx: -0.0145883 sty: 0., fy: -0.0161685, dgy: -0.0145883 2 LS Function value: -0.0161685, Step length: 0. stx: 0., fx: -0.0161685, dgx: -0.0145883 sty: 1., fy: 4.49423e+307, dgy: 5.68594e+307 2754 TAO, Function value: -0.0161685, Residual: 0.120782 TAO solve did not converge due to DIVERGED_LS_FAILURE iteration 2754 Note the insane fy value that pops up at the end. The next one ./test_tao_neohooke -tao_monitor -tao_view -tao_max_it 500 -tao_converged_reason -tao_lmvm_recycle -tao_type owlqn -tao_ls_monitor 0 TAO, Function value: 0., Residual: 0. TAO solve converged due to CONVERGED_GATOL iterations 0 fails right off the bat, somehow the initial residual norm is 0, which should not depend on the solver (maybe a bug in Tao?) bmrm gets stuck far from the minimum found by the Newton methods. 1719 TAO, Function value: -2.36706e-06, Residual: 1.94494e-09 I realize this is still far from the problem you reported (and I agree is a bug), I am working to understand enough to provide a proper fix to the bug instead of just doing something ad hoc. Barry > On Nov 4, 2022, at 7:43 AM, Stephan K?hler wrote: > > Barry, > > this is a nonartificial code. This is a problem in the ALMM subsolver. I want to solve a problem with a TaoALMM solver what then happens is: > > TaoSolve(tao) /* TaoALMM solver */ > | > | > |--------> This calls the TaoALMM subsolver routine > > TaoSolve(subsolver) > | > | > |-----------> The subsolver does not correctly work, at least with an Armijo line search, since the solution is overwritten within the line search. > In my case, the subsolver does not make any progress although it is possible. 
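For contrast with the aliased behavior in the quoted report, a textbook Armijo backtracking loop keeps the base point fixed and only shrinks the step length. A minimal one-dimensional Python sketch (illustrative only, made-up names, not the PETSc implementation):

```python
# Minimal Armijo backtracking (illustrative, not PETSc's implementation).
# Accept x + alpha*d once f(x + alpha*d) <= f(x) + c1*alpha*grad(x)*d;
# the base point x is never modified while alpha shrinks.

def armijo(f, grad, x, d, c1=1e-4, beta=0.5, max_backtracks=30):
    fx, gx = f(x), grad(x)
    alpha = 1.0
    for _ in range(max_backtracks):
        if f(x + alpha * d) <= fx + c1 * alpha * gx * d:
            return x + alpha * d, alpha  # sufficient decrease achieved
        alpha *= beta                    # shrink the step, base x unchanged
    return x, 0.0                        # line-search failure

# Minimize f(x) = x^2 from x = 1 along the steepest-descent direction.
x_new, alpha = armijo(lambda x: x * x, lambda x: 2.0 * x, 1.0, -2.0)
```

Here the full step alpha = 1 overshoots (f(-1) = f(1)), is rejected, and the retry correctly evaluates x + 0.5 * d from the original base point.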
> > To get to my real problem you can simply change line 268 to if(0) (from if(1) -----> if(0)) and line 317 from // ierr = TaoSolve(tao); CHKERRQ(ierr); -------> ierr = TaoSolve(tao); CHKERRQ(ierr); > What you can see is that the solver does not make any progress, but it should make progress. > > To be honest, I do not really know why the option -tao_almm_subsolver_tao_ls_monitor has know effect if the ALMM solver is called and not the subsolver. I also do not know why -tao_almm_subsolver_tao_view prints as termination reason for the subsolver > > Solution converged: ||g(X)|| <= gatol > > This is obviously not the case. I set the tolerance > -tao_almm_subsolver_tao_gatol 1e-8 \ > -tao_almm_subsolver_tao_grtol 1e-8 \ > > I encountered this and then I looked into the ALMM class and therefore I tried to call the subsolver (previous example). > > I attach the updated programm and also the options. > > Stephan > > > > > > > On 03.11.22 22:15, Barry Smith wrote: >> >> Thanks for your response and the code. I understand the potential problem and how your code demonstrates a bug if the TaoALMMSubsolverObjective() is used in the manner you use in the example where you directly call TaoComputeObjective() multiple times line a line search code might. >> >> What I don't have or understand is how to reproduce the problem in a real code that uses Tao. That is where the Tao Armijo line search code has a problem when it is used (somehow) in a Tao solver with ALMM. You suggest "If you have an example for your own, you can switch the Armijo line search by the option -tao_ls_type armijo. The thing is that it will cause no problems if the line search accepts the steps with step length one." I don't see how to do this if I use -tao_type almm I cannot use -tao_ls_type armijo; that is the option -tao_ls_type doesn't seem to me to be usable in the context of almm (since almm internally does directly its own trust region approach for globalization). 
If we remove the if (1) code from your example, is there some Tao options I can use to get the bug to appear inside the Tao solve? >> >> I'll try to explain again, I agree that the fact that the Tao solution is aliased (within the ALMM solver) is a problem with repeated calls to TaoComputeObjective() but I cannot see how these repeated calls could ever happen in the use of TaoSolve() with the ALMM solver. That is when is this "design problem" a true problem as opposed to just a potential problem that can be demonstrated in artificial code? >> >> The reason I need to understand the non-artificial situation it breaks things is to come up with an appropriate correction for the current code. >> >> Barry >> >> >> >> >> >> >> >>> On Nov 3, 2022, at 12:46 PM, Stephan K?hler wrote: >>> >>> Barry, >>> >>> so far, I have not experimented with trust-region methods, but I can imagine that this "design feature" causes no problem for trust-region methods, if the old point is saved and after the trust-region check fails the old point is copied to the actual point. But the implementation of the Armijo line search method does not work that way. Here, the actual point will always be overwritten. Only if the line search fails, then the old point is restored, but then the TaoSolve method ends with a line search failure. >>> >>> If you have an example for your own, you can switch the Armijo line search by the option -tao_ls_type armijo. The thing is that it will cause no problems if the line search accepts the steps with step length one. >>> It is also possible that, by luck, it will cause no problems, if the "excessive" step brings a reduction of the objective >>> >>> Otherwise, I attach my example, which is not minimal, but here you can see that it causes problems. You need to set the paths to the PETSc library in the makefile. You find the options for this problem in the run_test_tao_neohooke.sh script. 
>>> The import part begins at line 292 in test_tao_neohooke.cpp >>> >>> Stephan >>> >>> On 02.11.22 19:04, Barry Smith wrote: >>>> Stephan, >>>> >>>> I have located the troublesome line in TaoSetUp_ALMM() it has the line >>>> >>>> auglag->Px = tao->solution; >>>> >>>> and in alma.h it has >>>> >>>> Vec Px, LgradX, Ce, Ci, G; /* aliased vectors (do not destroy!) */ >>>> >>>> Now auglag->P in some situations alias auglag->P and in some cases auglag->Px serves to hold a portion of auglag->P. So then in TaoALMMSubsolverObjective_Private() >>>> the lines >>>> >>>> PetscCall(VecCopy(P, auglag->P)); >>>> PetscCall((*auglag->sub_obj)(auglag->parent)); >>>> >>>> causes, just as you said, tao->solution to be overwritten by the P at which the objective function is being computed. In other words, the solution of the outer Tao is aliased with the solution of the inner Tao, by design. >>>> >>>> You are definitely correct, the use of TaoALMMSubsolverObjective_Private and TaoALMMSubsolverObjectiveAndGradient_Private in a line search would be problematic. >>>> >>>> I am not an expert at these methods or their implementations. Could you point to an actual use case within Tao that triggers the problem. Is there a set of command line options or code calls to Tao that fail due to this "design feature". Within the standard use of ALMM I do not see how the objective function would be used within a line search. The TaoSolve_ALMM() code is self-correcting in that if a trust region check fails it automatically rolls back the solution. 
>>>> >>>> Barry >>>> >>>> >>>> >>>> >>>>> On Oct 28, 2022, at 4:27 AM, Stephan K?hler wrote: >>>>> >>>>> Dear PETSc/Tao team, >>>>> >>>>> it seems to be that there is a bug in the TaoALMM class: >>>>> >>>>> In the methods TaoALMMSubsolverObjective_Private and TaoALMMSubsolverObjectiveAndGradient_Private the vector where the function value for the augmented Lagrangian is evaluate >>>>> is copied into the current solution, see, e.g., https://petsc.org/release/src/tao/constrained/impls/almm/almm.c.html line 672 or 682. This causes subsolver routine to not converge if the line search for the subsolver rejects the step length 1. for some >>>>> update. In detail: >>>>> >>>>> Suppose the current iterate is xk and the current update is dxk. The line search evaluates the augmented Lagrangian now at (xk + dxk). This causes that the value (xk + dxk) is copied in the current solution. If the point (xk + dxk) is rejected, the line search should >>>>> try the point (xk + alpha * dxk), where alpha < 1. But due to the copying, what happens is that the point ((xk + dxk) + alpha * dxk) is evaluated, see, e.g., https://petsc.org/release/src/tao/linesearch/impls/armijo/armijo.c.html line 191. 
>>>>> >>>>> Best regards >>>>> Stephan Köhler >>>>> >>>>> -- >>>>> Stephan Köhler >>>>> TU Bergakademie Freiberg >>>>> Institut für numerische Mathematik und Optimierung >>>>> >>>>> Akademiestraße 6 >>>>> 09599 Freiberg >>>>> Gebäudeteil Mittelbau, Zimmer 2.07 >>>>> >>>>> Telefon: +49 (0)3731 39-3173 (Büro) >>>>> >>>>> >>> >>> -- >>> Stephan Köhler >>> TU Bergakademie Freiberg >>> Institut für numerische Mathematik und Optimierung >>> >>> Akademiestraße 6 >>> 09599 Freiberg >>> Gebäudeteil Mittelbau, Zimmer 2.07 >>> >>> Telefon: +49 (0)3731 39-3173 (Büro) >>> >> > > -- > Stephan Köhler > TU Bergakademie Freiberg > Institut für numerische Mathematik und Optimierung > > Akademiestraße 6 > 09599 Freiberg > Gebäudeteil Mittelbau, Zimmer 2.07 > > Telefon: +49 (0)3731 39-3173 (Büro) > -------------- next part -------------- An HTML attachment was scrubbed... URL: From bsmith at petsc.dev Fri Nov 11 23:00:11 2022 From: bsmith at petsc.dev (Barry Smith) Date: Sat, 12 Nov 2022 00:00:11 -0500 Subject: [petsc-users] Report Bug TaoALMM class In-Reply-To: <892a51c2-17f7-ac1f-f55d-05981978a4f4@math.tu-freiberg.de> References: <4eec06f9-d534-7a02-9abe-6d1415f663f0@math.tu-freiberg.de> <14f2cdd6-9cbe-20a6-0c7d-3006b2ee4dc1@math.tu-freiberg.de> <5E53FE56-5C68-4F06-8A48-54ACBDC800C7@petsc.dev> <892a51c2-17f7-ac1f-f55d-05981978a4f4@math.tu-freiberg.de> Message-ID: <2E5E8937-D739-4CFB-9A21-28DFBF5791B5@petsc.dev> I noticed this in the TAOALMM manual page. It is also highly recommended that the subsolver chosen by the user utilize a trust-region strategy for globalization (default: TAOBQNKTR), especially if the outer problem features bound constraints. I am far from an expert on these topics. > On Nov 4, 2022, at 7:43 AM, Stephan Köhler wrote: > > Barry, > > this is non-artificial code. This is a problem in the ALMM subsolver.
I want to solve a problem with a TaoALMM solver; what then happens is: > > TaoSolve(tao) /* TaoALMM solver */ > | > | > |--------> This calls the TaoALMM subsolver routine > > TaoSolve(subsolver) > | > | > |-----------> The subsolver does not work correctly, at least with an Armijo line search, since the solution is overwritten within the line search. > In my case, the subsolver does not make any progress although it is possible. > > To get to my real problem you can simply change line 268 to if(0) (from if(1) -----> if(0)) and line 317 from // ierr = TaoSolve(tao); CHKERRQ(ierr); -------> ierr = TaoSolve(tao); CHKERRQ(ierr); > What you can see is that the solver does not make any progress, but it should make progress. > > To be honest, I do not really know why the option -tao_almm_subsolver_tao_ls_monitor has no effect if the ALMM solver is called and not the subsolver. I also do not know why -tao_almm_subsolver_tao_view prints as termination reason for the subsolver > > Solution converged: ||g(X)|| <= gatol > > This is obviously not the case. I set the tolerances > -tao_almm_subsolver_tao_gatol 1e-8 \ > -tao_almm_subsolver_tao_grtol 1e-8 \ > > I encountered this and then I looked into the ALMM class, and therefore I tried to call the subsolver (previous example). > > I attach the updated program and also the options. > > Stephan > > > > > > > On 03.11.22 22:15, Barry Smith wrote: >> >> Thanks for your response and the code. I understand the potential problem and how your code demonstrates a bug if the TaoALMMSubsolverObjective() is used in the manner you use in the example, where you directly call TaoComputeObjective() multiple times like a line search code might. >> >> What I don't have or understand is how to reproduce the problem in a real code that uses Tao. That is, where the Tao Armijo line search code has a problem when it is used (somehow) in a Tao solver with ALMM.
You suggest "If you have an example of your own, you can switch to the Armijo line search with the option -tao_ls_type armijo. The thing is that it will cause no problems if the line search accepts the steps with step length one." I don't see how to do this: if I use -tao_type almm I cannot use -tao_ls_type armijo; that is, the option -tao_ls_type doesn't seem to me to be usable in the context of almm (since almm internally does directly its own trust region approach for globalization). If we remove the if (1) code from your example, are there some Tao options I can use to get the bug to appear inside the Tao solve? >> >> I'll try to explain again: I agree that the fact that the Tao solution is aliased (within the ALMM solver) is a problem with repeated calls to TaoComputeObjective(), but I cannot see how these repeated calls could ever happen in the use of TaoSolve() with the ALMM solver. That is, when is this "design problem" a true problem, as opposed to just a potential problem that can be demonstrated in artificial code? >> >> The reason I need to understand the non-artificial situation in which it breaks things is to come up with an appropriate correction for the current code. >> >> Barry >> >> >> >> >> >> >> >>> On Nov 3, 2022, at 12:46 PM, Stephan Köhler wrote: >>> >>> Barry, >>> >>> so far, I have not experimented with trust-region methods, but I can imagine that this "design feature" causes no problem for trust-region methods, if the old point is saved and, after the trust-region check fails, the old point is copied back to the current point. But the implementation of the Armijo line search method does not work that way. Here, the current point will always be overwritten. Only if the line search fails is the old point restored, but then the TaoSolve method ends with a line search failure. >>> >>> If you have an example of your own, you can switch to the Armijo line search with the option -tao_ls_type armijo.
The thing is that it will cause no problems if the line search accepts the steps with step length one. >>> It is also possible that, by luck, it will cause no problems if the "excessive" step brings a reduction of the objective. >>> >>> Otherwise, I attach my example, which is not minimal, but here you can see that it causes problems. You need to set the paths to the PETSc library in the makefile. You can find the options for this problem in the run_test_tao_neohooke.sh script. >>> The important part begins at line 292 in test_tao_neohooke.cpp >>> >>> Stephan >>> >>> On 02.11.22 19:04, Barry Smith wrote: >>>> Stephan, >>>> >>>> I have located the troublesome line in TaoSetUp_ALMM(); it has the line >>>> >>>> auglag->Px = tao->solution; >>>> >>>> and in almm.h it has >>>> >>>> Vec Px, LgradX, Ce, Ci, G; /* aliased vectors (do not destroy!) */ >>>> >>>> Now auglag->Px in some situations aliases auglag->P, and in some cases auglag->Px serves to hold a portion of auglag->P. So then in TaoALMMSubsolverObjective_Private() >>>> the lines >>>> >>>> PetscCall(VecCopy(P, auglag->P)); >>>> PetscCall((*auglag->sub_obj)(auglag->parent)); >>>> >>>> cause, just as you said, tao->solution to be overwritten by the P at which the objective function is being computed. In other words, the solution of the outer Tao is aliased with the solution of the inner Tao, by design. >>>> >>>> You are definitely correct, the use of TaoALMMSubsolverObjective_Private and TaoALMMSubsolverObjectiveAndGradient_Private in a line search would be problematic. >>>> >>>> I am not an expert on these methods or their implementations. Could you point to an actual use case within Tao that triggers the problem? Is there a set of command line options or code calls to Tao that fail due to this "design feature"? Within the standard use of ALMM I do not see how the objective function would be used within a line search.
The TaoSolve_ALMM() code is self-correcting in that if a trust region check fails it automatically rolls back the solution. >>>> >>>> Barry >>>> >>>> >>>> >>>>> On Oct 28, 2022, at 4:27 AM, Stephan Köhler wrote: >>>>> >>>>> Dear PETSc/Tao team, >>>>> >>>>> it seems that there is a bug in the TaoALMM class: >>>>> >>>>> In the methods TaoALMMSubsolverObjective_Private and TaoALMMSubsolverObjectiveAndGradient_Private the vector where the function value for the augmented Lagrangian is evaluated >>>>> is copied into the current solution, see, e.g., https://petsc.org/release/src/tao/constrained/impls/almm/almm.c.html line 672 or 682. This causes the subsolver routine to not converge if the line search for the subsolver rejects the step length 1.0 for some >>>>> update. In detail: >>>>> >>>>> Suppose the current iterate is xk and the current update is dxk. The line search now evaluates the augmented Lagrangian at (xk + dxk). This causes the value (xk + dxk) to be copied into the current solution. If the point (xk + dxk) is rejected, the line search should >>>>> try the point (xk + alpha * dxk), where alpha < 1. But due to the copying, what happens is that the point ((xk + dxk) + alpha * dxk) is evaluated, see, e.g., https://petsc.org/release/src/tao/linesearch/impls/armijo/armijo.c.html line 191.
>>>>> >>>>> Best regards >>>>> Stephan Köhler >>>>> >>>>> -- >>>>> Stephan Köhler >>>>> TU Bergakademie Freiberg >>>>> Institut für numerische Mathematik und Optimierung >>>>> >>>>> Akademiestraße 6 >>>>> 09599 Freiberg >>>>> Gebäudeteil Mittelbau, Zimmer 2.07 >>>>> >>>>> Telefon: +49 (0)3731 39-3173 (Büro) >>>>> >>>>> >>> >>> -- >>> Stephan Köhler >>> TU Bergakademie Freiberg >>> Institut für numerische Mathematik und Optimierung >>> >>> Akademiestraße 6 >>> 09599 Freiberg >>> Gebäudeteil Mittelbau, Zimmer 2.07 >>> >>> Telefon: +49 (0)3731 39-3173 (Büro) >>> >> > > -- > Stephan Köhler > TU Bergakademie Freiberg > Institut für numerische Mathematik und Optimierung > > Akademiestraße 6 > 09599 Freiberg > Gebäudeteil Mittelbau, Zimmer 2.07 > > Telefon: +49 (0)3731 39-3173 (Büro) > -------------- next part -------------- An HTML attachment was scrubbed... URL: From t.hisch at gmail.com Mon Nov 14 06:43:22 2022 From: t.hisch at gmail.com (Thomas Hisch) Date: Mon, 14 Nov 2022 13:43:22 +0100 Subject: [petsc-users] Use Vectors and Scalars in MatNest matrices Message-ID: Hi! I would like to ask if it is possible to create a MatNest matrix that contains matrices, vectors and scalars. My (jacobian) matrix looks like the following [[A, B, a, b], [C, D, c, d], [e, f, g, h]] where A-D are sparse square matrices, a-d are dense column vectors, e, f are sparse row vectors and g,h are just scalars. Is there also a petsc4py example that shows how nest objects should be created and how they should be used? Best regards Thomas From knepley at gmail.com Mon Nov 14 08:06:38 2022 From: knepley at gmail.com (Matthew Knepley) Date: Mon, 14 Nov 2022 09:06:38 -0500 Subject: [petsc-users] Use Vectors and Scalars in MatNest matrices In-Reply-To: References: Message-ID: On Mon, Nov 14, 2022 at 8:59 AM Thomas Hisch wrote: > Hi! > > I would like to ask if it is possible to create a MatNest matrix that > contains matrices, vectors and scalars.
> No. You would turn everything into a type of matrix, but you can get the effect you want. Vectors can be easily used to create dense matrices, or vice versa. The scalars would also be dense matrices (and can share storage). Sparse vectors are just sparse matrices in PETSc. Thanks, Matt > My (jacobian) matrix looks like the following > > [[A, B, a, b], > [C, D, c, d], > [e, f, g, h]] > > where A-D are sparse square matrices, a-d are dense column vectors, e, > f are sparse row vectors and g,h are just scalars. > > Is there also a petsc4py example that shows how nest objects should be > created and how they should be used? > > Best regards > Thomas > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From mfadams at lbl.gov Mon Nov 14 08:05:12 2022 From: mfadams at lbl.gov (Mark Adams) Date: Mon, 14 Nov 2022 09:05:12 -0500 Subject: [petsc-users] Use Vectors and Scalars in MatNest matrices In-Reply-To: References: Message-ID: Not exactly. But you can make a Mat for each. Mark On Mon, Nov 14, 2022 at 8:59 AM Thomas Hisch wrote: > Hi! > > I would like to ask if it is possible to create a MatNest matrix that > contains matrices, vectors and scalars. > > My (jacobian) matrix looks like the following > > [[A, B, a, b], > [C, D, c, d], > [e, f, g, h]] > > where A-D are sparse square matrices, a-d are dense column vectors, e, > f are sparse row vectors and g,h are just scalars. > > Is there also a petsc4py example that shows how nest objects should be > created and how they should be used? > > Best regards > Thomas > -------------- next part -------------- An HTML attachment was scrubbed...
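The idea in the reply above — every slot of the nest must itself be a matrix, with column vectors stored as n x 1 dense matrices, row vectors as 1 x n matrices, and scalars as 1 x 1 matrices — can be sketched without PETSc. The following is a hypothetical pure-Python illustration of the shape bookkeeping only; in petsc4py one would build each block as a `Mat` and combine them with `Mat().createNest()`:

```python
def block_assemble(blocks):
    """Flatten a 2-D grid of dense blocks (lists of lists) into one matrix.

    Every entry must already be a matrix: column vectors as n x 1 blocks,
    row vectors as 1 x n blocks, and scalars as 1 x 1 blocks -- mirroring
    how a MatNest wants a Mat in every slot.
    """
    rows = []
    for brow in blocks:
        height = len(brow[0])          # all blocks in a block-row share a height
        for r in range(height):
            rows.append([x for blk in brow for x in blk[r]])
    return rows

A = [[1, 0], [0, 1]]   # a 2 x 2 (sparse) block, stored dense here
a = [[5], [6]]         # column vector as a 2 x 1 matrix
e = [[7, 8]]           # row vector as a 1 x 2 matrix
g = [[9]]              # scalar as a 1 x 1 matrix

J = block_assemble([[A, a],
                    [e, g]])
print(J)  # [[1, 0, 5], [0, 1, 6], [7, 8, 9]]
```

The same bookkeeping scales to the full 3 x 4 block layout in the question; only the per-slot shapes have to conform along each block row and column.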
URL: From t.hisch at gmail.com Mon Nov 14 10:07:58 2022 From: t.hisch at gmail.com (Thomas Hisch) Date: Mon, 14 Nov 2022 17:07:58 +0100 Subject: [petsc-users] Use Vectors and Scalars in MatNest matrices In-Reply-To: References: Message-ID: On Mon, Nov 14, 2022 at 3:06 PM Matthew Knepley wrote: > > On Mon, Nov 14, 2022 at 8:59 AM Thomas Hisch wrote: >> >> Hi! >> >> I would like to ask if it is possible to create a MatNest matrix that >> contains matrices, vectors and scalars. > > > No, You would turn everything into a type of matrix, but you can get the effect you want. > Vectors can be easily used to create dense matrices, or vice versa. The scalars would also > be dense matrices (and can share storage). Sparse vectors are just sparse matrices in > PETSc. Good. I'll work on an example and will then ask you if I'm using the api in the correct way. Thx > > Thanks, > > Matt > >> >> My (jacobian) matrix looks like the following >> >> [[A, B, a, b], >> [C, D, c, d], >> [e, f, g, h]] >> >> where A-D are sparse square matrices, a-d are dense column vectors, e, >> f are sparse row vectors and g,h are just scalars. >> >> Is there also a petsc4py example that shows how nest objects should be >> created and how they should be used? >> >> Best regards >> Thomas > > > > -- > What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. > -- Norbert Wiener > > https://www.cse.buffalo.edu/~knepley/ From facklerpw at ornl.gov Mon Nov 14 12:13:33 2022 From: facklerpw at ornl.gov (Fackler, Philip) Date: Mon, 14 Nov 2022 18:13:33 +0000 Subject: [petsc-users] Kokkos backend for Mat and Vec diverging when running on CUDA device. Message-ID: This is an issue I've brought up before (and discussed in-person with Richard). I wanted to bring it up again because I'm hitting the limits of what I know to do, and I need help figuring this out. 
The problem can be reproduced using Xolotl's "develop" branch built against a petsc build with kokkos and kokkos-kernels enabled. Then, either add the relevant kokkos options to the "petscArgs=" line in the system test parameter file(s), or just replace the system test parameter files with the ones from the "feature-petsc-kokkos" branch. See here the files that begin with "params_system_". Note that those files use the "kokkos" options, but the problem is similar using the corresponding cuda/cusparse options. I've already tried building kokkos-kernels with no TPLs and got slightly different results, but the same problem. Any help would be appreciated. Thanks, Philip Fackler Research Software Engineer, Application Engineering Group Advanced Computing Systems Research Section Computer Science and Mathematics Division Oak Ridge National Laboratory -------------- next part -------------- An HTML attachment was scrubbed... URL: From facklerpw at ornl.gov Mon Nov 14 12:24:19 2022 From: facklerpw at ornl.gov (Fackler, Philip) Date: Mon, 14 Nov 2022 18:24:19 +0000 Subject: [petsc-users] Using multiple MPI ranks with COO interface crashes in some cases Message-ID: In Xolotl's "feature-petsc-kokkos" branch, I have moved our code to use the COO interface for preallocating and setting values in the Jacobian matrix. I have found that with some of our test cases, using more than one MPI rank results in a crash. Way down in the preconditioner code in petsc a Mat gets computed that has "null" for the "productsymbolic" member of its "ops". It's pretty far removed from where we compute the Jacobian entries, so I haven't been able (so far) to track it back to an error in my code. I'd appreciate some help with this from someone who is more familiar with the petsc guts so we can figure out what I'm doing wrong. (I'm assuming it's a bug in Xolotl.) Note that this is using the kokkos backend for Mat and Vec in petsc, but with a serial-only build of kokkos and kokkos-kernels. 
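For readers unfamiliar with the COO interface mentioned above, the caller hands PETSc flat arrays of row indices, column indices, and values (MatSetPreallocationCOO() followed by MatSetValuesCOO()), and with ADD_VALUES repeated (i, j) pairs are accumulated. A hypothetical pure-Python sketch of that accumulation semantics, not of the PETSc API itself:

```python
from collections import defaultdict

def coo_accumulate(rows, cols, vals):
    """Sum duplicate (i, j) triplets, mimicking COO assembly with ADD_VALUES."""
    acc = defaultdict(float)
    for i, j, v in zip(rows, cols, vals):
        acc[(i, j)] += v
    return dict(acc)

# Two stencil contributions hit the same (0, 1) entry:
rows = [0, 0, 0, 1]
cols = [0, 1, 1, 1]
vals = [2.0, 1.0, 0.5, 3.0]

M = coo_accumulate(rows, cols, vals)
print(M)  # {(0, 0): 2.0, (0, 1): 1.5, (1, 1): 3.0}
```

The appeal of the pattern is that the triplet arrays can be computed once (preallocation) and the values refilled every Jacobian evaluation without further communication setup.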
So, it's a CPU-only multiple MPI rank run. Here's a paste of the error output showing the relevant parts of the call stack: [ERROR] [0]PETSC ERROR: [ERROR] --------------------- Error Message -------------------------------------------------------------- [ERROR] [1]PETSC ERROR: [ERROR] --------------------- Error Message -------------------------------------------------------------- [ERROR] [1]PETSC ERROR: [ERROR] [0]PETSC ERROR: [ERROR] No support for this operation for this object type [ERROR] [1]PETSC ERROR: [ERROR] No support for this operation for this object type [ERROR] [0]PETSC ERROR: [ERROR] No method productsymbolic for Mat of type (null) [ERROR] No method productsymbolic for Mat of type (null) [ERROR] [0]PETSC ERROR: [ERROR] [1]PETSC ERROR: [ERROR] See https://petsc.org/release/faq/ for trouble shooting. [ERROR] See https://petsc.org/release/faq/ for trouble shooting. [ERROR] [0]PETSC ERROR: [ERROR] [1]PETSC ERROR: [ERROR] Petsc Development GIT revision: v3.18.1-115-gdca010e0e9a GIT Date: 2022-10-28 14:39:41 +0000 [ERROR] Petsc Development GIT revision: v3.18.1-115-gdca010e0e9a GIT Date: 2022-10-28 14:39:41 +0000 [ERROR] [1]PETSC ERROR: [ERROR] [0]PETSC ERROR: [ERROR] Unknown Name on a named PC0115427 by 4pf Mon Nov 14 13:22:01 2022 [ERROR] Unknown Name on a named PC0115427 by 4pf Mon Nov 14 13:22:01 2022 [ERROR] [1]PETSC ERROR: [ERROR] [0]PETSC ERROR: [ERROR] Configure options PETSC_DIR=/home/4pf/repos/petsc PETSC_ARCH=arch-kokkos-serial-debug --with-debugging=1 --with-cc=mpicc --with-cxx=mpicxx --with-fc=0 --with-cudac=0 --prefix=/home/4pf/build/petsc/serial-debug/install --with-64-bit-indices --with-shared-libraries --with-kokkos-dir=/home/4pf/build/kokkos/serial/install --with-kokkos-kernels-dir=/home/4pf/build/kokkos-kernels/serial/install [ERROR] Configure options PETSC_DIR=/home/4pf/repos/petsc PETSC_ARCH=arch-kokkos-serial-debug --with-debugging=1 --with-cc=mpicc --with-cxx=mpicxx --with-fc=0 --with-cudac=0 
--prefix=/home/4pf/build/petsc/serial-debug/install --with-64-bit-indices --with-shared-libraries --with-kokkos-dir=/home/4pf/build/kokkos/serial/install --with-kokkos-kernels-dir=/home/4pf/build/kokkos-kernels/serial/install [ERROR] [1]PETSC ERROR: [ERROR] [0]PETSC ERROR: [ERROR] #1 MatProductSymbolic_MPIAIJKokkos_AB() at /home/4pf/repos/petsc/src/mat/impls/aij/mpi/kokkos/mpiaijkok.kokkos.cxx:918 [ERROR] #1 MatProductSymbolic_MPIAIJKokkos_AB() at /home/4pf/repos/petsc/src/mat/impls/aij/mpi/kokkos/mpiaijkok.kokkos.cxx:918 [ERROR] [1]PETSC ERROR: [ERROR] [0]PETSC ERROR: [ERROR] #2 MatProductSymbolic_MPIAIJKokkos() at /home/4pf/repos/petsc/src/mat/impls/aij/mpi/kokkos/mpiaijkok.kokkos.cxx:1138 [ERROR] #2 MatProductSymbolic_MPIAIJKokkos() at /home/4pf/repos/petsc/src/mat/impls/aij/mpi/kokkos/mpiaijkok.kokkos.cxx:1138 [ERROR] [1]PETSC ERROR: [ERROR] [0]PETSC ERROR: [ERROR] #3 MatProductSymbolic() at /home/4pf/repos/petsc/src/mat/interface/matproduct.c:793 [ERROR] #3 MatProductSymbolic() at /home/4pf/repos/petsc/src/mat/interface/matproduct.c:793 [ERROR] [1]PETSC ERROR: [ERROR] [0]PETSC ERROR: [ERROR] #4 MatProduct_Private() at /home/4pf/repos/petsc/src/mat/interface/matrix.c:9820 [ERROR] #4 MatProduct_Private() at /home/4pf/repos/petsc/src/mat/interface/matrix.c:9820 [ERROR] [0]PETSC ERROR: [ERROR] [1]PETSC ERROR: [ERROR] #5 MatMatMult() at /home/4pf/repos/petsc/src/mat/interface/matrix.c:9897 [ERROR] #5 MatMatMult() at /home/4pf/repos/petsc/src/mat/interface/matrix.c:9897 [ERROR] [0]PETSC ERROR: [ERROR] [1]PETSC ERROR: [ERROR] #6 PCGAMGOptProlongator_AGG() at /home/4pf/repos/petsc/src/ksp/pc/impls/gamg/agg.c:769 [ERROR] #6 PCGAMGOptProlongator_AGG() at /home/4pf/repos/petsc/src/ksp/pc/impls/gamg/agg.c:769 [ERROR] [0]PETSC ERROR: [ERROR] [1]PETSC ERROR: [ERROR] #7 PCSetUp_GAMG() at /home/4pf/repos/petsc/src/ksp/pc/impls/gamg/gamg.c:639 [ERROR] #7 PCSetUp_GAMG() at /home/4pf/repos/petsc/src/ksp/pc/impls/gamg/gamg.c:639 [ERROR] [1]PETSC ERROR: [ERROR] [0]PETSC ERROR: 
[ERROR] #8 PCSetUp() at /home/4pf/repos/petsc/src/ksp/pc/interface/precon.c:994 [ERROR] #8 PCSetUp() at /home/4pf/repos/petsc/src/ksp/pc/interface/precon.c:994 [ERROR] [1]PETSC ERROR: [ERROR] [0]PETSC ERROR: [ERROR] #9 KSPSetUp() at /home/4pf/repos/petsc/src/ksp/ksp/interface/itfunc.c:406 [ERROR] #9 KSPSetUp() at /home/4pf/repos/petsc/src/ksp/ksp/interface/itfunc.c:406 [ERROR] [1]PETSC ERROR: [ERROR] [0]PETSC ERROR: [ERROR] #10 KSPSolve_Private() at /home/4pf/repos/petsc/src/ksp/ksp/interface/itfunc.c:825 [ERROR] #10 KSPSolve_Private() at /home/4pf/repos/petsc/src/ksp/ksp/interface/itfunc.c:825 [ERROR] [0]PETSC ERROR: [ERROR] [1]PETSC ERROR: [ERROR] #11 KSPSolve() at /home/4pf/repos/petsc/src/ksp/ksp/interface/itfunc.c:1071 [ERROR] #11 KSPSolve() at /home/4pf/repos/petsc/src/ksp/ksp/interface/itfunc.c:1071 [ERROR] [1]PETSC ERROR: [ERROR] [0]PETSC ERROR: [ERROR] #12 PCApply_FieldSplit() at /home/4pf/repos/petsc/src/ksp/pc/impls/fieldsplit/fieldsplit.c:1246 [ERROR] #12 PCApply_FieldSplit() at /home/4pf/repos/petsc/src/ksp/pc/impls/fieldsplit/fieldsplit.c:1246 [ERROR] [1]PETSC ERROR: [ERROR] [0]PETSC ERROR: [ERROR] #13 PCApply() at /home/4pf/repos/petsc/src/ksp/pc/interface/precon.c:441 [ERROR] #13 PCApply() at /home/4pf/repos/petsc/src/ksp/pc/interface/precon.c:441 [ERROR] [1]PETSC ERROR: [ERROR] [0]PETSC ERROR: [ERROR] #14 KSP_PCApply() at /home/4pf/repos/petsc/include/petsc/private/kspimpl.h:380 [ERROR] #14 KSP_PCApply() at /home/4pf/repos/petsc/include/petsc/private/kspimpl.h:380 [ERROR] [1]PETSC ERROR: [ERROR] [0]PETSC ERROR: [ERROR] #15 KSPFGMRESCycle() at /home/4pf/repos/petsc/src/ksp/ksp/impls/gmres/fgmres/fgmres.c:152 [ERROR] #15 KSPFGMRESCycle() at /home/4pf/repos/petsc/src/ksp/ksp/impls/gmres/fgmres/fgmres.c:152 [ERROR] [1]PETSC ERROR: [ERROR] [0]PETSC ERROR: [ERROR] #16 KSPSolve_FGMRES() at /home/4pf/repos/petsc/src/ksp/ksp/impls/gmres/fgmres/fgmres.c:273 [ERROR] #16 KSPSolve_FGMRES() at /home/4pf/repos/petsc/src/ksp/ksp/impls/gmres/fgmres/fgmres.c:273 
[ERROR] [1]PETSC ERROR: [ERROR] [0]PETSC ERROR: [ERROR] #17 KSPSolve_Private() at /home/4pf/repos/petsc/src/ksp/ksp/interface/itfunc.c:899 [ERROR] #17 KSPSolve_Private() at /home/4pf/repos/petsc/src/ksp/ksp/interface/itfunc.c:899 [ERROR] [0]PETSC ERROR: [ERROR] [1]PETSC ERROR: [ERROR] #18 KSPSolve() at /home/4pf/repos/petsc/src/ksp/ksp/interface/itfunc.c:1071 [ERROR] #18 KSPSolve() at /home/4pf/repos/petsc/src/ksp/ksp/interface/itfunc.c:1071 [ERROR] [0]PETSC ERROR: [ERROR] [1]PETSC ERROR: [ERROR] #19 SNESSolve_NEWTONLS() at /home/4pf/repos/petsc/src/snes/impls/ls/ls.c:210 [ERROR] #19 SNESSolve_NEWTONLS() at /home/4pf/repos/petsc/src/snes/impls/ls/ls.c:210 [ERROR] [1]PETSC ERROR: [ERROR] [0]PETSC ERROR: [ERROR] #20 SNESSolve() at /home/4pf/repos/petsc/src/snes/interface/snes.c:4689 [ERROR] #20 SNESSolve() at /home/4pf/repos/petsc/src/snes/interface/snes.c:4689 [ERROR] [1]PETSC ERROR: [ERROR] [0]PETSC ERROR: [ERROR] #21 TSStep_ARKIMEX() at /home/4pf/repos/petsc/src/ts/impls/arkimex/arkimex.c:791 [ERROR] #21 TSStep_ARKIMEX() at /home/4pf/repos/petsc/src/ts/impls/arkimex/arkimex.c:791 [ERROR] [1]PETSC ERROR: [ERROR] [0]PETSC ERROR: [ERROR] #22 TSStep() at /home/4pf/repos/petsc/src/ts/interface/ts.c:3445 [ERROR] #22 TSStep() at /home/4pf/repos/petsc/src/ts/interface/ts.c:3445 [ERROR] [1]PETSC ERROR: [ERROR] [0]PETSC ERROR: [ERROR] #23 TSSolve() at /home/4pf/repos/petsc/src/ts/interface/ts.c:3836 [ERROR] #23 TSSolve() at /home/4pf/repos/petsc/src/ts/interface/ts.c:3836 [ERROR] PetscSolver::solve: TSSolve failed. [ERROR] PetscSolver::solve: TSSolve failed. Aborting. Aborting. Thanks for the help, Philip Fackler Research Software Engineer, Application Engineering Group Advanced Computing Systems Research Section Computer Science and Mathematics Division Oak Ridge National Laboratory -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From bsmith at petsc.dev Mon Nov 14 13:13:04 2022 From: bsmith at petsc.dev (Barry Smith) Date: Mon, 14 Nov 2022 14:13:04 -0500 Subject: [petsc-users] On PCFIELDSPLIT and its implementation In-Reply-To: References: Message-ID: Very sorry for wasting so much of your time. The PCFIELDSPLIT generally will not work with BAIJ matrices because the MatCreateSubMatrix() for BAIJ requires indexing by block in the matrix. Your code should work if you use MPIAIJ matrices. Note you can still use MatSetValuesBlocked() with MPIAIJ matrices. Barry > On Nov 10, 2022, at 5:24 PM, Edoardo alinovi wrote: > > True, > > Maybe somebody merged it already? I have attached my silly example. > > To compile: > mpifort -L$PETSC_DIR/$PETSC_ARCH/lib -lpetsc -fdefault-real-8 -o test test.F90 -I$PETSC_DIR/include -I$PETSC_DIR/$PETSC_ARCH/include > > Do you need the petsc code Matt did? > From bsmith at petsc.dev Mon Nov 14 13:33:49 2022 From: bsmith at petsc.dev (Barry Smith) Date: Mon, 14 Nov 2022 14:33:49 -0500 Subject: [petsc-users] On PCFIELDSPLIT and its implementation In-Reply-To: References: Message-ID: <59032746-0A26-40CE-BCE1-FF74932B27EA@petsc.dev> Can you clarify what you mean? For some classes of problems, PCFIELDSPLIT can be a very efficacious preconditioner; for example, when certain fields have very different mathematical structure than others. In those cases it is worth using AIJ and PCFIELDSPLIT instead of keeping BAIJ. > On Nov 14, 2022, at 2:21 PM, Edoardo alinovi wrote: > > Hi Barry, no worries! > > Thanks for letting me know! It is not a problem for me to use MPIAIJ; do you think field split will be a game changer? > > > > On Mon 14 Nov 2022, 20:13 Barry Smith > wrote: >> >> Very sorry for wasting so much of your time. The PCFIELDSPLIT generally will not work with BAIJ matrices because the MatCreateSubMatrix() for BAIJ requires indexing by block in the matrix. Your code should work if you use MPIAIJ matrices.
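The point above — that PCFIELDSPLIT relies on MatCreateSubMatrix() extracting sub-blocks by arbitrary index sets, which AIJ supports but the BAIJ implementation restricts to block-aligned indexing — can be illustrated with a toy extraction. This is a hypothetical plain-Python sketch, not PETSc code; the index lists play the role of the ISs that PCFIELDSPLIT builds for each field:

```python
def submatrix(M, row_is, col_is):
    """Extract M[row_is][:, col_is] -- what PCFIELDSPLIT asks of MatCreateSubMatrix()."""
    return [[M[i][j] for j in col_is] for i in row_is]

# A 4 x 4 system with 2 nodes and 2 interleaved fields (u, p) per node,
# unknowns ordered [u0, p0, u1, p1]:
M = [[4, 1, 0, 0],
     [1, 3, 0, 1],
     [0, 0, 4, 1],
     [0, 1, 1, 3]]

u_is = [0, 2]  # index set for the u field: cuts across the 2 x 2 point blocks
p_is = [1, 3]  # index set for the p field

A_uu = submatrix(M, u_is, u_is)
A_pp = submatrix(M, p_is, p_is)
print(A_uu)  # [[4, 0], [0, 4]]
print(A_pp)  # [[3, 1], [1, 3]]
```

Because `u_is` and `p_is` pick individual rows out of the 2 x 2 point blocks, a storage format whose submatrix extraction only understands whole blocks cannot serve this request, which is why switching the storage to AIJ (while still filling it with MatSetValuesBlocked()) makes the fieldsplit work.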
Note you can still use MatSetValuesBlocked() with MPIAIJ matrices. >> >> Barry >> >> >> > On Nov 10, 2022, at 5:24 PM, Edoardo alinovi > wrote: >> > >> > True, >> > >> > Maybe somebody merged it already? I have attached my silly example. >> > >> > To compile: >> > mpifort -L$PETSC_DIR/$PETSC_ARCH/lib -lpetsc -fdefault-real-8 -o test test.F90 -I$PETSC_DIR/include -I$PETSC_DIR/$PETSC_ARCH/include >> > >> > Do you need the petsc code Matt did? >> > >> -------------- next part -------------- An HTML attachment was scrubbed... URL: From bsmith at petsc.dev Mon Nov 14 13:58:16 2022 From: bsmith at petsc.dev (Barry Smith) Date: Mon, 14 Nov 2022 14:58:16 -0500 Subject: [petsc-users] Using multiple MPI ranks with COO interface crashes in some cases In-Reply-To: References: Message-ID: <42C7B5E0-FCDD-4040-9595-87645C1983F4@petsc.dev> Mat of type (null) Either the entire matrix (header) data structure has gotten corrupted or the matrix type was never set. Can you run with valgrind to see if there is any memory corruption? > On Nov 14, 2022, at 1:24 PM, Fackler, Philip via petsc-users wrote: > > In Xolotl's "feature-petsc-kokkos" branch, I have moved our code to use the COO interface for preallocating and setting values in the Jacobian matrix. I have found that with some of our test cases, using more than one MPI rank results in a crash. Way down in the preconditioner code in petsc a Mat gets computed that has "null" for the "productsymbolic" member of its "ops". It's pretty far removed from where we compute the Jacobian entries, so I haven't been able (so far) to track it back to an error in my code. I'd appreciate some help with this from someone who is more familiar with the petsc guts so we can figure out what I'm doing wrong. (I'm assuming it's a bug in Xolotl.) > > Note that this is using the kokkos backend for Mat and Vec in petsc, but with a serial-only build of kokkos and kokkos-kernels. So, it's a CPU-only multiple MPI rank run.
> > Here's a paste of the error output showing the relevant parts of the call stack: > > [ERROR] [0]PETSC ERROR: > [ERROR] --------------------- Error Message -------------------------------------------------------------- > [ERROR] [1]PETSC ERROR: > [ERROR] --------------------- Error Message -------------------------------------------------------------- > [ERROR] [1]PETSC ERROR: > [ERROR] [0]PETSC ERROR: > [ERROR] No support for this operation for this object type > [ERROR] [1]PETSC ERROR: > [ERROR] No support for this operation for this object type > [ERROR] [0]PETSC ERROR: > [ERROR] No method productsymbolic for Mat of type (null) > [ERROR] No method productsymbolic for Mat of type (null) > [ERROR] [0]PETSC ERROR: > [ERROR] [1]PETSC ERROR: > [ERROR] See https://petsc.org/release/faq/ for trouble shooting. > [ERROR] See https://petsc.org/release/faq/ for trouble shooting. > [ERROR] [0]PETSC ERROR: > [ERROR] [1]PETSC ERROR: > [ERROR] Petsc Development GIT revision: v3.18.1-115-gdca010e0e9a GIT Date: 2022-10-28 14:39:41 +0000 > [ERROR] Petsc Development GIT revision: v3.18.1-115-gdca010e0e9a GIT Date: 2022-10-28 14:39:41 +0000 > [ERROR] [1]PETSC ERROR: > [ERROR] [0]PETSC ERROR: > [ERROR] Unknown Name on a named PC0115427 by 4pf Mon Nov 14 13:22:01 2022 > [ERROR] Unknown Name on a named PC0115427 by 4pf Mon Nov 14 13:22:01 2022 > [ERROR] [1]PETSC ERROR: > [ERROR] [0]PETSC ERROR: > [ERROR] Configure options PETSC_DIR=/home/4pf/repos/petsc PETSC_ARCH=arch-kokkos-serial-debug --with-debugging=1 --with-cc=mpicc --with-cxx=mpicxx --with-fc=0 --with-cudac=0 --prefix=/home/4pf/build/petsc/serial-debug/install --with-64-bit-indices --with-shared-libraries --with-kokkos-dir=/home/4pf/build/kokkos/serial/install --with-kokkos-kernels-dir=/home/4pf/build/kokkos-kernels/serial/install > [ERROR] Configure options PETSC_DIR=/home/4pf/repos/petsc PETSC_ARCH=arch-kokkos-serial-debug --with-debugging=1 --with-cc=mpicc --with-cxx=mpicxx --with-fc=0 --with-cudac=0 
--prefix=/home/4pf/build/petsc/serial-debug/install --with-64-bit-indices --with-shared-libraries --with-kokkos-dir=/home/4pf/build/kokkos/serial/install --with-kokkos-kernels-dir=/home/4pf/build/kokkos-kernels/serial/install > [ERROR] [1]PETSC ERROR: > [ERROR] [0]PETSC ERROR: > [ERROR] #1 MatProductSymbolic_MPIAIJKokkos_AB() at /home/4pf/repos/petsc/src/mat/impls/aij/mpi/kokkos/mpiaijkok.kokkos.cxx:918 > [ERROR] #1 MatProductSymbolic_MPIAIJKokkos_AB() at /home/4pf/repos/petsc/src/mat/impls/aij/mpi/kokkos/mpiaijkok.kokkos.cxx:918 > [ERROR] [1]PETSC ERROR: > [ERROR] [0]PETSC ERROR: > [ERROR] #2 MatProductSymbolic_MPIAIJKokkos() at /home/4pf/repos/petsc/src/mat/impls/aij/mpi/kokkos/mpiaijkok.kokkos.cxx:1138 > [ERROR] #2 MatProductSymbolic_MPIAIJKokkos() at /home/4pf/repos/petsc/src/mat/impls/aij/mpi/kokkos/mpiaijkok.kokkos.cxx:1138 > [ERROR] [1]PETSC ERROR: > [ERROR] [0]PETSC ERROR: > [ERROR] #3 MatProductSymbolic() at /home/4pf/repos/petsc/src/mat/interface/matproduct.c:793 > [ERROR] #3 MatProductSymbolic() at /home/4pf/repos/petsc/src/mat/interface/matproduct.c:793 > [ERROR] [1]PETSC ERROR: > [ERROR] [0]PETSC ERROR: > [ERROR] #4 MatProduct_Private() at /home/4pf/repos/petsc/src/mat/interface/matrix.c:9820 > [ERROR] #4 MatProduct_Private() at /home/4pf/repos/petsc/src/mat/interface/matrix.c:9820 > [ERROR] [0]PETSC ERROR: > [ERROR] [1]PETSC ERROR: > [ERROR] #5 MatMatMult() at /home/4pf/repos/petsc/src/mat/interface/matrix.c:9897 > [ERROR] #5 MatMatMult() at /home/4pf/repos/petsc/src/mat/interface/matrix.c:9897 > [ERROR] [0]PETSC ERROR: > [ERROR] [1]PETSC ERROR: > [ERROR] #6 PCGAMGOptProlongator_AGG() at /home/4pf/repos/petsc/src/ksp/pc/impls/gamg/agg.c:769 > [ERROR] #6 PCGAMGOptProlongator_AGG() at /home/4pf/repos/petsc/src/ksp/pc/impls/gamg/agg.c:769 > [ERROR] [0]PETSC ERROR: > [ERROR] [1]PETSC ERROR: > [ERROR] #7 PCSetUp_GAMG() at /home/4pf/repos/petsc/src/ksp/pc/impls/gamg/gamg.c:639 > [ERROR] #7 PCSetUp_GAMG() at 
/home/4pf/repos/petsc/src/ksp/pc/impls/gamg/gamg.c:639 > [ERROR] [1]PETSC ERROR: > [ERROR] [0]PETSC ERROR: > [ERROR] #8 PCSetUp() at /home/4pf/repos/petsc/src/ksp/pc/interface/precon.c:994 > [ERROR] #8 PCSetUp() at /home/4pf/repos/petsc/src/ksp/pc/interface/precon.c:994 > [ERROR] [1]PETSC ERROR: > [ERROR] [0]PETSC ERROR: > [ERROR] #9 KSPSetUp() at /home/4pf/repos/petsc/src/ksp/ksp/interface/itfunc.c:406 > [ERROR] #9 KSPSetUp() at /home/4pf/repos/petsc/src/ksp/ksp/interface/itfunc.c:406 > [ERROR] [1]PETSC ERROR: > [ERROR] [0]PETSC ERROR: > [ERROR] #10 KSPSolve_Private() at /home/4pf/repos/petsc/src/ksp/ksp/interface/itfunc.c:825 > [ERROR] #10 KSPSolve_Private() at /home/4pf/repos/petsc/src/ksp/ksp/interface/itfunc.c:825 > [ERROR] [0]PETSC ERROR: > [ERROR] [1]PETSC ERROR: > [ERROR] #11 KSPSolve() at /home/4pf/repos/petsc/src/ksp/ksp/interface/itfunc.c:1071 > [ERROR] #11 KSPSolve() at /home/4pf/repos/petsc/src/ksp/ksp/interface/itfunc.c:1071 > [ERROR] [1]PETSC ERROR: > [ERROR] [0]PETSC ERROR: > [ERROR] #12 PCApply_FieldSplit() at /home/4pf/repos/petsc/src/ksp/pc/impls/fieldsplit/fieldsplit.c:1246 > [ERROR] #12 PCApply_FieldSplit() at /home/4pf/repos/petsc/src/ksp/pc/impls/fieldsplit/fieldsplit.c:1246 > [ERROR] [1]PETSC ERROR: > [ERROR] [0]PETSC ERROR: > [ERROR] #13 PCApply() at /home/4pf/repos/petsc/src/ksp/pc/interface/precon.c:441 > [ERROR] #13 PCApply() at /home/4pf/repos/petsc/src/ksp/pc/interface/precon.c:441 > [ERROR] [1]PETSC ERROR: > [ERROR] [0]PETSC ERROR: > [ERROR] #14 KSP_PCApply() at /home/4pf/repos/petsc/include/petsc/private/kspimpl.h:380 > [ERROR] #14 KSP_PCApply() at /home/4pf/repos/petsc/include/petsc/private/kspimpl.h:380 > [ERROR] [1]PETSC ERROR: > [ERROR] [0]PETSC ERROR: > [ERROR] #15 KSPFGMRESCycle() at /home/4pf/repos/petsc/src/ksp/ksp/impls/gmres/fgmres/fgmres.c:152 > [ERROR] #15 KSPFGMRESCycle() at /home/4pf/repos/petsc/src/ksp/ksp/impls/gmres/fgmres/fgmres.c:152 > [ERROR] [1]PETSC ERROR: > [ERROR] [0]PETSC ERROR: > [ERROR] #16 
KSPSolve_FGMRES() at /home/4pf/repos/petsc/src/ksp/ksp/impls/gmres/fgmres/fgmres.c:273 > [ERROR] #16 KSPSolve_FGMRES() at /home/4pf/repos/petsc/src/ksp/ksp/impls/gmres/fgmres/fgmres.c:273 > [ERROR] [1]PETSC ERROR: > [ERROR] [0]PETSC ERROR: > [ERROR] #17 KSPSolve_Private() at /home/4pf/repos/petsc/src/ksp/ksp/interface/itfunc.c:899 > [ERROR] #17 KSPSolve_Private() at /home/4pf/repos/petsc/src/ksp/ksp/interface/itfunc.c:899 > [ERROR] [0]PETSC ERROR: > [ERROR] [1]PETSC ERROR: > [ERROR] #18 KSPSolve() at /home/4pf/repos/petsc/src/ksp/ksp/interface/itfunc.c:1071 > [ERROR] #18 KSPSolve() at /home/4pf/repos/petsc/src/ksp/ksp/interface/itfunc.c:1071 > [ERROR] [0]PETSC ERROR: > [ERROR] [1]PETSC ERROR: > [ERROR] #19 SNESSolve_NEWTONLS() at /home/4pf/repos/petsc/src/snes/impls/ls/ls.c:210 > [ERROR] #19 SNESSolve_NEWTONLS() at /home/4pf/repos/petsc/src/snes/impls/ls/ls.c:210 > [ERROR] [1]PETSC ERROR: > [ERROR] [0]PETSC ERROR: > [ERROR] #20 SNESSolve() at /home/4pf/repos/petsc/src/snes/interface/snes.c:4689 > [ERROR] #20 SNESSolve() at /home/4pf/repos/petsc/src/snes/interface/snes.c:4689 > [ERROR] [1]PETSC ERROR: > [ERROR] [0]PETSC ERROR: > [ERROR] #21 TSStep_ARKIMEX() at /home/4pf/repos/petsc/src/ts/impls/arkimex/arkimex.c:791 > [ERROR] #21 TSStep_ARKIMEX() at /home/4pf/repos/petsc/src/ts/impls/arkimex/arkimex.c:791 > [ERROR] [1]PETSC ERROR: > [ERROR] [0]PETSC ERROR: > [ERROR] #22 TSStep() at /home/4pf/repos/petsc/src/ts/interface/ts.c:3445 > [ERROR] #22 TSStep() at /home/4pf/repos/petsc/src/ts/interface/ts.c:3445 > [ERROR] [1]PETSC ERROR: > [ERROR] [0]PETSC ERROR: > [ERROR] #23 TSSolve() at /home/4pf/repos/petsc/src/ts/interface/ts.c:3836 > [ERROR] #23 TSSolve() at /home/4pf/repos/petsc/src/ts/interface/ts.c:3836 > [ERROR] PetscSolver::solve: TSSolve failed. > [ERROR] PetscSolver::solve: TSSolve failed. > Aborting. > Aborting. 
> > > > Thanks for the help, > > Philip Fackler > Research Software Engineer, Application Engineering Group > Advanced Computing Systems Research Section > Computer Science and Mathematics Division > Oak Ridge National Laboratory -------------- next part -------------- An HTML attachment was scrubbed... URL: From edoardo.alinovi at gmail.com Mon Nov 14 14:46:15 2022 From: edoardo.alinovi at gmail.com (Edoardo alinovi) Date: Mon, 14 Nov 2022 21:46:15 +0100 Subject: [petsc-users] On PCFIELDSPLIT and its implementation In-Reply-To: <59032746-0A26-40CE-BCE1-FF74932B27EA@petsc.dev> References: <59032746-0A26-40CE-BCE1-FF74932B27EA@petsc.dev> Message-ID: Thanks Barry, Your help is always much appreciated! I'll try this out asap. I have ended up using baij because I have read the section "solving block matrices" and I was thinking that baij was the only way to use fieldsplit! Completely misunderstood, then! On Mon, 14 Nov 2022, 20:34 Barry Smith wrote: > > Can you clarify what you mean? For some classes of problems, > PCFIELDSPLIT can be a very efficacious preconditioner; for example when > certain fields have very different mathematical structure than others. In > those cases it is worth using AIJ and PCFIELDSPLIT instead of keeping BAIJ. > > > On Nov 14, 2022, at 2:21 PM, Edoardo alinovi > wrote: > > Hi Barry, no worries! > > Thanks for letting me know! It is not a problem for me to use MPIAIJ; do > you think field split will be a game changer? > > > > On Mon, 14 Nov 2022, 20:13 Barry Smith wrote: > >> >> Very sorry for wasting so much of your time. The PCFIELDSPLIT >> generally will not work with BAIJ matrices because the MatCreateSubMatrix() >> for BAIJ requires indexing by block in the matrix. Your code should work if >> you use MPIAIJ matrices. Note you can still use MatSetValuesBlocked() with >> MPIAIJ matrices. >> >> Barry >> >> >> > On Nov 10, 2022, at 5:24 PM, Edoardo alinovi >> wrote: >> > >> > True, >> > >> > Maybe somebody merged it already?
I have attached my silly example. >> > >> > To compile: >> > mpifort -L$PETSC_DIR/$PETSC_ARCH/lib -lpetsc -fdefault-real-8 -o test >> test.F90 -I$PETSC_DIR/include -I$PETSC_DIR/$PETSC_ARCH/include >> > >> > Do you need the petsc code MAtt did? >> > >> >> > -------------- next part -------------- An HTML attachment was scrubbed... URL: From bourdin at mcmaster.ca Mon Nov 14 15:19:47 2022 From: bourdin at mcmaster.ca (Blaise Bourdin) Date: Mon, 14 Nov 2022 21:19:47 +0000 Subject: [petsc-users] Reference element in DMPlexComputeCellGeometryAffineFEM In-Reply-To: References: <1223C850-C305-4475-BBF0-F907C8739C1C@mcmaster.ca> <26D68FF6-8BE4-4A8C-B0C3-BC8FE84A43FC@mcmaster.ca> Message-ID: <21D97E56-7334-4FC9-9B4A-11D0DD404DCB@mcmaster.ca> An HTML attachment was scrubbed... URL: From knepley at gmail.com Mon Nov 14 17:00:01 2022 From: knepley at gmail.com (Matthew Knepley) Date: Mon, 14 Nov 2022 18:00:01 -0500 Subject: [petsc-users] Reference element in DMPlexComputeCellGeometryAffineFEM In-Reply-To: <21D97E56-7334-4FC9-9B4A-11D0DD404DCB@mcmaster.ca> References: <1223C850-C305-4475-BBF0-F907C8739C1C@mcmaster.ca> <26D68FF6-8BE4-4A8C-B0C3-BC8FE84A43FC@mcmaster.ca> <21D97E56-7334-4FC9-9B4A-11D0DD404DCB@mcmaster.ca> Message-ID: On Mon, Nov 14, 2022 at 4:19 PM Blaise Bourdin wrote: > Replying to myself so that it gets into the googles: > > The reference tetrahedron used by DMPlexComputeCellGeometryAffineFEM has > vertices at (-1,1,-1), (-1,-1,-1), (1, -1, -1) and (-1,-1,1). > It is (-1, -1, -1) -- (-1, 1, -1) -- (1, -1, -1) -- (-1, -1, 1) that way the first face has an outward normal. Matt > Blaise > > > On Nov 10, 2022, at 6:42 PM, Matthew Knepley wrote: > > On Thu, Nov 10, 2022 at 3:46 PM Blaise Bourdin > wrote: > >> I am not sure I am buying this? If the tet was inverted, detJ would be >> negative, but it is always 1/8, as expected. 
>> >> The attached mesh is a perfectly valid tet generated by Cubit, with >> orientation matching the exodus documentation (ignore the mid-edge dof >> since this is a tet4). >> Here is what I get out of the code I attached in my previous email: >> > > Yes, I use the opposite convention from ExodusII. In my opinion, orienting > face (1, 2, 3) to have an inward normal is sacrilegious. > > Thanks, > > Matt > > >> *SiMini*:Tests (dmplex)$ ./TestDMPlexComputeCellGeometryAffineFEM -i >> ../TestMeshes/TetCubit.gen >> filename ../TestMeshes/TetCubit.gen >> Vec Object: coordinates 1 MPI process >> type: seq >> 0. >> 0. >> 0. >> 1. >> 0. >> 0. >> 0. >> 1. >> 0. >> 0. >> 0. >> 1. >> v0 >> 0: 1.0000e+00 0.0000e+00 0.0000e+00 >> J >> 0: -5.0000e-01 -5.0000e-01 -5.0000e-01 >> 0: 5.0000e-01 0.0000e+00 0.0000e+00 >> 0: 0.0000e+00 0.0000e+00 5.0000e-01 >> invJ >> 0: 0.0000e+00 2.0000e+00 0.0000e+00 >> 0: -2.0000e+00 -2.0000e+00 -2.0000e+00 >> 0: 0.0000e+00 0.0000e+00 2.0000e+00 >> detJ : 0.125 >> >> From J, invJ, and v0, I still can?t reconstruct a reasonable reference >> tet which I was naively assuming was either the unit simplex, or the >> simplex with vertices at (-1,-1,-1), (-1,0,-1), (0, -1, -1), and (-1,-1,1) >> not necessarily in this order. In order to build my FE basis functions on >> the reference element, I really need to know what this element is? >> >> Blaise >> >> >> >> >> On Nov 9, 2022, at 6:56 PM, Matthew Knepley wrote: >> >> On Wed, Nov 9, 2022 at 10:46 AM Blaise Bourdin >> wrote: >> >> >> >> On Nov 9, 2022, at 10:04 AM, Matthew Knepley wrote: >> >> On Tue, Nov 8, 2022 at 9:14 PM Blaise Bourdin >> wrote: >> >> Hi, >> >> What reference simplex is DMPlexComputeCellGeometryAffineFEM using in 2 >> and 3D? 
>> I am used to computing my shape functions on the unit simplex (vertices >> at the origin and each e_i), but it does not look to be the reference >> simplex in this function: >> >> In 3D, for the unit simplex with vertices at (0,0,0) (1,0,0) (0,1,0) >> (0,0,1) (in this order), I get J = 1 / 2 . [[-1,-1,-1],[1,0,0],[0,0,1]] and >> v0 = [0,0,1] >> >> In 2D, for the unit simplex with vertices at (0,0), (1,0), and (0,1), I >> get J = 1 / 2. I and v0 = [0,0], which does not make any sense to me (I was >> assuming that the 2D reference simplex had vertices at (-1,-1), (1, -1) and >> (-1,1), but if this were the case, v0 would not be 0). >> >> I can build a simple example with meshes consisting only of the unit >> simplex in 2D and 3D if that would help. >> >> >> I need to rewrite the documentation on geometry, but I was waiting until >> I rewrite the geometry calculations to fit into libCEED. Toby found a nice >> way to express them in BLAS form which I need to push through everything. >> >> I always think of operating on the cell with the first vertex at the >> origin (I think it is easier), so I have a xi0 that translates the first >> vertex >> of the reference to the origin, and a v0 that translates the first vertex >> of the real cell to the origin. You can see this here >> >> >> https://gitlab.com/petsc/petsc/-/blob/main/include/petsc/private/petscfeimpl.h#L251 >> >> This explains the 2D result. I cannot understand your 3D result, unless >> the vertices are in another order. >> >> >> That makes two of us, then? I am attaching a small example and test >> meshes (one cell being the unit simplex starting with the origin and >> numbered in direct order when looking from (1,1,1) >> >> >> Oh, it is probably inverted. All faces are oriented for outward normals. >> It is in the Orientation chapter in the book :) >> >> Thanks, >> >> Matt >> >> >> filename ../TestMeshes/1Tri.gen >> Vec Object: coordinates 1 MPI process >> type: seq >> 0. >> 0. >> 1. >> 0. >> 0. >> 1. 
>> v0 >> 0: 0.0000e+00 0.0000e+00 >> J >> 0: 5.0000e-01 0.0000e+00 >> 0: 0.0000e+00 5.0000e-01 >> invJ >> 0: 2.0000e+00 -0.0000e+00 >> 0: -0.0000e+00 2.0000e+00 >> detJ : 0.25 >> >> And >> filename ../TestMeshes/1Tet.gen >> Vec Object: coordinates 1 MPI process >> type: seq >> 0. >> 0. >> 0. >> 1. >> 0. >> >> 0. >> 0. >> 1. >> 0. >> 0. >> 0. >> 1. >> v0 >> 0: 1.0000e+00 0.0000e+00 0.0000e+00 >> J >> 0: -5.0000e-01 -5.0000e-01 -5.0000e-01 >> 0: 5.0000e-01 0.0000e+00 0.0000e+00 >> 0: 0.0000e+00 0.0000e+00 5.0000e-01 >> invJ >> 0: 0.0000e+00 2.0000e+00 0.0000e+00 >> 0: -2.0000e+00 -2.0000e+00 -2.0000e+00 >> 0: 0.0000e+00 0.0000e+00 2.0000e+00 >> detJ : 0.125 >> >> I don?t understand why v0=(0,0) in 2D and (1,0,0) in 3D (but don?t really >> care) since I only want J. J makes no sense to me in 3D. In particular, one >> does not seem to have X~ = invJ.X + v0 (X = J.(X~-v0) as stated in >> CoordinatesRefToReal (it works in 2D if V0 = (1,1), which is consistent >> with a reference simplex with vertices at (-1,-1), (1,-1) and (-1,1)). >> >> What am I missing? >> >> Blaise >> >> / >> >> >> >> Thanks, >> >> Matt >> >> >> Regards, >> Blaise >> >> >> >> ? >> Canada Research Chair in Mathematical and Computational Aspects of Solid >> Mechanics (Tier 1) >> Professor, Department of Mathematics & Statistics >> Hamilton Hall room 409A, McMaster University >> 1280 Main Street West, Hamilton, Ontario L8S 4K1, Canada >> https://www.math.mcmaster.ca/bourdin | +1 (905) 525 9140 ext. 27243 >> >> >> >> -- >> What most experimenters take for granted before they begin their >> experiments is infinitely more interesting than any results to which their >> experiments lead. >> -- Norbert Wiener >> >> https://www.cse.buffalo.edu/~knepley/ >> >> >> >> ? 
>> Canada Research Chair in Mathematical and Computational Aspects of Solid >> Mechanics (Tier 1) >> Professor, Department of Mathematics & Statistics >> Hamilton Hall room 409A, McMaster University >> 1280 Main Street West, Hamilton, Ontario L8S 4K1, Canada >> https://www.math.mcmaster.ca/bourdin | +1 (905) 525 9140 ext. 27243 >> >> >> >> -- >> What most experimenters take for granted before they begin their >> experiments is infinitely more interesting than any results to which their >> experiments lead. >> -- Norbert Wiener >> >> https://www.cse.buffalo.edu/~knepley/ >> >> >> >> ? >> Canada Research Chair in Mathematical and Computational Aspects of Solid >> Mechanics (Tier 1) >> Professor, Department of Mathematics & Statistics >> Hamilton Hall room 409A, McMaster University >> 1280 Main Street West, Hamilton, Ontario L8S 4K1, Canada >> https://www.math.mcmaster.ca/bourdin | +1 (905) 525 9140 ext. 27243 >> >> > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > > https://www.cse.buffalo.edu/~knepley/ > > > > ? > Canada Research Chair in Mathematical and Computational Aspects of Solid > Mechanics (Tier 1) > Professor, Department of Mathematics & Statistics > Hamilton Hall room 409A, McMaster University > 1280 Main Street West, Hamilton, Ontario L8S 4K1, Canada > https://www.math.mcmaster.ca/bourdin | +1 (905) 525 9140 ext. 27243 > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From junchao.zhang at gmail.com Mon Nov 14 18:34:02 2022 From: junchao.zhang at gmail.com (Junchao Zhang) Date: Mon, 14 Nov 2022 18:34:02 -0600 Subject: [petsc-users] Kokkos backend for Mat and Vec diverging when running on CUDA device. In-Reply-To: References: Message-ID: Hi, Philip, Sorry to hear that. It seems you could run the same code on CPUs but not on GPUs (with either the petsc/Kokkos backend or the petsc/cuda backend), is that right? --Junchao Zhang On Mon, Nov 14, 2022 at 12:13 PM Fackler, Philip via petsc-users < petsc-users at mcs.anl.gov> wrote: > This is an issue I've brought up before (and discussed in-person with > Richard). I wanted to bring it up again because I'm hitting the limits of > what I know to do, and I need help figuring this out. > > The problem can be reproduced using Xolotl's "develop" branch built > against a petsc build with kokkos and kokkos-kernels enabled. Then, either > add the relevant kokkos options to the "petscArgs=" line in the system test > parameter file(s), or just replace the system test parameter files with the > ones from the "feature-petsc-kokkos" branch. See here > > the files that begin with "params_system_". > > Note that those files use the "kokkos" options, but the problem is similar > using the corresponding cuda/cusparse options. I've already tried building > kokkos-kernels with no TPLs and got slightly different results, but the > same problem. > > Any help would be appreciated. > > Thanks, > > > *Philip Fackler * > Research Software Engineer, Application Engineering Group > Advanced Computing Systems Research Section > Computer Science and Mathematics Division > *Oak Ridge National Laboratory* > -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From mhyaqteen at sju.ac.kr Mon Nov 14 18:39:04 2022 From: mhyaqteen at sju.ac.kr (Mohammad Ali Yaqteen) Date: Tue, 15 Nov 2022 00:39:04 +0000 Subject: [petsc-users] Reading Vectors from a PETSc Vec Message-ID: I am using the following procedure to read from Vec, but it keeps giving me the same values! I was told that using VecGetValues gives wrong output. If not this, then what function should be used to read the contents of a vector? for (int i = 0; i < nconv; i++) { PetscCall(EPSGetEigenpair(eps,i,&kr,&ki,xr,xi)); PetscCall(EPSComputeError(eps,i,EPS_ERROR_RELATIVE,&error1)); #if defined(PETSC_USE_COMPLEX) re = PetscRealPart(kr); im = PetscImaginaryPart(kr); #else re = kr; im = ki; #endif if (im!=0.0) PetscCall(PetscPrintf(PETSC_COMM_WORLD," %9f%+9fi %12g\n",(double)re,(double)im,(double)error1)); else PetscCall(PetscPrintf(PETSC_COMM_WORLD," %12f %12g\n",(double)re,(double)error1)); eval(i) = re; VecGetValues(xr, tdof, ei, eveci); for (int j = 0; j < tdof; j++) { evec(j, i) = eveci[j]; } } PetscCall(PetscPrintf(PETSC_COMM_WORLD,"\n")); Thank you Ali -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Mon Nov 14 18:43:41 2022 From: knepley at gmail.com (Matthew Knepley) Date: Mon, 14 Nov 2022 19:43:41 -0500 Subject: [petsc-users] Reading Vectors from a PETSc Vec In-Reply-To: References: Message-ID: On Mon, Nov 14, 2022 at 7:39 PM Mohammad Ali Yaqteen wrote: > I am using the following procedure to read from Vec, but it keeps giving > me the same values! I was told that using VecGetValues gives wrong output. > By who? It does not give the wrong output. You do not show where in the code you define tdof and ei[]. If not this, then what function should be used to read the contents of a > vector? > I think it would be simpler for you to use VecGetArrayRead(), unless you want values from other processes. 
Thanks, Matt > for (int i = 0; i < nconv; i++) > { > PetscCall(EPSGetEigenpair(eps,i,&kr,&ki,xr,xi)); > PetscCall(EPSComputeError(eps,i,EPS_ERROR_RELATIVE,&error1)); > > #if defined(PETSC_USE_COMPLEX) > re = PetscRealPart(kr); > im = PetscImaginaryPart(kr); > #else > re = kr; > im = ki; > #endif > if (im!=0.0) PetscCall(PetscPrintf(PETSC_COMM_WORLD," %9f%+9fi > %12g\n",(double)re,(double)im,(double)error1)); > else PetscCall(PetscPrintf(PETSC_COMM_WORLD," %12f > %12g\n",(double)re,(double)error1)); > eval(i) = re; > VecGetValues(xr, tdof, ei, eveci); > for (int j = 0; j < tdof; j++) > { > evec(j, i) = eveci[j]; > } > } > PetscCall(PetscPrintf(PETSC_COMM_WORLD,"\n")); > > Thank you > Ali > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From mhyaqteen at sju.ac.kr Mon Nov 14 18:50:32 2022 From: mhyaqteen at sju.ac.kr (Mohammad Ali Yaqteen) Date: Tue, 15 Nov 2022 00:50:32 +0000 Subject: [petsc-users] Reading Vectors from a PETSc Vec In-Reply-To: References: Message-ID: I am using Eigen library to which I have to write these vector values from PETSc Vec. 
tdof is the length of the vector that I need, and ei is the array of indices to read: PetscInt ei[tdof]; PetscScalar eveci[tdof]; for (int i = 0; i < tdof; i++) ei[i] = i; if (nconv>0) { eval.setZero(nconv); evec.setZero(KS.rows(),nconv); PetscCall(PetscPrintf(PETSC_COMM_WORLD, " k ||Ax-kx||/||kx||\n" " ----------------- ------------------\n")); for (int i = 0; i < nconv; i++) { PetscCall(EPSGetEigenpair(eps,i,&kr,&ki,xr,xi)); PetscCall(EPSComputeError(eps,i,EPS_ERROR_RELATIVE,&error1)); #if defined(PETSC_USE_COMPLEX) re = PetscRealPart(kr); im = PetscImaginaryPart(kr); #else re = kr; im = ki; #endif if (im!=0.0) PetscCall(PetscPrintf(PETSC_COMM_WORLD," %9f%+9fi %12g\n",(double)re,(double)im,(double)error1)); else PetscCall(PetscPrintf(PETSC_COMM_WORLD," %12f %12g\n",(double)re,(double)error1)); eval(i) = re; VecGetValues(xr, tdof, ei, eveci); for (int j = 0; j < tdof; j++) { evec(j, i) = eveci[j]; } } PetscCall(PetscPrintf(PETSC_COMM_WORLD,"\n")); } Thank you Ali ________________________________ From: Matthew Knepley Sent: Tuesday, November 15, 2022 9:43 AM To: Mohammad Ali Yaqteen Cc: petsc-users at mcs.anl.gov Subject: Re: [petsc-users] Reading Vectors from a PETSc Vec On Mon, Nov 14, 2022 at 7:39 PM Mohammad Ali Yaqteen > wrote: I am using the following procedure to read from Vec, but it keeps giving me the same values! I was told that using VecGetValues gives wrong output. By who? It does not give the wrong output. You do not show where in the code you define tdof and ei[]. If not this, then what function should be used to read the contents of a vector? I think it would be simpler for you to use VecGetArrayRead(), unless you want values from other processes. 
Thanks, Matt for (int i = 0; i < nconv; i++) { PetscCall(EPSGetEigenpair(eps,i,&kr,&ki,xr,xi)); PetscCall(EPSComputeError(eps,i,EPS_ERROR_RELATIVE,&error1)); #if defined(PETSC_USE_COMPLEX) re = PetscRealPart(kr); im = PetscImaginaryPart(kr); #else re = kr; im = ki; #endif if (im!=0.0) PetscCall(PetscPrintf(PETSC_COMM_WORLD," %9f%+9fi %12g\n",(double)re,(double)im,(double)error1)); else PetscCall(PetscPrintf(PETSC_COMM_WORLD," %12f %12g\n",(double)re,(double)error1)); eval(i) = re; VecGetValues(xr, tdof, ei, eveci); for (int j = 0; j < tdof; j++) { evec(j, i) = eveci[j]; } } PetscCall(PetscPrintf(PETSC_COMM_WORLD,"\n")); Thank you Ali -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Mon Nov 14 18:53:44 2022 From: knepley at gmail.com (Matthew Knepley) Date: Mon, 14 Nov 2022 19:53:44 -0500 Subject: [petsc-users] Reading Vectors from a PETSc Vec In-Reply-To: References: Message-ID: On Mon, Nov 14, 2022 at 7:50 PM Mohammad Ali Yaqteen wrote: > I am using Eigen library to which I have to write these vector values from > PETSc Vec. tdof is the length of the vector that I need and ei is the > number of value in as an index: > > PetscInt ei[tdof]; > PetscScalar eveci[tdof]; > > for (int i = 0; i < tdof; i++) > ei[i] = i; > If you are running in serial, just use VecGetArrayRead(). 
Thanks, Matt > if (nconv>0) > { > eval.setZero(nconv); > evec.setZero(KS.rows(),nconv); > PetscCall(PetscPrintf(PETSC_COMM_WORLD, > " k ||Ax-kx||/||kx||\n" > " ----------------- ------------------\n")); > > for (int i = 0; i < nconv; i++) > { > PetscCall(EPSGetEigenpair(eps,i,&kr,&ki,xr,xi)); > PetscCall(EPSComputeError(eps,i,EPS_ERROR_RELATIVE,&error1)); > > #if defined(PETSC_USE_COMPLEX) > re = PetscRealPart(kr); > im = PetscImaginaryPart(kr); > #else > re = kr; > im = ki; > #endif > if (im!=0.0) PetscCall(PetscPrintf(PETSC_COMM_WORLD," %9f%+9fi > %12g\n",(double)re,(double)im,(double)error1)); > else PetscCall(PetscPrintf(PETSC_COMM_WORLD," %12f > %12g\n",(double)re,(double)error1)); > eval(i) = re; > VecGetValues(xr, tdof, ei, eveci); > for (int j = 0; j < tdof; j++) > { > evec(j, i) = eveci[j]; > } > } > PetscCall(PetscPrintf(PETSC_COMM_WORLD,"\n")); > } > > Thank you > Ali > ------------------------------ > *From:* Matthew Knepley > *Sent:* Tuesday, November 15, 2022 9:43 AM > *To:* Mohammad Ali Yaqteen > *Cc:* petsc-users at mcs.anl.gov > *Subject:* Re: [petsc-users] Reading Vectors from a PETSc Vec > > On Mon, Nov 14, 2022 at 7:39 PM Mohammad Ali Yaqteen > wrote: > > I am using the following procedure to read from Vec, but it keeps giving > me the same values! I was told that using VecGetValues gives wrog output. > > > By who? It does not give the wrong output. > > You do not show where in the code you define tdof and ei[]. > > If not this, then what function should be used to read the contents of a > vector? > > > I think it would be simpler for you to use VecGetArrayRead(), unless you > want values from other processes. 
> > Thanks, > > Matt > > > for (int i = 0; i < nconv; i++) > { > PetscCall(EPSGetEigenpair(eps,i,&kr,&ki,xr,xi)); > PetscCall(EPSComputeError(eps,i,EPS_ERROR_RELATIVE,&error1)); > > #if defined(PETSC_USE_COMPLEX) > re = PetscRealPart(kr); > im = PetscImaginaryPart(kr); > #else > re = kr; > im = ki; > #endif > if (im!=0.0) PetscCall(PetscPrintf(PETSC_COMM_WORLD," %9f%+9fi > %12g\n",(double)re,(double)im,(double)error1)); > else PetscCall(PetscPrintf(PETSC_COMM_WORLD," %12f > %12g\n",(double)re,(double)error1)); > eval(i) = re; > VecGetValues(xr, tdof, ei, eveci); > for (int j = 0; j < tdof; j++) > { > evec(j, i) = eveci[j]; > } > } > PetscCall(PetscPrintf(PETSC_COMM_WORLD,"\n")); > > Thank you > Ali > > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > > https://www.cse.buffalo.edu/~knepley/ > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From mhyaqteen at sju.ac.kr Mon Nov 14 20:05:11 2022 From: mhyaqteen at sju.ac.kr (Mohammad Ali Yaqteen) Date: Tue, 15 Nov 2022 02:05:11 +0000 Subject: [petsc-users] Reading Vectors from a PETSc Vec In-Reply-To: References: Message-ID: I am sorry for the trouble but I don?t understand its usage. Like I want to read Vec xr. I am giving it to a pointer array. How would I access the values of the vector? Could you please briefly explain? 
From: Matthew Knepley Sent: Tuesday, November 15, 2022 9:54 AM To: Mohammad Ali Yaqteen Cc: petsc-users at mcs.anl.gov Subject: Re: [petsc-users] Reading Vectors from a PETSc Vec On Mon, Nov 14, 2022 at 7:50 PM Mohammad Ali Yaqteen > wrote: I am using Eigen library to which I have to write these vector values from PETSc Vec. tdof is the length of the vector that I need and ei is the number of value in as an index: PetscInt ei[tdof]; PetscScalar eveci[tdof]; for (int i = 0; i < tdof; i++) ei[i] = i; If you are running in serial, just use VecGetArrayRead(). Thanks, Matt if (nconv>0) { eval.setZero(nconv); evec.setZero(KS.rows(),nconv); PetscCall(PetscPrintf(PETSC_COMM_WORLD, " k ||Ax-kx||/||kx||\n" " ----------------- ------------------\n")); for (int i = 0; i < nconv; i++) { PetscCall(EPSGetEigenpair(eps,i,&kr,&ki,xr,xi)); PetscCall(EPSComputeError(eps,i,EPS_ERROR_RELATIVE,&error1)); #if defined(PETSC_USE_COMPLEX) re = PetscRealPart(kr); im = PetscImaginaryPart(kr); #else re = kr; im = ki; #endif if (im!=0.0) PetscCall(PetscPrintf(PETSC_COMM_WORLD," %9f%+9fi %12g\n",(double)re,(double)im,(double)error1)); else PetscCall(PetscPrintf(PETSC_COMM_WORLD," %12f %12g\n",(double)re,(double)error1)); eval(i) = re; VecGetValues(xr, tdof, ei, eveci); for (int j = 0; j < tdof; j++) { evec(j, i) = eveci[j]; } } PetscCall(PetscPrintf(PETSC_COMM_WORLD,"\n")); } Thank you Ali ________________________________ From: Matthew Knepley > Sent: Tuesday, November 15, 2022 9:43 AM To: Mohammad Ali Yaqteen > Cc: petsc-users at mcs.anl.gov > Subject: Re: [petsc-users] Reading Vectors from a PETSc Vec On Mon, Nov 14, 2022 at 7:39 PM Mohammad Ali Yaqteen > wrote: I am using the following procedure to read from Vec, but it keeps giving me the same values! I was told that using VecGetValues gives wrog output. By who? It does not give the wrong output. You do not show where in the code you define tdof and ei[]. If not this, then what function should be used to read the contents of a vector? 
I think it would be simpler for you to use VecGetArrayRead(), unless you want values from other processes. Thanks, Matt for (int i = 0; i < nconv; i++) { PetscCall(EPSGetEigenpair(eps,i,&kr,&ki,xr,xi)); PetscCall(EPSComputeError(eps,i,EPS_ERROR_RELATIVE,&error1)); #if defined(PETSC_USE_COMPLEX) re = PetscRealPart(kr); im = PetscImaginaryPart(kr); #else re = kr; im = ki; #endif if (im!=0.0) PetscCall(PetscPrintf(PETSC_COMM_WORLD," %9f%+9fi %12g\n",(double)re,(double)im,(double)error1)); else PetscCall(PetscPrintf(PETSC_COMM_WORLD," %12f %12g\n",(double)re,(double)error1)); eval(i) = re; VecGetValues(xr, tdof, ei, eveci); for (int j = 0; j < tdof; j++) { evec(j, i) = eveci[j]; } } PetscCall(PetscPrintf(PETSC_COMM_WORLD,"\n")); Thank you Ali -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From junchao.zhang at gmail.com Mon Nov 14 22:48:17 2022 From: junchao.zhang at gmail.com (Junchao Zhang) Date: Mon, 14 Nov 2022 22:48:17 -0600 Subject: [petsc-users] Reading Vectors from a PETSc Vec In-Reply-To: References: Message-ID: On Mon, Nov 14, 2022 at 8:05 PM Mohammad Ali Yaqteen wrote: > I am sorry for the trouble but I don?t understand its usage. Like I want > to read Vec xr. I am giving it to a pointer array. How would I access the > values of the vector? Could you please briefly explain? > Suppose you have a vector x with global size 10, and you run with 2 MPI ranks, then the code sketch would be const PetscScalar *a; VecGetArrayRead(x, &a); // code reading a[]. 
On rank 0, a[0]~[a4] contains x[0]~x[4]; on rank 1, a[0]~[a4] contains x[5]~x[9] VecRestoreArrayRead(x, &a); See manuals at https://petsc.org/release/docs/manualpages/Vec/VecGetArrayRead/, https://petsc.org/release/docs/manualpages/Vec/VecGetLocalSize/, and also examples there. > > > *From:* Matthew Knepley > *Sent:* Tuesday, November 15, 2022 9:54 AM > *To:* Mohammad Ali Yaqteen > *Cc:* petsc-users at mcs.anl.gov > *Subject:* Re: [petsc-users] Reading Vectors from a PETSc Vec > > > > On Mon, Nov 14, 2022 at 7:50 PM Mohammad Ali Yaqteen > wrote: > > I am using Eigen library to which I have to write these vector values from > PETSc Vec. tdof is the length of the vector that I need and ei is the > number of value in as an index: > > > > PetscInt ei[tdof]; > PetscScalar eveci[tdof]; > > > > for (int i = 0; i < tdof; i++) > ei[i] = i; > > > > If you are running in serial, just use VecGetArrayRead(). > > > > Thanks, > > > > Matt > > > > if (nconv>0) > > { > > eval.setZero(nconv); > > evec.setZero(KS.rows(),nconv); > > PetscCall(PetscPrintf(PETSC_COMM_WORLD, > > " k ||Ax-kx||/||kx||\n" > > " ----------------- ------------------\n")); > > > > for (int i = 0; i < nconv; i++) > > { > > PetscCall(EPSGetEigenpair(eps,i,&kr,&ki,xr,xi)); > > PetscCall(EPSComputeError(eps,i,EPS_ERROR_RELATIVE,&error1)); > > > > #if defined(PETSC_USE_COMPLEX) > > re = PetscRealPart(kr); > > im = PetscImaginaryPart(kr); > > #else > > re = kr; > > im = ki; > > #endif > > if (im!=0.0) PetscCall(PetscPrintf(PETSC_COMM_WORLD," %9f%+9fi > %12g\n",(double)re,(double)im,(double)error1)); > > else PetscCall(PetscPrintf(PETSC_COMM_WORLD," %12f > %12g\n",(double)re,(double)error1)); > > eval(i) = re; > > VecGetValues(xr, tdof, ei, eveci); > > for (int j = 0; j < tdof; j++) > > { > > evec(j, i) = eveci[j]; > > } > > } > > PetscCall(PetscPrintf(PETSC_COMM_WORLD,"\n")); > > } > > > > Thank you > > Ali > ------------------------------ > > *From:* Matthew Knepley > *Sent:* Tuesday, November 15, 2022 9:43 AM > 
*To:* Mohammad Ali Yaqteen > *Cc:* petsc-users at mcs.anl.gov > *Subject:* Re: [petsc-users] Reading Vectors from a PETSc Vec > > > > On Mon, Nov 14, 2022 at 7:39 PM Mohammad Ali Yaqteen > wrote: > > I am using the following procedure to read from Vec, but it keeps giving > me the same values! I was told that using VecGetValues gives wrong output. > > > > By who? It does not give the wrong output. > > > > You do not show where in the code you define tdof and ei[]. > > > > If not this, then what function should be used to read the contents of a > vector? > > > > I think it would be simpler for you to use VecGetArrayRead(), unless you > want values from other processes. > > > > Thanks, > > > > Matt > > > > for (int i = 0; i < nconv; i++) > > { > > PetscCall(EPSGetEigenpair(eps,i,&kr,&ki,xr,xi)); > > PetscCall(EPSComputeError(eps,i,EPS_ERROR_RELATIVE,&error1)); > > > > #if defined(PETSC_USE_COMPLEX) > > re = PetscRealPart(kr); > > im = PetscImaginaryPart(kr); > > #else > > re = kr; > > im = ki; > > #endif > > if (im!=0.0) PetscCall(PetscPrintf(PETSC_COMM_WORLD," %9f%+9fi > %12g\n",(double)re,(double)im,(double)error1)); > > else PetscCall(PetscPrintf(PETSC_COMM_WORLD," %12f > %12g\n",(double)re,(double)error1)); > > eval(i) = re; > > VecGetValues(xr, tdof, ei, eveci); > > for (int j = 0; j < tdof; j++) > > { > > evec(j, i) = eveci[j]; > > } > > } > > PetscCall(PetscPrintf(PETSC_COMM_WORLD,"\n")); > > > > Thank you > > Ali > > > > > -- > > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > > > > https://www.cse.buffalo.edu/~knepley/ > > > > > > -- > > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. 
> -- Norbert Wiener > > > > https://www.cse.buffalo.edu/~knepley/ > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From junchao.zhang at gmail.com Mon Nov 14 23:16:50 2022 From: junchao.zhang at gmail.com (Junchao Zhang) Date: Mon, 14 Nov 2022 23:16:50 -0600 Subject: [petsc-users] Using multiple MPI ranks with COO interface crashes in some cases In-Reply-To: References: Message-ID: Hi, Philip, Can you tell me instructions to build Xolotl to reproduce the error? --Junchao Zhang On Mon, Nov 14, 2022 at 12:24 PM Fackler, Philip via petsc-users < petsc-users at mcs.anl.gov> wrote: > In Xolotl's "feature-petsc-kokkos" branch, I have moved our code to use > the COO interface for preallocating and setting values in the Jacobian > matrix. I have found that with some of our test cases, using more than one > MPI rank results in a crash. Way down in the preconditioner code in petsc a > Mat gets computed that has "null" for the "productsymbolic" member of its > "ops". It's pretty far removed from where we compute the Jacobian entries, > so I haven't been able (so far) to track it back to an error in my code. > I'd appreciate some help with this from someone who is more familiar with > the petsc guts so we can figure out what I'm doing wrong. (I'm assuming > it's a bug in Xolotl.) > > Note that this is using the kokkos backend for Mat and Vec in petsc, but > with a serial-only build of kokkos and kokkos-kernels. So, it's a CPU-only > multiple MPI rank run. 
> > Here's a paste of the error output showing the relevant parts of the call > stack: > > [ERROR] [0]PETSC ERROR: > [ERROR] --------------------- Error Message > -------------------------------------------------------------- > [ERROR] [1]PETSC ERROR: > [ERROR] --------------------- Error Message > -------------------------------------------------------------- > [ERROR] [1]PETSC ERROR: > [ERROR] [0]PETSC ERROR: > [ERROR] No support for this operation for this object type > [ERROR] [1]PETSC ERROR: > [ERROR] No support for this operation for this object type > [ERROR] [0]PETSC ERROR: > [ERROR] No method productsymbolic for Mat of type (null) > [ERROR] No method productsymbolic for Mat of type (null) > [ERROR] [0]PETSC ERROR: > [ERROR] [1]PETSC ERROR: > [ERROR] See https://petsc.org/release/faq/ for trouble shooting. > [ERROR] See https://petsc.org/release/faq/ for trouble shooting. > [ERROR] [0]PETSC ERROR: > [ERROR] [1]PETSC ERROR: > [ERROR] Petsc Development GIT revision: v3.18.1-115-gdca010e0e9a GIT > Date: 2022-10-28 14:39:41 +0000 > [ERROR] Petsc Development GIT revision: v3.18.1-115-gdca010e0e9a GIT > Date: 2022-10-28 14:39:41 +0000 > [ERROR] [1]PETSC ERROR: > [ERROR] [0]PETSC ERROR: > [ERROR] Unknown Name on a named PC0115427 by 4pf Mon Nov 14 13:22:01 2022 > [ERROR] Unknown Name on a named PC0115427 by 4pf Mon Nov 14 13:22:01 2022 > [ERROR] [1]PETSC ERROR: > [ERROR] [0]PETSC ERROR: > [ERROR] Configure options PETSC_DIR=/home/4pf/repos/petsc > PETSC_ARCH=arch-kokkos-serial-debug --with-debugging=1 --with-cc=mpicc > --with-cxx=mpicxx --with-fc=0 --with-cudac=0 > --prefix=/home/4pf/build/petsc/serial-debug/install --with-64-bit-indices > --with-shared-libraries > --with-kokkos-dir=/home/4pf/build/kokkos/serial/install > --with-kokkos-kernels-dir=/home/4pf/build/kokkos-kernels/serial/install > [ERROR] Configure options PETSC_DIR=/home/4pf/repos/petsc > PETSC_ARCH=arch-kokkos-serial-debug --with-debugging=1 --with-cc=mpicc > --with-cxx=mpicxx --with-fc=0 
--with-cudac=0 > --prefix=/home/4pf/build/petsc/serial-debug/install --with-64-bit-indices > --with-shared-libraries > --with-kokkos-dir=/home/4pf/build/kokkos/serial/install > --with-kokkos-kernels-dir=/home/4pf/build/kokkos-kernels/serial/install > [ERROR] [1]PETSC ERROR: > [ERROR] [0]PETSC ERROR: > [ERROR] #1 MatProductSymbolic_MPIAIJKokkos_AB() at > /home/4pf/repos/petsc/src/mat/impls/aij/mpi/kokkos/mpiaijkok.kokkos.cxx:918 > [ERROR] #1 MatProductSymbolic_MPIAIJKokkos_AB() at > /home/4pf/repos/petsc/src/mat/impls/aij/mpi/kokkos/mpiaijkok.kokkos.cxx:918 > [ERROR] [1]PETSC ERROR: > [ERROR] [0]PETSC ERROR: > [ERROR] #2 MatProductSymbolic_MPIAIJKokkos() at > /home/4pf/repos/petsc/src/mat/impls/aij/mpi/kokkos/mpiaijkok.kokkos.cxx:1138 > [ERROR] #2 MatProductSymbolic_MPIAIJKokkos() at > /home/4pf/repos/petsc/src/mat/impls/aij/mpi/kokkos/mpiaijkok.kokkos.cxx:1138 > [ERROR] [1]PETSC ERROR: > [ERROR] [0]PETSC ERROR: > [ERROR] #3 MatProductSymbolic() at > /home/4pf/repos/petsc/src/mat/interface/matproduct.c:793 > [ERROR] #3 MatProductSymbolic() at > /home/4pf/repos/petsc/src/mat/interface/matproduct.c:793 > [ERROR] [1]PETSC ERROR: > [ERROR] [0]PETSC ERROR: > [ERROR] #4 MatProduct_Private() at > /home/4pf/repos/petsc/src/mat/interface/matrix.c:9820 > [ERROR] #4 MatProduct_Private() at > /home/4pf/repos/petsc/src/mat/interface/matrix.c:9820 > [ERROR] [0]PETSC ERROR: > [ERROR] [1]PETSC ERROR: > [ERROR] #5 MatMatMult() at > /home/4pf/repos/petsc/src/mat/interface/matrix.c:9897 > [ERROR] #5 MatMatMult() at > /home/4pf/repos/petsc/src/mat/interface/matrix.c:9897 > [ERROR] [0]PETSC ERROR: > [ERROR] [1]PETSC ERROR: > [ERROR] #6 PCGAMGOptProlongator_AGG() at > /home/4pf/repos/petsc/src/ksp/pc/impls/gamg/agg.c:769 > [ERROR] #6 PCGAMGOptProlongator_AGG() at > /home/4pf/repos/petsc/src/ksp/pc/impls/gamg/agg.c:769 > [ERROR] [0]PETSC ERROR: > [ERROR] [1]PETSC ERROR: > [ERROR] #7 PCSetUp_GAMG() at > /home/4pf/repos/petsc/src/ksp/pc/impls/gamg/gamg.c:639 > [ERROR] #7 PCSetUp_GAMG() at > 
/home/4pf/repos/petsc/src/ksp/pc/impls/gamg/gamg.c:639 > [ERROR] [1]PETSC ERROR: > [ERROR] [0]PETSC ERROR: > [ERROR] #8 PCSetUp() at > /home/4pf/repos/petsc/src/ksp/pc/interface/precon.c:994 > [ERROR] #8 PCSetUp() at > /home/4pf/repos/petsc/src/ksp/pc/interface/precon.c:994 > [ERROR] [1]PETSC ERROR: > [ERROR] [0]PETSC ERROR: > [ERROR] #9 KSPSetUp() at > /home/4pf/repos/petsc/src/ksp/ksp/interface/itfunc.c:406 > [ERROR] #9 KSPSetUp() at > /home/4pf/repos/petsc/src/ksp/ksp/interface/itfunc.c:406 > [ERROR] [1]PETSC ERROR: > [ERROR] [0]PETSC ERROR: > [ERROR] #10 KSPSolve_Private() at > /home/4pf/repos/petsc/src/ksp/ksp/interface/itfunc.c:825 > [ERROR] #10 KSPSolve_Private() at > /home/4pf/repos/petsc/src/ksp/ksp/interface/itfunc.c:825 > [ERROR] [0]PETSC ERROR: > [ERROR] [1]PETSC ERROR: > [ERROR] #11 KSPSolve() at > /home/4pf/repos/petsc/src/ksp/ksp/interface/itfunc.c:1071 > [ERROR] #11 KSPSolve() at > /home/4pf/repos/petsc/src/ksp/ksp/interface/itfunc.c:1071 > [ERROR] [1]PETSC ERROR: > [ERROR] [0]PETSC ERROR: > [ERROR] #12 PCApply_FieldSplit() at > /home/4pf/repos/petsc/src/ksp/pc/impls/fieldsplit/fieldsplit.c:1246 > [ERROR] #12 PCApply_FieldSplit() at > /home/4pf/repos/petsc/src/ksp/pc/impls/fieldsplit/fieldsplit.c:1246 > [ERROR] [1]PETSC ERROR: > [ERROR] [0]PETSC ERROR: > [ERROR] #13 PCApply() at > /home/4pf/repos/petsc/src/ksp/pc/interface/precon.c:441 > [ERROR] #13 PCApply() at > /home/4pf/repos/petsc/src/ksp/pc/interface/precon.c:441 > [ERROR] [1]PETSC ERROR: > [ERROR] [0]PETSC ERROR: > [ERROR] #14 KSP_PCApply() at > /home/4pf/repos/petsc/include/petsc/private/kspimpl.h:380 > [ERROR] #14 KSP_PCApply() at > /home/4pf/repos/petsc/include/petsc/private/kspimpl.h:380 > [ERROR] [1]PETSC ERROR: > [ERROR] [0]PETSC ERROR: > [ERROR] #15 KSPFGMRESCycle() at > /home/4pf/repos/petsc/src/ksp/ksp/impls/gmres/fgmres/fgmres.c:152 > [ERROR] #15 KSPFGMRESCycle() at > /home/4pf/repos/petsc/src/ksp/ksp/impls/gmres/fgmres/fgmres.c:152 > [ERROR] [1]PETSC ERROR: > [ERROR] [0]PETSC 
ERROR: > [ERROR] #16 KSPSolve_FGMRES() at > /home/4pf/repos/petsc/src/ksp/ksp/impls/gmres/fgmres/fgmres.c:273 > [ERROR] #16 KSPSolve_FGMRES() at > /home/4pf/repos/petsc/src/ksp/ksp/impls/gmres/fgmres/fgmres.c:273 > [ERROR] [1]PETSC ERROR: > [ERROR] [0]PETSC ERROR: > [ERROR] #17 KSPSolve_Private() at > /home/4pf/repos/petsc/src/ksp/ksp/interface/itfunc.c:899 > [ERROR] #17 KSPSolve_Private() at > /home/4pf/repos/petsc/src/ksp/ksp/interface/itfunc.c:899 > [ERROR] [0]PETSC ERROR: > [ERROR] [1]PETSC ERROR: > [ERROR] #18 KSPSolve() at > /home/4pf/repos/petsc/src/ksp/ksp/interface/itfunc.c:1071 > [ERROR] #18 KSPSolve() at > /home/4pf/repos/petsc/src/ksp/ksp/interface/itfunc.c:1071 > [ERROR] [0]PETSC ERROR: > [ERROR] [1]PETSC ERROR: > [ERROR] #19 SNESSolve_NEWTONLS() at > /home/4pf/repos/petsc/src/snes/impls/ls/ls.c:210 > [ERROR] #19 SNESSolve_NEWTONLS() at > /home/4pf/repos/petsc/src/snes/impls/ls/ls.c:210 > [ERROR] [1]PETSC ERROR: > [ERROR] [0]PETSC ERROR: > [ERROR] #20 SNESSolve() at > /home/4pf/repos/petsc/src/snes/interface/snes.c:4689 > [ERROR] #20 SNESSolve() at > /home/4pf/repos/petsc/src/snes/interface/snes.c:4689 > [ERROR] [1]PETSC ERROR: > [ERROR] [0]PETSC ERROR: > [ERROR] #21 TSStep_ARKIMEX() at > /home/4pf/repos/petsc/src/ts/impls/arkimex/arkimex.c:791 > [ERROR] #21 TSStep_ARKIMEX() at > /home/4pf/repos/petsc/src/ts/impls/arkimex/arkimex.c:791 > [ERROR] [1]PETSC ERROR: > [ERROR] [0]PETSC ERROR: > [ERROR] #22 TSStep() at /home/4pf/repos/petsc/src/ts/interface/ts.c:3445 > [ERROR] #22 TSStep() at /home/4pf/repos/petsc/src/ts/interface/ts.c:3445 > [ERROR] [1]PETSC ERROR: > [ERROR] [0]PETSC ERROR: > [ERROR] #23 TSSolve() at /home/4pf/repos/petsc/src/ts/interface/ts.c:3836 > [ERROR] #23 TSSolve() at /home/4pf/repos/petsc/src/ts/interface/ts.c:3836 > [ERROR] PetscSolver::solve: TSSolve failed. > [ERROR] PetscSolver::solve: TSSolve failed. > Aborting. > Aborting. 
> > > > Thanks for the help, > > > *Philip Fackler * > Research Software Engineer, Application Engineering Group > Advanced Computing Systems Research Section > Computer Science and Mathematics Division > *Oak Ridge National Laboratory* > -------------- next part -------------- An HTML attachment was scrubbed... URL: From facklerpw at ornl.gov Tue Nov 15 10:24:28 2022 From: facklerpw at ornl.gov (Fackler, Philip) Date: Tue, 15 Nov 2022 16:24:28 +0000 Subject: [petsc-users] [EXTERNAL] Re: Kokkos backend for Mat and Vec diverging when running on CUDA device. In-Reply-To: References: Message-ID: Yes, most (but not all) of our system test cases fail with the kokkos/cuda or cuda backends. All of them pass with the CPU-only kokkos backend. Philip Fackler Research Software Engineer, Application Engineering Group Advanced Computing Systems Research Section Computer Science and Mathematics Division Oak Ridge National Laboratory ________________________________ From: Junchao Zhang Sent: Monday, November 14, 2022 19:34 To: Fackler, Philip Cc: xolotl-psi-development at lists.sourceforge.net ; petsc-users at mcs.anl.gov ; Blondel, Sophie ; Zhang, Junchao ; Roth, Philip Subject: [EXTERNAL] Re: [petsc-users] Kokkos backend for Mat and Vec diverging when running on CUDA device. Hi, Philip, Sorry to hear that. It seems you could run the same code on CPUs but not on GPUs (with either the petsc/Kokkos backend or the petsc/cuda backend), is that right? --Junchao Zhang On Mon, Nov 14, 2022 at 12:13 PM Fackler, Philip via petsc-users < petsc-users at mcs.anl.gov> wrote: This is an issue I've brought up before (and discussed in-person with Richard). I wanted to bring it up again because I'm hitting the limits of what I know to do, and I need help figuring this out. The problem can be reproduced using Xolotl's "develop" branch built against a petsc build with kokkos and kokkos-kernels enabled. 
Then, either add the relevant kokkos options to the "petscArgs=" line in the system test parameter file(s), or just replace the system test parameter files with the ones from the "feature-petsc-kokkos" branch. See here the files that begin with "params_system_". Note that those files use the "kokkos" options, but the problem is similar using the corresponding cuda/cusparse options. I've already tried building kokkos-kernels with no TPLs and got slightly different results, but the same problem. Any help would be appreciated. Thanks, Philip Fackler Research Software Engineer, Application Engineering Group Advanced Computing Systems Research Section Computer Science and Mathematics Division Oak Ridge National Laboratory -------------- next part -------------- An HTML attachment was scrubbed... URL: From facklerpw at ornl.gov Tue Nov 15 10:55:26 2022 From: facklerpw at ornl.gov (Fackler, Philip) Date: Tue, 15 Nov 2022 16:55:26 +0000 Subject: [petsc-users] [EXTERNAL] Re: Using multiple MPI ranks with COO interface crashes in some cases In-Reply-To: References: Message-ID: I built petsc with: $ ./configure PETSC_DIR=$PWD PETSC_ARCH=arch-kokkos-serial-debug --with-cc=mpicc --with-cxx=mpicxx --with-fc=0 --with-debugging=0 --prefix=$HOME/build/petsc/debug/install --with-64-bit-indices --with-shared-libraries --COPTFLAGS=-O3 --CXXOPTFLAGS=-O3 --download-kokkos --download-kokkos-kernels $ make PETSC_DIR=$PWD PETSC_ARCH=arch-kokkos-serial-debug all $ make PETSC_DIR=$PWD PETSC_ARCH=arch-kokkos-serial-debug install Then I build xolotl in a separate build directory (after checking out the "feature-petsc-kokkos" branch) with: $ cmake -DCMAKE_BUILD_TYPE=Debug -DKokkos_DIR=$HOME/build/petsc/debug/install -DPETSC_DIR=$HOME/build/petsc/debug/install $ make -j4 SystemTester Then, from the xolotl build directory, run (for example): $ mpirun -n 2 ./test/system/SystemTester -t System/NE_4 -- -v Note that this test case will use the parameter file '/benchmarks/params_system_NE_4.txt' which 
has the command-line arguments for petsc in its "petscArgs=..." line. If you look at '/test/system/SystemTester.cpp' all the system test cases follow the same naming convention with their corresponding parameter files under '/benchmarks'. The failure happens with the NE_4 case (which is 2D) and the PSI_3 case (which is 1D). Let me know if this is still unclear. Thanks, Philip Fackler Research Software Engineer, Application Engineering Group Advanced Computing Systems Research Section Computer Science and Mathematics Division Oak Ridge National Laboratory ________________________________ From: Junchao Zhang Sent: Tuesday, November 15, 2022 00:16 To: Fackler, Philip Cc: petsc-users at mcs.anl.gov ; Blondel, Sophie Subject: [EXTERNAL] Re: [petsc-users] Using multiple MPI ranks with COO interface crashes in some cases Hi, Philip, Can you tell me instructions to build Xolotl to reproduce the error? --Junchao Zhang On Mon, Nov 14, 2022 at 12:24 PM Fackler, Philip via petsc-users > wrote: In Xolotl's "feature-petsc-kokkos" branch, I have moved our code to use the COO interface for preallocating and setting values in the Jacobian matrix. I have found that with some of our test cases, using more than one MPI rank results in a crash. Way down in the preconditioner code in petsc a Mat gets computed that has "null" for the "productsymbolic" member of its "ops". It's pretty far removed from where we compute the Jacobian entries, so I haven't been able (so far) to track it back to an error in my code. I'd appreciate some help with this from someone who is more familiar with the petsc guts so we can figure out what I'm doing wrong. (I'm assuming it's a bug in Xolotl.) Note that this is using the kokkos backend for Mat and Vec in petsc, but with a serial-only build of kokkos and kokkos-kernels. So, it's a CPU-only multiple MPI rank run. 
Here's a paste of the error output showing the relevant parts of the call stack: [ERROR] [0]PETSC ERROR: [ERROR] --------------------- Error Message -------------------------------------------------------------- [ERROR] [1]PETSC ERROR: [ERROR] --------------------- Error Message -------------------------------------------------------------- [ERROR] [1]PETSC ERROR: [ERROR] [0]PETSC ERROR: [ERROR] No support for this operation for this object type [ERROR] [1]PETSC ERROR: [ERROR] No support for this operation for this object type [ERROR] [0]PETSC ERROR: [ERROR] No method productsymbolic for Mat of type (null) [ERROR] No method productsymbolic for Mat of type (null) [ERROR] [0]PETSC ERROR: [ERROR] [1]PETSC ERROR: [ERROR] See hxxps://petsc.org/release/faq/ for trouble shooting. [ERROR] See hxxps://petsc.org/release/faq/ for trouble shooting. [ERROR] [0]PETSC ERROR: [ERROR] [1]PETSC ERROR: [ERROR] Petsc Development GIT revision: v3.18.1-115-gdca010e0e9a GIT Date: 2022-10-28 14:39:41 +0000 [ERROR] Petsc Development GIT revision: v3.18.1-115-gdca010e0e9a GIT Date: 2022-10-28 14:39:41 +0000 [ERROR] [1]PETSC ERROR: [ERROR] [0]PETSC ERROR: [ERROR] Unknown Name on a named PC0115427 by 4pf Mon Nov 14 13:22:01 2022 [ERROR] Unknown Name on a named PC0115427 by 4pf Mon Nov 14 13:22:01 2022 [ERROR] [1]PETSC ERROR: [ERROR] [0]PETSC ERROR: [ERROR] Configure options PETSC_DIR=/home/4pf/repos/petsc PETSC_ARCH=arch-kokkos-serial-debug --with-debugging=1 --with-cc=mpicc --with-cxx=mpicxx --with-fc=0 --with-cudac=0 --prefix=/home/4pf/build/petsc/serial-debug/install --with-64-bit-indices --with-shared-libraries --with-kokkos-dir=/home/4pf/build/kokkos/serial/install --with-kokkos-kernels-dir=/home/4pf/build/kokkos-kernels/serial/install [ERROR] Configure options PETSC_DIR=/home/4pf/repos/petsc PETSC_ARCH=arch-kokkos-serial-debug --with-debugging=1 --with-cc=mpicc --with-cxx=mpicxx --with-fc=0 --with-cudac=0 --prefix=/home/4pf/build/petsc/serial-debug/install --with-64-bit-indices 
--with-shared-libraries --with-kokkos-dir=/home/4pf/build/kokkos/serial/install --with-kokkos-kernels-dir=/home/4pf/build/kokkos-kernels/serial/install [ERROR] [1]PETSC ERROR: [ERROR] [0]PETSC ERROR: [ERROR] #1 MatProductSymbolic_MPIAIJKokkos_AB() at /home/4pf/repos/petsc/src/mat/impls/aij/mpi/kokkos/mpiaijkok.kokkos.cxx:918 [ERROR] #1 MatProductSymbolic_MPIAIJKokkos_AB() at /home/4pf/repos/petsc/src/mat/impls/aij/mpi/kokkos/mpiaijkok.kokkos.cxx:918 [ERROR] [1]PETSC ERROR: [ERROR] [0]PETSC ERROR: [ERROR] #2 MatProductSymbolic_MPIAIJKokkos() at /home/4pf/repos/petsc/src/mat/impls/aij/mpi/kokkos/mpiaijkok.kokkos.cxx:1138 [ERROR] #2 MatProductSymbolic_MPIAIJKokkos() at /home/4pf/repos/petsc/src/mat/impls/aij/mpi/kokkos/mpiaijkok.kokkos.cxx:1138 [ERROR] [1]PETSC ERROR: [ERROR] [0]PETSC ERROR: [ERROR] #3 MatProductSymbolic() at /home/4pf/repos/petsc/src/mat/interface/matproduct.c:793 [ERROR] #3 MatProductSymbolic() at /home/4pf/repos/petsc/src/mat/interface/matproduct.c:793 [ERROR] [1]PETSC ERROR: [ERROR] [0]PETSC ERROR: [ERROR] #4 MatProduct_Private() at /home/4pf/repos/petsc/src/mat/interface/matrix.c:9820 [ERROR] #4 MatProduct_Private() at /home/4pf/repos/petsc/src/mat/interface/matrix.c:9820 [ERROR] [0]PETSC ERROR: [ERROR] [1]PETSC ERROR: [ERROR] #5 MatMatMult() at /home/4pf/repos/petsc/src/mat/interface/matrix.c:9897 [ERROR] #5 MatMatMult() at /home/4pf/repos/petsc/src/mat/interface/matrix.c:9897 [ERROR] [0]PETSC ERROR: [ERROR] [1]PETSC ERROR: [ERROR] #6 PCGAMGOptProlongator_AGG() at /home/4pf/repos/petsc/src/ksp/pc/impls/gamg/agg.c:769 [ERROR] #6 PCGAMGOptProlongator_AGG() at /home/4pf/repos/petsc/src/ksp/pc/impls/gamg/agg.c:769 [ERROR] [0]PETSC ERROR: [ERROR] [1]PETSC ERROR: [ERROR] #7 PCSetUp_GAMG() at /home/4pf/repos/petsc/src/ksp/pc/impls/gamg/gamg.c:639 [ERROR] #7 PCSetUp_GAMG() at /home/4pf/repos/petsc/src/ksp/pc/impls/gamg/gamg.c:639 [ERROR] [1]PETSC ERROR: [ERROR] [0]PETSC ERROR: [ERROR] #8 PCSetUp() at 
/home/4pf/repos/petsc/src/ksp/pc/interface/precon.c:994 [ERROR] #8 PCSetUp() at /home/4pf/repos/petsc/src/ksp/pc/interface/precon.c:994 [ERROR] [1]PETSC ERROR: [ERROR] [0]PETSC ERROR: [ERROR] #9 KSPSetUp() at /home/4pf/repos/petsc/src/ksp/ksp/interface/itfunc.c:406 [ERROR] #9 KSPSetUp() at /home/4pf/repos/petsc/src/ksp/ksp/interface/itfunc.c:406 [ERROR] [1]PETSC ERROR: [ERROR] [0]PETSC ERROR: [ERROR] #10 KSPSolve_Private() at /home/4pf/repos/petsc/src/ksp/ksp/interface/itfunc.c:825 [ERROR] #10 KSPSolve_Private() at /home/4pf/repos/petsc/src/ksp/ksp/interface/itfunc.c:825 [ERROR] [0]PETSC ERROR: [ERROR] [1]PETSC ERROR: [ERROR] #11 KSPSolve() at /home/4pf/repos/petsc/src/ksp/ksp/interface/itfunc.c:1071 [ERROR] #11 KSPSolve() at /home/4pf/repos/petsc/src/ksp/ksp/interface/itfunc.c:1071 [ERROR] [1]PETSC ERROR: [ERROR] [0]PETSC ERROR: [ERROR] #12 PCApply_FieldSplit() at /home/4pf/repos/petsc/src/ksp/pc/impls/fieldsplit/fieldsplit.c:1246 [ERROR] #12 PCApply_FieldSplit() at /home/4pf/repos/petsc/src/ksp/pc/impls/fieldsplit/fieldsplit.c:1246 [ERROR] [1]PETSC ERROR: [ERROR] [0]PETSC ERROR: [ERROR] #13 PCApply() at /home/4pf/repos/petsc/src/ksp/pc/interface/precon.c:441 [ERROR] #13 PCApply() at /home/4pf/repos/petsc/src/ksp/pc/interface/precon.c:441 [ERROR] [1]PETSC ERROR: [ERROR] [0]PETSC ERROR: [ERROR] #14 KSP_PCApply() at /home/4pf/repos/petsc/include/petsc/private/kspimpl.h:380 [ERROR] #14 KSP_PCApply() at /home/4pf/repos/petsc/include/petsc/private/kspimpl.h:380 [ERROR] [1]PETSC ERROR: [ERROR] [0]PETSC ERROR: [ERROR] #15 KSPFGMRESCycle() at /home/4pf/repos/petsc/src/ksp/ksp/impls/gmres/fgmres/fgmres.c:152 [ERROR] #15 KSPFGMRESCycle() at /home/4pf/repos/petsc/src/ksp/ksp/impls/gmres/fgmres/fgmres.c:152 [ERROR] [1]PETSC ERROR: [ERROR] [0]PETSC ERROR: [ERROR] #16 KSPSolve_FGMRES() at /home/4pf/repos/petsc/src/ksp/ksp/impls/gmres/fgmres/fgmres.c:273 [ERROR] #16 KSPSolve_FGMRES() at /home/4pf/repos/petsc/src/ksp/ksp/impls/gmres/fgmres/fgmres.c:273 [ERROR] [1]PETSC ERROR: 
[ERROR] [0]PETSC ERROR: [ERROR] #17 KSPSolve_Private() at /home/4pf/repos/petsc/src/ksp/ksp/interface/itfunc.c:899 [ERROR] #17 KSPSolve_Private() at /home/4pf/repos/petsc/src/ksp/ksp/interface/itfunc.c:899 [ERROR] [0]PETSC ERROR: [ERROR] [1]PETSC ERROR: [ERROR] #18 KSPSolve() at /home/4pf/repos/petsc/src/ksp/ksp/interface/itfunc.c:1071 [ERROR] #18 KSPSolve() at /home/4pf/repos/petsc/src/ksp/ksp/interface/itfunc.c:1071 [ERROR] [0]PETSC ERROR: [ERROR] [1]PETSC ERROR: [ERROR] #19 SNESSolve_NEWTONLS() at /home/4pf/repos/petsc/src/snes/impls/ls/ls.c:210 [ERROR] #19 SNESSolve_NEWTONLS() at /home/4pf/repos/petsc/src/snes/impls/ls/ls.c:210 [ERROR] [1]PETSC ERROR: [ERROR] [0]PETSC ERROR: [ERROR] #20 SNESSolve() at /home/4pf/repos/petsc/src/snes/interface/snes.c:4689 [ERROR] #20 SNESSolve() at /home/4pf/repos/petsc/src/snes/interface/snes.c:4689 [ERROR] [1]PETSC ERROR: [ERROR] [0]PETSC ERROR: [ERROR] #21 TSStep_ARKIMEX() at /home/4pf/repos/petsc/src/ts/impls/arkimex/arkimex.c:791 [ERROR] #21 TSStep_ARKIMEX() at /home/4pf/repos/petsc/src/ts/impls/arkimex/arkimex.c:791 [ERROR] [1]PETSC ERROR: [ERROR] [0]PETSC ERROR: [ERROR] #22 TSStep() at /home/4pf/repos/petsc/src/ts/interface/ts.c:3445 [ERROR] #22 TSStep() at /home/4pf/repos/petsc/src/ts/interface/ts.c:3445 [ERROR] [1]PETSC ERROR: [ERROR] [0]PETSC ERROR: [ERROR] #23 TSSolve() at /home/4pf/repos/petsc/src/ts/interface/ts.c:3836 [ERROR] #23 TSSolve() at /home/4pf/repos/petsc/src/ts/interface/ts.c:3836 [ERROR] PetscSolver::solve: TSSolve failed. [ERROR] PetscSolver::solve: TSSolve failed. Aborting. Aborting. Thanks for the help, Philip Fackler Research Software Engineer, Application Engineering Group Advanced Computing Systems Research Section Computer Science and Mathematics Division Oak Ridge National Laboratory -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From junchao.zhang at gmail.com Tue Nov 15 12:03:42 2022 From: junchao.zhang at gmail.com (Junchao Zhang) Date: Tue, 15 Nov 2022 12:03:42 -0600 Subject: [petsc-users] [EXTERNAL] Re: Kokkos backend for Mat and Vec diverging when running on CUDA device. In-Reply-To: References: Message-ID: Can you paste -log_view result so I can see what functions are used? --Junchao Zhang On Tue, Nov 15, 2022 at 10:24 AM Fackler, Philip wrote: > Yes, most (but not all) of our system test cases fail with the kokkos/cuda > or cuda backends. All of them pass with the CPU-only kokkos backend. > > > *Philip Fackler * > Research Software Engineer, Application Engineering Group > Advanced Computing Systems Research Section > Computer Science and Mathematics Division > *Oak Ridge National Laboratory* > ------------------------------ > *From:* Junchao Zhang > *Sent:* Monday, November 14, 2022 19:34 > *To:* Fackler, Philip > *Cc:* xolotl-psi-development at lists.sourceforge.net < > xolotl-psi-development at lists.sourceforge.net>; petsc-users at mcs.anl.gov < > petsc-users at mcs.anl.gov>; Blondel, Sophie ; Zhang, > Junchao ; Roth, Philip > *Subject:* [EXTERNAL] Re: [petsc-users] Kokkos backend for Mat and Vec > diverging when running on CUDA device. > > Hi, Philip, > Sorry to hear that. It seems you could run the same code on CPUs but > not no GPUs (with either petsc/Kokkos backend or petsc/cuda backend, is it > right? > > --Junchao Zhang > > > On Mon, Nov 14, 2022 at 12:13 PM Fackler, Philip via petsc-users < > petsc-users at mcs.anl.gov> wrote: > > This is an issue I've brought up before (and discussed in-person with > Richard). I wanted to bring it up again because I'm hitting the limits of > what I know to do, and I need help figuring this out. > > The problem can be reproduced using Xolotl's "develop" branch built > against a petsc build with kokkos and kokkos-kernels enabled. 
Then, either > add the relevant kokkos options to the "petscArgs=" line in the system test > parameter file(s), or just replace the system test parameter files with the > ones from the "feature-petsc-kokkos" branch. See here the files that > begin with "params_system_". > > Note that those files use the "kokkos" options, but the problem is similar > using the corresponding cuda/cusparse options. I've already tried building > kokkos-kernels with no TPLs and got slightly different results, but the > same problem. > > Any help would be appreciated. > > Thanks, > > > *Philip Fackler * > Research Software Engineer, Application Engineering Group > Advanced Computing Systems Research Section > Computer Science and Mathematics Division > *Oak Ridge National Laboratory* > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From edoardo.alinovi at gmail.com Tue Nov 15 12:20:23 2022 From: edoardo.alinovi at gmail.com (Edoardo alinovi) Date: Tue, 15 Nov 2022 19:20:23 +0100 Subject: [petsc-users] On PCFIELDSPLIT and its implementation In-Reply-To: References: <59032746-0A26-40CE-BCE1-FF74932B27EA@petsc.dev> Message-ID: Hi Guys, Very quick one. Do I need to set the block size with MPIAIJ? -------------- next part -------------- An HTML attachment was scrubbed... URL: From jed at jedbrown.org Tue Nov 15 12:25:40 2022 From: jed at jedbrown.org (Jed Brown) Date: Tue, 15 Nov 2022 11:25:40 -0700 Subject: [petsc-users] On PCFIELDSPLIT and its implementation In-Reply-To: References: <59032746-0A26-40CE-BCE1-FF74932B27EA@petsc.dev> Message-ID: <875yfg9isb.fsf@jedbrown.org> You do if preconditioners (like AMG) will use it or if using functions like MatSetValuesBlocked(). If you have uniform block structure, it doesn't hurt. Edoardo alinovi writes: > Hi Guys, > > Very quick one. Do I need to set the block size with MPIAIJ? 
From edoardo.alinovi at gmail.com Tue Nov 15 12:26:31 2022 From: edoardo.alinovi at gmail.com (Edoardo alinovi) Date: Tue, 15 Nov 2022 19:26:31 +0100 Subject: [petsc-users] On PCFIELDSPLIT and its implementation In-Reply-To: <875yfg9isb.fsf@jedbrown.org> References: <59032746-0A26-40CE-BCE1-FF74932B27EA@petsc.dev> <875yfg9isb.fsf@jedbrown.org> Message-ID: Thanks, I'll do it then :) On Tue, Nov 15, 2022, 19:25 Jed Brown wrote: > You do if preconditioners (like AMG) will use it or if using functions > like MatSetValuesBlocked(). If you have uniform block structure, it doesn't > hurt. > > Edoardo alinovi writes: > > > Hi Guys, > > > > Very quick one. Do I need to set the block size with MPIAIJ? > -------------- next part -------------- An HTML attachment was scrubbed... URL: From mfadams at lbl.gov Tue Nov 15 12:49:24 2022 From: mfadams at lbl.gov (Mark Adams) Date: Tue, 15 Nov 2022 13:49:24 -0500 Subject: [petsc-users] [EXTERNAL] Re: Using multiple MPI ranks with COO interface crashes in some cases In-Reply-To: References: Message-ID: Junchao, this is the same problem that I have been having, right? 
On Tue, Nov 15, 2022 at 11:56 AM Fackler, Philip via petsc-users < petsc-users at mcs.anl.gov> wrote: > I built petsc with: > > $ ./configure PETSC_DIR=$PWD PETSC_ARCH=arch-kokkos-serial-debug > --with-cc=mpicc --with-cxx=mpicxx --with-fc=0 --with-debugging=0 > --prefix=$HOME/build/petsc/debug/install --with-64-bit-indices > --with-shared-libraries --COPTFLAGS=-O3 --CXXOPTFLAGS=-O3 --download-kokkos > --download-kokkos-kernels > > $ make PETSC_DIR=$PWD PETSC_ARCH=arch-kokkos-serial-debug all > > $ make PETSC_DIR=$PWD PETSC_ARCH=arch-kokkos-serial-debug install > > > Then I build xolotl in a separate build directory (after checking out the > "feature-petsc-kokkos" branch) with: > > $ cmake -DCMAKE_BUILD_TYPE=Debug > -DKokkos_DIR=$HOME/build/petsc/debug/install > -DPETSC_DIR=$HOME/build/petsc/debug/install > > $ make -j4 SystemTester > > > Then, from the xolotl build directory, run (for example): > > $ mpirun -n 2 ./test/system/SystemTester -t System/NE_4 -- -v > > Note that this test case will use the parameter file > '/benchmarks/params_system_NE_4.txt' which has the command-line > arguments for petsc in its "petscArgs=..." line. If you look at > '/test/system/SystemTester.cpp' all the system test cases > follow the same naming convention with their corresponding parameter files > under '/benchmarks'. > > The failure happens with the NE_4 case (which is 2D) and the PSI_3 case > (which is 1D). > > Let me know if this is still unclear. 
> > Thanks, > > > *Philip Fackler * > Research Software Engineer, Application Engineering Group > Advanced Computing Systems Research Section > Computer Science and Mathematics Division > *Oak Ridge National Laboratory* > ------------------------------ > *From:* Junchao Zhang > *Sent:* Tuesday, November 15, 2022 00:16 > *To:* Fackler, Philip > *Cc:* petsc-users at mcs.anl.gov ; Blondel, Sophie < > sblondel at utk.edu> > *Subject:* [EXTERNAL] Re: [petsc-users] Using multiple MPI ranks with COO > interface crashes in some cases > > Hi, Philip, > Can you tell me instructions to build Xolotl to reproduce the error? > --Junchao Zhang > > > On Mon, Nov 14, 2022 at 12:24 PM Fackler, Philip via petsc-users < > petsc-users at mcs.anl.gov> wrote: > > In Xolotl's "feature-petsc-kokkos" branch, I have moved our code to use > the COO interface for preallocating and setting values in the Jacobian > matrix. I have found that with some of our test cases, using more than one > MPI rank results in a crash. Way down in the preconditioner code in petsc a > Mat gets computed that has "null" for the "productsymbolic" member of its > "ops". It's pretty far removed from where we compute the Jacobian entries, > so I haven't been able (so far) to track it back to an error in my code. > I'd appreciate some help with this from someone who is more familiar with > the petsc guts so we can figure out what I'm doing wrong. (I'm assuming > it's a bug in Xolotl.) > > Note that this is using the kokkos backend for Mat and Vec in petsc, but > with a serial-only build of kokkos and kokkos-kernels. So, it's a CPU-only > multiple MPI rank run. 
> > Here's a paste of the error output showing the relevant parts of the call > stack: > > [ERROR] [0]PETSC ERROR: > [ERROR] --------------------- Error Message > -------------------------------------------------------------- > [ERROR] [1]PETSC ERROR: > [ERROR] --------------------- Error Message > -------------------------------------------------------------- > [ERROR] [1]PETSC ERROR: > [ERROR] [0]PETSC ERROR: > [ERROR] No support for this operation for this object type > [ERROR] [1]PETSC ERROR: > [ERROR] No support for this operation for this object type > [ERROR] [0]PETSC ERROR: > [ERROR] No method productsymbolic for Mat of type (null) > [ERROR] No method productsymbolic for Mat of type (null) > [ERROR] [0]PETSC ERROR: > [ERROR] [1]PETSC ERROR: > [ERROR] See hxxps://petsc.org/release/faq/ for trouble shooting. > [ERROR] See hxxps://petsc.org/release/faq/ for trouble shooting. > [ERROR] [0]PETSC ERROR: > [ERROR] [1]PETSC ERROR: > [ERROR] Petsc Development GIT revision: v3.18.1-115-gdca010e0e9a GIT > Date: 2022-10-28 14:39:41 +0000 > [ERROR] Petsc Development GIT revision: v3.18.1-115-gdca010e0e9a GIT > Date: 2022-10-28 14:39:41 +0000 > [ERROR] [1]PETSC ERROR: > [ERROR] [0]PETSC ERROR: > [ERROR] Unknown Name on a named PC0115427 by 4pf Mon Nov 14 13:22:01 2022 > [ERROR] Unknown Name on a named PC0115427 by 4pf Mon Nov 14 13:22:01 2022 > [ERROR] [1]PETSC ERROR: > [ERROR] [0]PETSC ERROR: > [ERROR] Configure options PETSC_DIR=/home/4pf/repos/petsc > PETSC_ARCH=arch-kokkos-serial-debug --with-debugging=1 --with-cc=mpicc > --with-cxx=mpicxx --with-fc=0 --with-cudac=0 > --prefix=/home/4pf/build/petsc/serial-debug/install --with-64-bit-indices > --with-shared-libraries > --with-kokkos-dir=/home/4pf/build/kokkos/serial/install > --with-kokkos-kernels-dir=/home/4pf/build/kokkos-kernels/serial/install > [ERROR] Configure options PETSC_DIR=/home/4pf/repos/petsc > PETSC_ARCH=arch-kokkos-serial-debug --with-debugging=1 --with-cc=mpicc > --with-cxx=mpicxx --with-fc=0 
--with-cudac=0 > --prefix=/home/4pf/build/petsc/serial-debug/install --with-64-bit-indices > --with-shared-libraries > --with-kokkos-dir=/home/4pf/build/kokkos/serial/install > --with-kokkos-kernels-dir=/home/4pf/build/kokkos-kernels/serial/install > [ERROR] [1]PETSC ERROR: > [ERROR] [0]PETSC ERROR: > [ERROR] #1 MatProductSymbolic_MPIAIJKokkos_AB() at > /home/4pf/repos/petsc/src/mat/impls/aij/mpi/kokkos/mpiaijkok.kokkos.cxx:918 > [ERROR] #1 MatProductSymbolic_MPIAIJKokkos_AB() at > /home/4pf/repos/petsc/src/mat/impls/aij/mpi/kokkos/mpiaijkok.kokkos.cxx:918 > [ERROR] [1]PETSC ERROR: > [ERROR] [0]PETSC ERROR: > [ERROR] #2 MatProductSymbolic_MPIAIJKokkos() at > /home/4pf/repos/petsc/src/mat/impls/aij/mpi/kokkos/mpiaijkok.kokkos.cxx:1138 > [ERROR] #2 MatProductSymbolic_MPIAIJKokkos() at > /home/4pf/repos/petsc/src/mat/impls/aij/mpi/kokkos/mpiaijkok.kokkos.cxx:1138 > [ERROR] [1]PETSC ERROR: > [ERROR] [0]PETSC ERROR: > [ERROR] #3 MatProductSymbolic() at > /home/4pf/repos/petsc/src/mat/interface/matproduct.c:793 > [ERROR] #3 MatProductSymbolic() at > /home/4pf/repos/petsc/src/mat/interface/matproduct.c:793 > [ERROR] [1]PETSC ERROR: > [ERROR] [0]PETSC ERROR: > [ERROR] #4 MatProduct_Private() at > /home/4pf/repos/petsc/src/mat/interface/matrix.c:9820 > [ERROR] #4 MatProduct_Private() at > /home/4pf/repos/petsc/src/mat/interface/matrix.c:9820 > [ERROR] [0]PETSC ERROR: > [ERROR] [1]PETSC ERROR: > [ERROR] #5 MatMatMult() at > /home/4pf/repos/petsc/src/mat/interface/matrix.c:9897 > [ERROR] #5 MatMatMult() at > /home/4pf/repos/petsc/src/mat/interface/matrix.c:9897 > [ERROR] [0]PETSC ERROR: > [ERROR] [1]PETSC ERROR: > [ERROR] #6 PCGAMGOptProlongator_AGG() at > /home/4pf/repos/petsc/src/ksp/pc/impls/gamg/agg.c:769 > [ERROR] #6 PCGAMGOptProlongator_AGG() at > /home/4pf/repos/petsc/src/ksp/pc/impls/gamg/agg.c:769 > [ERROR] [0]PETSC ERROR: > [ERROR] [1]PETSC ERROR: > [ERROR] #7 PCSetUp_GAMG() at > /home/4pf/repos/petsc/src/ksp/pc/impls/gamg/gamg.c:639 > [ERROR] #7 PCSetUp_GAMG() at > 
/home/4pf/repos/petsc/src/ksp/pc/impls/gamg/gamg.c:639 > [ERROR] [1]PETSC ERROR: > [ERROR] [0]PETSC ERROR: > [ERROR] #8 PCSetUp() at > /home/4pf/repos/petsc/src/ksp/pc/interface/precon.c:994 > [ERROR] #8 PCSetUp() at > /home/4pf/repos/petsc/src/ksp/pc/interface/precon.c:994 > [ERROR] [1]PETSC ERROR: > [ERROR] [0]PETSC ERROR: > [ERROR] #9 KSPSetUp() at > /home/4pf/repos/petsc/src/ksp/ksp/interface/itfunc.c:406 > [ERROR] #9 KSPSetUp() at > /home/4pf/repos/petsc/src/ksp/ksp/interface/itfunc.c:406 > [ERROR] [1]PETSC ERROR: > [ERROR] [0]PETSC ERROR: > [ERROR] #10 KSPSolve_Private() at > /home/4pf/repos/petsc/src/ksp/ksp/interface/itfunc.c:825 > [ERROR] #10 KSPSolve_Private() at > /home/4pf/repos/petsc/src/ksp/ksp/interface/itfunc.c:825 > [ERROR] [0]PETSC ERROR: > [ERROR] [1]PETSC ERROR: > [ERROR] #11 KSPSolve() at > /home/4pf/repos/petsc/src/ksp/ksp/interface/itfunc.c:1071 > [ERROR] #11 KSPSolve() at > /home/4pf/repos/petsc/src/ksp/ksp/interface/itfunc.c:1071 > [ERROR] [1]PETSC ERROR: > [ERROR] [0]PETSC ERROR: > [ERROR] #12 PCApply_FieldSplit() at > /home/4pf/repos/petsc/src/ksp/pc/impls/fieldsplit/fieldsplit.c:1246 > [ERROR] #12 PCApply_FieldSplit() at > /home/4pf/repos/petsc/src/ksp/pc/impls/fieldsplit/fieldsplit.c:1246 > [ERROR] [1]PETSC ERROR: > [ERROR] [0]PETSC ERROR: > [ERROR] #13 PCApply() at > /home/4pf/repos/petsc/src/ksp/pc/interface/precon.c:441 > [ERROR] #13 PCApply() at > /home/4pf/repos/petsc/src/ksp/pc/interface/precon.c:441 > [ERROR] [1]PETSC ERROR: > [ERROR] [0]PETSC ERROR: > [ERROR] #14 KSP_PCApply() at > /home/4pf/repos/petsc/include/petsc/private/kspimpl.h:380 > [ERROR] #14 KSP_PCApply() at > /home/4pf/repos/petsc/include/petsc/private/kspimpl.h:380 > [ERROR] [1]PETSC ERROR: > [ERROR] [0]PETSC ERROR: > [ERROR] #15 KSPFGMRESCycle() at > /home/4pf/repos/petsc/src/ksp/ksp/impls/gmres/fgmres/fgmres.c:152 > [ERROR] #15 KSPFGMRESCycle() at > /home/4pf/repos/petsc/src/ksp/ksp/impls/gmres/fgmres/fgmres.c:152 > [ERROR] [1]PETSC ERROR: > [ERROR] [0]PETSC 
ERROR: > [ERROR] #16 KSPSolve_FGMRES() at > /home/4pf/repos/petsc/src/ksp/ksp/impls/gmres/fgmres/fgmres.c:273 > [ERROR] #16 KSPSolve_FGMRES() at > /home/4pf/repos/petsc/src/ksp/ksp/impls/gmres/fgmres/fgmres.c:273 > [ERROR] [1]PETSC ERROR: > [ERROR] [0]PETSC ERROR: > [ERROR] #17 KSPSolve_Private() at > /home/4pf/repos/petsc/src/ksp/ksp/interface/itfunc.c:899 > [ERROR] #17 KSPSolve_Private() at > /home/4pf/repos/petsc/src/ksp/ksp/interface/itfunc.c:899 > [ERROR] [0]PETSC ERROR: > [ERROR] [1]PETSC ERROR: > [ERROR] #18 KSPSolve() at > /home/4pf/repos/petsc/src/ksp/ksp/interface/itfunc.c:1071 > [ERROR] #18 KSPSolve() at > /home/4pf/repos/petsc/src/ksp/ksp/interface/itfunc.c:1071 > [ERROR] [0]PETSC ERROR: > [ERROR] [1]PETSC ERROR: > [ERROR] #19 SNESSolve_NEWTONLS() at > /home/4pf/repos/petsc/src/snes/impls/ls/ls.c:210 > [ERROR] #19 SNESSolve_NEWTONLS() at > /home/4pf/repos/petsc/src/snes/impls/ls/ls.c:210 > [ERROR] [1]PETSC ERROR: > [ERROR] [0]PETSC ERROR: > [ERROR] #20 SNESSolve() at > /home/4pf/repos/petsc/src/snes/interface/snes.c:4689 > [ERROR] #20 SNESSolve() at > /home/4pf/repos/petsc/src/snes/interface/snes.c:4689 > [ERROR] [1]PETSC ERROR: > [ERROR] [0]PETSC ERROR: > [ERROR] #21 TSStep_ARKIMEX() at > /home/4pf/repos/petsc/src/ts/impls/arkimex/arkimex.c:791 > [ERROR] #21 TSStep_ARKIMEX() at > /home/4pf/repos/petsc/src/ts/impls/arkimex/arkimex.c:791 > [ERROR] [1]PETSC ERROR: > [ERROR] [0]PETSC ERROR: > [ERROR] #22 TSStep() at /home/4pf/repos/petsc/src/ts/interface/ts.c:3445 > [ERROR] #22 TSStep() at /home/4pf/repos/petsc/src/ts/interface/ts.c:3445 > [ERROR] [1]PETSC ERROR: > [ERROR] [0]PETSC ERROR: > [ERROR] #23 TSSolve() at /home/4pf/repos/petsc/src/ts/interface/ts.c:3836 > [ERROR] #23 TSSolve() at /home/4pf/repos/petsc/src/ts/interface/ts.c:3836 > [ERROR] PetscSolver::solve: TSSolve failed. > [ERROR] PetscSolver::solve: TSSolve failed. > Aborting. > Aborting. 
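[To unpack the error message itself: PETSc dispatches matrix operations through a per-object table of function pointers, and "No method productsymbolic for Mat of type (null)" means that slot was never populated on an intermediate product matrix whose type string is also unset, hence "(null)". A schematic sketch of that dispatch pattern — not PETSc's actual source; the real check lives inside MatProductSymbolic():]

```python
# Schematic of function-pointer dispatch behind the error above: if the
# "productsymbolic" slot in a Mat's ops table was never filled in, the
# dispatcher can only report "No method ... for Mat of type (null)"
# (the intermediate Mat's type string is unset as well).
def mat_product_symbolic(mat):
    op = mat["ops"].get("productsymbolic")
    if op is None:
        raise NotImplementedError(
            "No method productsymbolic for Mat of type "
            f"({mat['type'] or 'null'})")
    return op(mat)

ok_mat = {"type": "mpiaijkokkos",
          "ops": {"productsymbolic": lambda m: "done"}}
bad_mat = {"type": None, "ops": {}}  # ops slot never set up
```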
> > Thanks for the help,
>
> *Philip Fackler *
> Research Software Engineer, Application Engineering Group
> Advanced Computing Systems Research Section
> Computer Science and Mathematics Division
> *Oak Ridge National Laboratory*
>
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From junchao.zhang at gmail.com  Tue Nov 15 14:42:17 2022
From: junchao.zhang at gmail.com (Junchao Zhang)
Date: Tue, 15 Nov 2022 14:42:17 -0600
Subject: [petsc-users] [EXTERNAL] Re: Using multiple MPI ranks with COO interface crashes in some cases
In-Reply-To: 
References: 
Message-ID: 

Mark,
  Do you have a reproducer using petsc examples?

On Tue, Nov 15, 2022, 12:49 PM Mark Adams  wrote:

> Junchao, this is the same problem that I have been having right?
>
> On Tue, Nov 15, 2022 at 11:56 AM Fackler, Philip via petsc-users <
> petsc-users at mcs.anl.gov> wrote:
>
>> [...]

-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From mfadams at lbl.gov  Wed Nov 16 07:05:39 2022
From: mfadams at lbl.gov (Mark Adams)
Date: Wed, 16 Nov 2022 08:05:39 -0500
Subject: [petsc-users] [EXTERNAL] Re: Using multiple MPI ranks with COO interface crashes in some cases
In-Reply-To: 
References: 
Message-ID: 

I can not build right now on Crusher or Perlmutter but I saw this on both.

Here is an example output using src/snes/tests/ex13.c using the appended
.petscrc
This uses 64 processors and the 8 processor case worked. This has been
semi-nondeterministic for me.

(and I have attached my current Perlmutter problem)

Hope this helps,
Mark

-dm_plex_simplex 0
-dm_plex_dim 3
-dm_plex_box_lower 0,0,0
-dm_plex_box_upper 1,1,1
-petscpartitioner_simple_process_grid 2,2,2
-potential_petscspace_degree 2
-snes_max_it 1
-ksp_max_it 200
-ksp_type cg
-ksp_rtol 1.e-12
-ksp_norm_type unpreconditioned
-snes_rtol 1.e-8
#-pc_type gamg
#-pc_gamg_type agg
#-pc_gamg_agg_nsmooths 1
-pc_gamg_coarse_eq_limit 100
-pc_gamg_process_eq_limit 400
-pc_gamg_reuse_interpolation true
#-snes_monitor
#-ksp_monitor_short
-ksp_converged_reason
#-ksp_view
#-snes_converged_reason
#-mg_levels_ksp_max_it 2
-mg_levels_ksp_type chebyshev
#-mg_levels_ksp_type richardson
#-mg_levels_ksp_richardson_scale 0.8
-mg_levels_pc_type jacobi
-pc_gamg_esteig_ksp_type cg
-pc_gamg_esteig_ksp_max_it 10
-mg_levels_ksp_chebyshev_esteig 0,0.05,0,1.05
-dm_distribute
-petscpartitioner_type simple
-pc_gamg_repartition false
-pc_gamg_coarse_grid_layout_type compact
-pc_gamg_threshold 0.01
#-pc_gamg_threshold_scale .5
-pc_gamg_aggressive_coarsening 1
#-check_pointer_intensity 0
-snes_type ksponly
#-mg_coarse_sub_pc_factor_mat_solver_type cusparse
#-info :pc
#-use_gpu_aware_mpi 1
-options_left
#-malloc_debug
-benchmark_it 10
#-pc_gamg_use_parallel_coarse_grid_solver
#-mg_coarse_pc_type jacobi
#-mg_coarse_ksp_type cg
#-mg_coarse_ksp_rtol 1.e-2
#-mat_cusparse_transgen
-snes_lag_jacobian -2

On Tue, Nov 15, 2022 at 3:42 PM Junchao Zhang  wrote:

> Mark,
>   Do you have a reproducer using petsc examples?
>
> On Tue, Nov 15, 2022, 12:49 PM Mark Adams  wrote:
>
>> Junchao, this is the same problem that I have been having right?
>>
>> On Tue, Nov 15, 2022 at 11:56 AM Fackler, Philip via petsc-users <
>> petsc-users at mcs.anl.gov> wrote:
>>
>>> [...]

-------------- next part --------------
An HTML attachment was scrubbed...
URL: 
-------------- next part --------------
+ '[' -z '' ']'
+ case "$-" in
+ __lmod_vx=x
+ '[' -n x ']'
+ set +x
Shell debugging temporarily silenced: export LMOD_SH_DBG_ON=1 for this output (/usr/share/lmod/lmod/init/bash)
Shell debugging restarted
+ unset __lmod_vx
+ export SLURM_CPU_BIND=cores
+ SLURM_CPU_BIND=cores
+ EXTRA='-dm_view -log_view -use_gpu_aware_mpi 0'
+ KOKKOS_ARGS='-dm_mat_type aijkokkos -mat_type aijkokkos -dm_vec_type kokkos'
+ CUDA_ARGS='-dm_mat_type aijcusparse -mat_type aijcusparse -dm_vec_type cuda'
+ HYPRE_ARGS='-dm_mat_type hypre -dm_vec_type kokkos -pc_type hypre'
+ AMGX_ARGS='-pc_type amgx -pc_amgx_verbose true -pc_amgx_jacobi_relaxation_factor .7 -pc_amgx_aggressive_levels 2 -pc_amgx_print_grid_stats'
+ GAMG_ARGS='-pc_type gamg'
+ SRUN_ARGS='--cpu-bind=cores --ntasks-per-core=1 --gpu-bind=single:2'
+ NR=4
+ N=1
+ NG=4
+ NC=2
+ PG=2
+ date
Sun 30 Oct 2022 05:29:58 AM PDT
+ for REFINE in 3
+ for NPIDX in 2
+ let 'N1 = 2'
+ let 'N2 = 2 * 2'
+ let 'N4 = 4 * 2'
+ let 'REF1 = 3 + 1'
+ let 'NODES = 2 * 2 * 2'
+ let 'N = 8 * 2 * 4'
+ let 'NGPU = 8 * 4'
+ echo n= 64 ' NODES=' 8
n= 64  NODES= 8
++ printf %03d 8
+ foo=008
+ srun -n64 -N8 -G 32 --cpu-bind=cores --ntasks-per-core=1 --gpu-bind=single:2 ../ex13 -dm_plex_box_faces 4,4,4
-petscpartitioner_simple_process_grid 2,2,2 -petscpartitioner_simple_node_grid 2,2,2 -dm_refine 4 -pc_type gamg -dm_mat_type aijkokkos -mat_type aijkokkos -dm_vec_type kokkos -dm_view -log_view -use_gpu_aware_mpi 0
++ tee stdout.log
++ tee stderr.log
DM Object: box 64 MPI processes
  type: plex
box in 3 dimensions:
  Min/Max of 0-cells per rank: 4913/4913
  Min/Max of 1-cells per rank: 13872/13872
  Min/Max of 2-cells per rank: 13056/13056
  Min/Max of 3-cells per rank: 4096/4096
Labels:
  celltype: 4 strata with value/size (0 (4913), 1 (13872), 4 (13056), 7 (4096))
  depth: 4 strata with value/size (0 (4913), 1 (13872), 2 (13056), 3 (4096))
  marker: 1 strata with value/size (1 (3169))
  Face Sets: 3 strata with value/size (1 (961), 3 (961), 6 (961))
Number equations N = 2048383
[0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[0]PETSC ERROR: No support for this operation for this object type
[0]PETSC ERROR: No method productsymbolic for Mat of type (null)
[0]PETSC ERROR: WARNING! There are option(s) set that were not used! Could be the program crashed before they were used or a spelling mistake, etc!
[0]PETSC ERROR: Option left: name:-ksp_converged_reason (no value)
[0]PETSC ERROR: Option left: name:-mg_levels_ksp_chebyshev_esteig value: 0,0.05,0,1.05
[0]PETSC ERROR: Option left: name:-mg_levels_ksp_type value: chebyshev
[0]PETSC ERROR: Option left: name:-mg_levels_pc_type value: jacobi
[0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
[0]PETSC ERROR: Petsc Development GIT revision: v3.18.1-122-g50ed725bd85  GIT Date: 2022-10-28 19:54:01 +0000
[0]PETSC ERROR: /global/u2/m/madams/petsc/src/snes/tests/data/../ex13 on a arch-perlmutter-opt-gcc-kokkos-cuda named nid001116 by madams Sun Oct 30 05:30:05 2022
[0]PETSC ERROR: Configure options --CFLAGS=" -g -DPETSC_HAVE_KOKKOS_KERNELS_GMRES" --CXXFLAGS=" -g -DPETSC_HAVE_KOKKOS_KERNELS_GMRES" --CUDAFLAGS="-g -Xcompiler -rdynamic" --with-cc=cc --with-cxx=CC --with-fc=ftn --download-hypre=1 --download-amgx --with-fortran-bindings=0 --COPTFLAGS=" -O" --CXXOPTFLAGS=" -O" --FOPTFLAGS=" -O" --with-debugging=0 --with-cuda=1 --with-cuda-arch=80 --with-mpiexec=srun --with-batch=0 --download-p4est=1 --with-zlib=1 --download-kokkos --download-kokkos-kernels --with-kokkos-kernels-tpl=0 --with-make-np=8 PETSC_ARCH=arch-perlmutter-opt-gcc-kokkos-cuda
[0]PETSC ERROR: #1 MatProductSymbolic_MPIAIJKokkos_AB() at /global/u2/m/madams/petsc/src/mat/impls/aij/mpi/kokkos/mpiaijkok.kokkos.cxx:918
[0]PETSC ERROR: #2 MatProductSymbolic_MPIAIJKokkos() at /global/u2/m/madams/petsc/src/mat/impls/aij/mpi/kokkos/mpiaijkok.kokkos.cxx:1138
[0]PETSC ERROR: #3 MatProductSymbolic() at /global/u2/m/madams/petsc/src/mat/interface/matproduct.c:793
[0]PETSC ERROR: #4 MatProduct_Private() at /global/u2/m/madams/petsc/src/mat/interface/matrix.c:9820
[0]PETSC ERROR: #5 MatMatMult() at /global/u2/m/madams/petsc/src/mat/interface/matrix.c:9897
[0]PETSC ERROR: #6 PCGAMGOptProlongator_AGG() at /global/u2/m/madams/petsc/src/ksp/pc/impls/gamg/agg.c:769
[0]PETSC ERROR: #7 PCSetUp_GAMG() at /global/u2/m/madams/petsc/src/ksp/pc/impls/gamg/gamg.c:639
[0]PETSC ERROR: #8 PCSetUp() at /global/u2/m/madams/petsc/src/ksp/pc/interface/precon.c:994
[0]PETSC ERROR: #9 KSPSetUp() at /global/u2/m/madams/petsc/src/ksp/ksp/interface/itfunc.c:406
[0]PETSC ERROR: #10 KSPSolve_Private() at /global/u2/m/madams/petsc/src/ksp/ksp/interface/itfunc.c:825
[0]PETSC ERROR: #11 KSPSolve() at /global/u2/m/madams/petsc/src/ksp/ksp/interface/itfunc.c:1071
[0]PETSC ERROR: #12 SNESSolve_KSPONLY() at /global/u2/m/madams/petsc/src/snes/impls/ksponly/ksponly.c:48
[0]PETSC ERROR: #13 SNESSolve() at /global/u2/m/madams/petsc/src/snes/interface/snes.c:4689
[0]PETSC ERROR: #14 main() at ex13.c:178
[0]PETSC ERROR: PETSc Option Table entries:
[0]PETSC ERROR: -benchmark_it 10
[0]PETSC ERROR: -dm_distribute
[0]PETSC ERROR: -dm_mat_type aijkokkos
[0]PETSC ERROR: -dm_plex_box_faces 4,4,4
[0]PETSC ERROR: -dm_plex_box_lower 0,0,0
[0]PETSC ERROR: -dm_plex_box_upper 2,2,2
[0]PETSC ERROR: -dm_plex_dim 3
[0]PETSC ERROR: -dm_plex_simplex 0
[0]PETSC ERROR: -dm_refine 4
[0]PETSC ERROR: -dm_vec_type kokkos
[0]PETSC ERROR: -dm_view
[0]PETSC ERROR: -ksp_converged_reason
[0]PETSC ERROR: -ksp_max_it 200
[0]PETSC ERROR: -ksp_norm_type unpreconditioned
[0]PETSC ERROR: -ksp_rtol 1.e-12
[0]PETSC ERROR: -ksp_type cg
[0]PETSC ERROR: -log_view
[0]PETSC ERROR: -mat_type aijkokkos
[0]PETSC ERROR: -mg_levels_ksp_chebyshev_esteig 0,0.05,0,1.05
[0]PETSC ERROR: -mg_levels_ksp_type chebyshev
[0]PETSC ERROR: -mg_levels_pc_type jacobi
[0]PETSC ERROR: -pc_gamg_aggressive_coarsening 1
[0]PETSC ERROR: -pc_gamg_coarse_eq_limit 100
[0]PETSC ERROR: -pc_gamg_coarse_grid_layout_type compact
[0]PETSC ERROR: -pc_gamg_esteig_ksp_max_it 10
[0]PETSC ERROR: -pc_gamg_esteig_ksp_type cg
[0]PETSC ERROR: -pc_gamg_process_eq_limit 400
[0]PETSC ERROR: -pc_gamg_repartition false
[0]PETSC ERROR: -pc_gamg_reuse_interpolation true
[0]PETSC ERROR: -pc_gamg_threshold 0.01
[0]PETSC ERROR: -pc_type gamg
[0]PETSC ERROR: -petscpartitioner_simple_node_grid 2,2,2
[The same "No method productsymbolic for Mat of type (null)" error message, Configure options, and stack trace were also printed by ranks 1, 2, 4, 6, 8, 9, 22, 25, 33, 38, 41, 43, 44, 52, 54, and 62 on nodes nid001116 through nid001129; the duplicated, interleaved output has been elided.]
[44]PETSC ERROR: Petsc Development GIT revision: v3.18.1-122-g50ed725bd85 GIT Date: 2022-10-28 19:54:01 +0000 [8]PETSC ERROR: #13 SNESSolve() at /global/u2/m/madams/petsc/src/snes/interface/snes.c:4689 [8]PETSC ERROR: #14 main() at ex13.c:178 [8]PETSC ERROR: PETSc Option Table entries: [8]PETSC ERROR: -benchmark_it 10 [8]PETSC ERROR: -dm_distribute [8]PETSC ERROR: -dm_mat_type aijkokkos [8]PETSC ERROR: -dm_plex_box_faces 4,4,4 [8]PETSC ERROR: -dm_plex_box_lower 0,0,0 [8]PETSC ERROR: -dm_plex_box_upper 2,2,2 [8]PETSC ERROR: -dm_plex_dim 3 [8]PETSC ERROR: -dm_plex_simplex 0 [8]PETSC ERROR: -dm_refine 4 [8]PETSC ERROR: -dm_vec_type kokkos [8]PETSC ERROR: -dm_view [8]PETSC ERROR: -ksp_converged_reason [8]PETSC ERROR: -ksp_max_it 200 [8]PETSC ERROR: -ksp_norm_type unpreconditioned [8]PETSC ERROR: -ksp_rtol 1.e-12 [8]PETSC ERROR: -ksp_type cg [8]PETSC ERROR: -log_view [8]PETSC ERROR: -mat_type aijkokkos [8]PETSC ERROR: -mg_levels_ksp_chebyshev_esteig 0,0.05,0,1.05 [8]PETSC ERROR: -mg_levels_ksp_type chebyshev [8]PETSC ERROR: -mg_levels_pc_type jacobi [8]PETSC ERROR: -pc_gamg_aggressive_coarsening 1 [8]PETSC ERROR: -pc_gamg_coarse_eq_limit 100 [8]PETSC ERROR: -pc_gamg_coarse_grid_layout_type compact [8]PETSC ERROR: -pc_gamg_esteig_ksp_max_it 10 [8]PETSC ERROR: -pc_gamg_esteig_ksp_type cg [8]PETSC ERROR: -pc_gamg_process_eq_limit 400 [6]PETSC ERROR: #2 MatProductSymbolic_MPIAIJKokkos() at /global/u2/m/madams/petsc/src/mat/impls/aij/mpi/kokkos/mpiaijkok.kokkos.cxx:1138 [6]PETSC ERROR: #3 MatProductSymbolic() at /global/u2/m/madams/petsc/src/mat/interface/matproduct.c:793 [6]PETSC ERROR: #4 MatProduct_Private() at /global/u2/m/madams/petsc/src/mat/interface/matrix.c:9820 [6]PETSC ERROR: #5 MatMatMult() at /global/u2/m/madams/petsc/src/mat/interface/matrix.c:9897 [6]PETSC ERROR: #6 PCGAMGOptProlongator_AGG() at /global/u2/m/madams/petsc/src/ksp/pc/impls/gamg/agg.c:769 [6]PETSC ERROR: #7 PCSetUp_GAMG() at /global/u2/m/madams/petsc/src/ksp/pc/impls/gamg/gamg.c:639 [6]PETSC 
ERROR: #8 PCSetUp() at /global/u2/m/madams/petsc/src/ksp/pc/interface/precon.c:994 [6]PETSC ERROR: #9 KSPSetUp() at /global/u2/m/madams/petsc/src/ksp/ksp/interface/itfunc.c:406 [6]PETSC ERROR: #10 KSPSolve_Private() at /global/u2/m/madams/petsc/src/ksp/ksp/interface/itfunc.c:825 [6]PETSC ERROR: #11 KSPSolve() at /global/u2/m/madams/petsc/src/ksp/ksp/interface/itfunc.c:1071 [6]PETSC ERROR: #12 SNESSolve_KSPONLY() at /global/u2/m/madams/petsc/src/snes/impls/ksponly/ksponly.c:48 [6]PETSC ERROR: #13 SNESSolve() at /global/u2/m/madams/petsc/src/snes/interface/snes.c:4689 [6]PETSC ERROR: #14 main() at ex13.c:178 [6]PETSC ERROR: PETSc Option Table entries: [0]PETSC ERROR: -petscpartitioner_simple_process_grid 2,2,2 [0]PETSC ERROR: -petscpartitioner_type simple [0]PETSC ERROR: -potential_petscspace_degree 2 [0]PETSC ERROR: -snes_lag_jacobian -2 [0]PETSC ERROR: -snes_max_it 1 [0]PETSC ERROR: -snes_rtol 1.e-8 [0]PETSC ERROR: -snes_type ksponly [0]PETSC ERROR: -use_gpu_aware_mpi 0 [0]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-maint at mcs.anl.gov---------- MPICH ERROR [Rank 0] [job id 3522949.0] [Sun Oct 30 05:30:12 2022] [nid001116] - Abort(56) (rank 0 in comm 16): application called MPI_Abort(MPI_COMM_SELF, 56) - process 0 [1]PETSC ERROR: #2 MatProductSymbolic_MPIAIJKokkos() at /global/u2/m/madams/petsc/src/mat/impls/aij/mpi/kokkos/mpiaijkok.kokkos.cxx:1138 [1]PETSC ERROR: #3 MatProductSymbolic() at /global/u2/m/madams/petsc/src/mat/interface/matproduct.c:793 [1]PETSC ERROR: #4 MatProduct_Private() at /global/u2/m/madams/petsc/src/mat/interface/matrix.c:9820 [1]PETSC ERROR: #5 MatMatMult() at /global/u2/m/madams/petsc/src/mat/interface/matrix.c:9897 [1]PETSC ERROR: #6 PCGAMGOptProlongator_AGG() at /global/u2/m/madams/petsc/src/ksp/pc/impls/gamg/agg.c:769 [1]PETSC ERROR: #7 PCSetUp_GAMG() at /global/u2/m/madams/petsc/src/ksp/pc/impls/gamg/gamg.c:639 [1]PETSC ERROR: #8 PCSetUp() at 
/global/u2/m/madams/petsc/src/ksp/pc/interface/precon.c:994 [1]PETSC ERROR: #9 KSPSetUp() at /global/u2/m/madams/petsc/src/ksp/ksp/interface/itfunc.c:406 [1]PETSC ERROR: #10 KSPSolve_Private() at /global/u2/m/madams/petsc/src/ksp/ksp/interface/itfunc.c:825 [1]PETSC ERROR: #11 KSPSolve() at /global/u2/m/madams/petsc/src/ksp/ksp/interface/itfunc.c:1071 [1]PETSC ERROR: #12 SNESSolve_KSPONLY() at /global/u2/m/madams/petsc/src/snes/impls/ksponly/ksponly.c:48 [1]PETSC ERROR: #13 SNESSolve() at /global/u2/m/madams/petsc/src/snes/interface/snes.c:4689 [1]PETSC ERROR: #14 main() at ex13.c:178 [1]PETSC ERROR: PETSc Option Table entries: [1]PETSC ERROR: -benchmark_it 10 [1]PETSC ERROR: -dm_distribute [1]PETSC ERROR: -dm_mat_type aijkokkos [1]PETSC ERROR: -dm_plex_box_faces 4,4,4 [1]PETSC ERROR: -dm_plex_box_lower 0,0,0 [1]PETSC ERROR: -dm_plex_box_upper 2,2,2 [1]PETSC ERROR: -dm_plex_dim 3 [1]PETSC ERROR: -dm_plex_simplex 0 [1]PETSC ERROR: -dm_refine 4 [1]PETSC ERROR: -dm_vec_type kokkos [1]PETSC ERROR: -dm_view [1]PETSC ERROR: -ksp_converged_reason [1]PETSC ERROR: -ksp_max_it 200 [1]PETSC ERROR: -ksp_norm_type unpreconditioned [1]PETSC ERROR: -ksp_rtol 1.e-12 [1]PETSC ERROR: -ksp_type cg [1]PETSC ERROR: -log_view [1]PETSC ERROR: -mat_type aijkokkos [1]PETSC ERROR: -mg_levels_ksp_chebyshev_esteig 0,0.05,0,1.05 [1]PETSC ERROR: -mg_levels_ksp_type chebyshev [1]PETSC ERROR: -mg_levels_pc_type jacobi [1]PETSC ERROR: -pc_gamg_aggressive_coarsening 1 [1]PETSC ERROR: -pc_gamg_coarse_eq_limit 100 [1]PETSC ERROR: -pc_gamg_coarse_grid_layout_type compact [1]PETSC ERROR: -pc_gamg_esteig_ksp_max_it 10 [1]PETSC ERROR: -pc_gamg_esteig_ksp_type cg [1]PETSC ERROR: -pc_gamg_process_eq_limit 400 [1]PETSC ERROR: -pc_gamg_repartition false [1]PETSC ERROR: -pc_gamg_reuse_interpolation true [1]PETSC ERROR: -pc_gamg_threshold 0.01 [1]PETSC ERROR: -pc_type gamg [1]PETSC ERROR: -petscpartitioner_simple_node_grid 2,2,2 [1]PETSC ERROR: -petscpartitioner_simple_process_grid 2,2,2 [1]PETSC ERROR: 
-petscpartitioner_type simple [1]PETSC ERROR: -potential_petscspace_degree 2 [1]PETSC ERROR: -snes_lag_jacobian -2 [1]PETSC ERROR: -snes_max_it 1 [1]PETSC ERROR: -snes_rtol 1.e-8 [1]PETSC ERROR: -snes_type ksponly [1]PETSC ERROR: -use_gpu_aware_mpi 0 [1]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-maint at mcs.anl.gov---------- [2]PETSC ERROR: #3 MatProductSymbolic() at /global/u2/m/madams/petsc/src/mat/interface/matproduct.c:793 [2]PETSC ERROR: #4 MatProduct_Private() at /global/u2/m/madams/petsc/src/mat/interface/matrix.c:9820 [2]PETSC ERROR: #5 MatMatMult() at /global/u2/m/madams/petsc/src/mat/interface/matrix.c:9897 [2]PETSC ERROR: #6 PCGAMGOptProlongator_AGG() at /global/u2/m/madams/petsc/src/ksp/pc/impls/gamg/agg.c:769 [2]PETSC ERROR: #7 PCSetUp_GAMG() at /global/u2/m/madams/petsc/src/ksp/pc/impls/gamg/gamg.c:639 [2]PETSC ERROR: #8 PCSetUp() at /global/u2/m/madams/petsc/src/ksp/pc/interface/precon.c:994 [2]PETSC ERROR: #9 KSPSetUp() at /global/u2/m/madams/petsc/src/ksp/ksp/interface/itfunc.c:406 [2]PETSC ERROR: #10 KSPSolve_Private() at /global/u2/m/madams/petsc/src/ksp/ksp/interface/itfunc.c:825 [2]PETSC ERROR: #11 KSPSolve() at /global/u2/m/madams/petsc/src/ksp/ksp/interface/itfunc.c:1071 [2]PETSC ERROR: #12 SNESSolve_KSPONLY() at /global/u2/m/madams/petsc/src/snes/impls/ksponly/ksponly.c:48 [22]PETSC ERROR: #11 KSPSolve() at /global/u2/m/madams/petsc/src/ksp/ksp/interface/itfunc.c:1071 [22]PETSC ERROR: #12 SNESSolve_KSPONLY() at /global/u2/m/madams/petsc/src/snes/impls/ksponly/ksponly.c:48 [22]PETSC ERROR: #13 SNESSolve() at /global/u2/m/madams/petsc/src/snes/interface/snes.c:4689 [22]PETSC ERROR: #14 main() at ex13.c:178 [22]PETSC ERROR: PETSc Option Table entries: [22]PETSC ERROR: -benchmark_it 10 [22]PETSC ERROR: -dm_distribute [22]PETSC ERROR: -dm_mat_type aijkokkos [22]PETSC ERROR: -dm_plex_box_faces 4,4,4 [22]PETSC ERROR: -dm_plex_box_lower 0,0,0 [25]PETSC ERROR: 
/global/u2/m/madams/petsc/src/snes/tests/data/../ex13 on a arch-perlmutter-opt-gcc-kokkos-cuda named nid001121 by madams Sun Oct 30 05:30:05 2022 [52]PETSC ERROR: Configure options --CFLAGS=" -g -DPETSC_HAVE_KOKKOS_KERNELS_GMRES" --CXXFLAGS=" -g -DPETSC_HAVE_KOKKOS_KERNELS_GMRES" --CUDAFLAGS="-g -Xcompiler -rdynamic" --with-cc=cc --with-cxx=CC --with-fc=ftn --download-hypre=1 --download-amgx --with-fortran-bindings=0 --COPTFLAGS=" -O" --CXXOPTFLAGS=" -O" --FOPTFLAGS=" -O" --with-debugging=0 --with-cuda=1 --with-cuda-arch=80 --with-mpiexec=srun --with-batch=0 --download-p4est=1 --with-zlib=1 --download-kokkos --download-kokkos-kernels --with-kokkos-kernels-tpl=0 --with-make-np=8 PETSC_ARCH=arch-perlmutter-opt-gcc-kokkos-cuda [52]PETSC ERROR: #1 MatProductSymbolic_MPIAIJKokkos_AB() at /global/u2/m/madams/petsc/src/mat/impls/aij/mpi/kokkos/mpiaijkok.kokkos.cxx:918 [52]PETSC ERROR: #2 MatProductSymbolic_MPIAIJKokkos() at /global/u2/m/madams/petsc/src/mat/impls/aij/mpi/kokkos/mpiaijkok.kokkos.cxx:1138 [52]PETSC ERROR: #3 MatProductSymbolic() at /global/u2/m/madams/petsc/src/mat/interface/matproduct.c:793 [2]PETSC ERROR: #13 SNESSolve() at /global/u2/m/madams/petsc/src/snes/interface/snes.c:4689 [2]PETSC ERROR: #14 main() at ex13.c:178 [2]PETSC ERROR: PETSc Option Table entries: [2]PETSC ERROR: -benchmark_it 10 [2]PETSC ERROR: -dm_distribute [2]PETSC ERROR: -dm_mat_type aijkokkos [2]PETSC ERROR: -dm_plex_box_faces 4,4,4 [2]PETSC ERROR: -dm_plex_box_lower 0,0,0 [2]PETSC ERROR: -dm_plex_box_upper 2,2,2 [2]PETSC ERROR: -dm_plex_dim 3 [2]PETSC ERROR: -dm_plex_simplex 0 [2]PETSC ERROR: -dm_refine 4 [2]PETSC ERROR: -dm_vec_type kokkos [2]PETSC ERROR: -dm_view [2]PETSC ERROR: -ksp_converged_reason [2]PETSC ERROR: -ksp_max_it 200 [2]PETSC ERROR: -ksp_norm_type unpreconditioned [2]PETSC ERROR: -ksp_rtol 1.e-12 [2]PETSC ERROR: -ksp_type cg [2]PETSC ERROR: -log_view [2]PETSC ERROR: -mat_type aijkokkos [2]PETSC ERROR: -mg_levels_ksp_chebyshev_esteig 0,0.05,0,1.05 [2]PETSC ERROR: 
-mg_levels_ksp_type chebyshev [2]PETSC ERROR: -mg_levels_pc_type jacobi [2]PETSC ERROR: -pc_gamg_aggressive_coarsening 1 [26]PETSC ERROR: --------------------- Error Message -------------------------------------------------------------- [26]PETSC ERROR: No support for this operation for this object type [26]PETSC ERROR: No method productsymbolic for Mat of type (null) [26]PETSC ERROR: WARNING! There are option(s) set that were not used! Could be the program crashed before they were used or a spelling mistake, etc! [26]PETSC ERROR: Option left: name:-ksp_converged_reason (no value) [26]PETSC ERROR: Option left: name:-mg_levels_ksp_chebyshev_esteig value: 0,0.05,0,1.05 [26]PETSC ERROR: Option left: name:-mg_levels_ksp_type value: chebyshev [26]PETSC ERROR: Option left: name:-mg_levels_pc_type value: jacobi [26]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting. [26]PETSC ERROR: Petsc Development GIT revision: v3.18.1-122-g50ed725bd85 GIT Date: 2022-10-28 19:54:01 +0000 [52]PETSC ERROR: #4 MatProduct_Private() at /global/u2/m/madams/petsc/src/mat/interface/matrix.c:9820 [52]PETSC ERROR: #5 MatMatMult() at /global/u2/m/madams/petsc/src/mat/interface/matrix.c:9897 [52]PETSC ERROR: #6 PCGAMGOptProlongator_AGG() at /global/u2/m/madams/petsc/src/ksp/pc/impls/gamg/agg.c:769 [52]PETSC ERROR: #7 PCSetUp_GAMG() at /global/u2/m/madams/petsc/src/ksp/pc/impls/gamg/gamg.c:639 [52]PETSC ERROR: #8 PCSetUp() at /global/u2/m/madams/petsc/src/ksp/pc/interface/precon.c:994 [52]PETSC ERROR: #9 KSPSetUp() at /global/u2/m/madams/petsc/src/ksp/ksp/interface/itfunc.c:406 [52]PETSC ERROR: #10 KSPSolve_Private() at /global/u2/m/madams/petsc/src/ksp/ksp/interface/itfunc.c:825 [52]PETSC ERROR: #11 KSPSolve() at /global/u2/m/madams/petsc/src/ksp/ksp/interface/itfunc.c:1071 [52]PETSC ERROR: #12 SNESSolve_KSPONLY() at /global/u2/m/madams/petsc/src/snes/impls/ksponly/ksponly.c:48 [52]PETSC ERROR: #13 SNESSolve() at /global/u2/m/madams/petsc/src/snes/interface/snes.c:4689 [44]PETSC 
ERROR: /global/u2/m/madams/petsc/src/snes/tests/data/../ex13 on a arch-perlmutter-opt-gcc-kokkos-cuda named nid001125 by madams Sun Oct 30 05:30:05 2022 [44]PETSC ERROR: Configure options --CFLAGS=" -g -DPETSC_HAVE_KOKKOS_KERNELS_GMRES" --CXXFLAGS=" -g -DPETSC_HAVE_KOKKOS_KERNELS_GMRES" --CUDAFLAGS="-g -Xcompiler -rdynamic" --with-cc=cc --with-cxx=CC --with-fc=ftn --download-hypre=1 --download-amgx --with-fortran-bindings=0 --COPTFLAGS=" -O" --CXXOPTFLAGS=" -O" --FOPTFLAGS=" -O" --with-debugging=0 --with-cuda=1 --with-cuda-arch=80 --with-mpiexec=srun --with-batch=0 --download-p4est=1 --with-zlib=1 --download-kokkos --download-kokkos-kernels --with-kokkos-kernels-tpl=0 --with-make-np=8 PETSC_ARCH=arch-perlmutter-opt-gcc-kokkos-cuda [44]PETSC ERROR: #1 MatProductSymbolic_MPIAIJKokkos_AB() at /global/u2/m/madams/petsc/src/mat/impls/aij/mpi/kokkos/mpiaijkok.kokkos.cxx:918 [2]PETSC ERROR: -pc_gamg_coarse_eq_limit 100 [2]PETSC ERROR: -pc_gamg_coarse_grid_layout_type compact [2]PETSC ERROR: -pc_gamg_esteig_ksp_max_it 10 [2]PETSC ERROR: -pc_gamg_esteig_ksp_type cg [2]PETSC ERROR: -pc_gamg_process_eq_limit 400 [2]PETSC ERROR: -pc_gamg_repartition false [2]PETSC ERROR: -pc_gamg_reuse_interpolation true [2]PETSC ERROR: -pc_gamg_threshold 0.01 [2]PETSC ERROR: -pc_type gamg [2]PETSC ERROR: -petscpartitioner_simple_node_grid 2,2,2 [2]PETSC ERROR: -petscpartitioner_simple_process_grid 2,2,2 [2]PETSC ERROR: -petscpartitioner_type simple [2]PETSC ERROR: -potential_petscspace_degree 2 [2]PETSC ERROR: -snes_lag_jacobian -2 [2]PETSC ERROR: -snes_max_it 1 [2]PETSC ERROR: -snes_rtol 1.e-8 [2]PETSC ERROR: -snes_type ksponly [2]PETSC ERROR: -use_gpu_aware_mpi 0 [2]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-maint at mcs.anl.gov---------- [52]PETSC ERROR: #14 main() at ex13.c:178 [52]PETSC ERROR: PETSc Option Table entries: [52]PETSC ERROR: -benchmark_it 10 [52]PETSC ERROR: -dm_distribute [52]PETSC ERROR: -dm_mat_type aijkokkos [4]PETSC 
ERROR: #8 PCSetUp() at /global/u2/m/madams/petsc/src/ksp/pc/interface/precon.c:994 [4]PETSC ERROR: #9 KSPSetUp() at /global/u2/m/madams/petsc/src/ksp/ksp/interface/itfunc.c:406 [4]PETSC ERROR: #10 KSPSolve_Private() at /global/u2/m/madams/petsc/src/ksp/ksp/interface/itfunc.c:825 [4]PETSC ERROR: #11 KSPSolve() at /global/u2/m/madams/petsc/src/ksp/ksp/interface/itfunc.c:1071 [4]PETSC ERROR: #12 SNESSolve_KSPONLY() at /global/u2/m/madams/petsc/src/snes/impls/ksponly/ksponly.c:48 [4]PETSC ERROR: #13 SNESSolve() at /global/u2/m/madams/petsc/src/snes/interface/snes.c:4689 [4]PETSC ERROR: #14 main() at ex13.c:178 [4]PETSC ERROR: PETSc Option Table entries: [4]PETSC ERROR: -benchmark_it 10 [4]PETSC ERROR: -dm_distribute [4]PETSC ERROR: -dm_mat_type aijkokkos [4]PETSC ERROR: -dm_plex_box_faces 4,4,4 [4]PETSC ERROR: -dm_plex_box_lower 0,0,0 [4]PETSC ERROR: -dm_plex_box_upper 2,2,2 [4]PETSC ERROR: -dm_plex_dim 3 [4]PETSC ERROR: -dm_plex_simplex 0 [4]PETSC ERROR: -dm_refine 4 [54]PETSC ERROR: #9 KSPSetUp() at /global/u2/m/madams/petsc/src/ksp/ksp/interface/itfunc.c:406 [54]PETSC ERROR: #10 KSPSolve_Private() at /global/u2/m/madams/petsc/src/ksp/ksp/interface/itfunc.c:825 [54]PETSC ERROR: #11 KSPSolve() at /global/u2/m/madams/petsc/src/ksp/ksp/interface/itfunc.c:1071 [54]PETSC ERROR: #12 SNESSolve_KSPONLY() at /global/u2/m/madams/petsc/src/snes/impls/ksponly/ksponly.c:48 [54]PETSC ERROR: #13 SNESSolve() at /global/u2/m/madams/petsc/src/snes/interface/snes.c:4689 [54]PETSC ERROR: #14 main() at ex13.c:178 [54]PETSC ERROR: PETSc Option Table entries: [54]PETSC ERROR: -benchmark_it 10 [54]PETSC ERROR: -dm_distribute [54]PETSC ERROR: -dm_mat_type aijkokkos [54]PETSC ERROR: -dm_plex_box_faces 4,4,4 [54]PETSC ERROR: -dm_plex_box_lower 0,0,0 [54]PETSC ERROR: -dm_plex_box_upper 2,2,2 [54]PETSC ERROR: -dm_plex_dim 3 [54]PETSC ERROR: -dm_plex_simplex 0 [54]PETSC ERROR: -dm_refine 4 [54]PETSC ERROR: -dm_vec_type kokkos [54]PETSC ERROR: -dm_view [54]PETSC ERROR: -ksp_converged_reason 
[4]PETSC ERROR: -dm_vec_type kokkos [4]PETSC ERROR: -dm_view [4]PETSC ERROR: -ksp_converged_reason [4]PETSC ERROR: -ksp_max_it 200 [4]PETSC ERROR: -ksp_norm_type unpreconditioned [4]PETSC ERROR: -ksp_rtol 1.e-12 [4]PETSC ERROR: -ksp_type cg [4]PETSC ERROR: -log_view [4]PETSC ERROR: -mat_type aijkokkos [4]PETSC ERROR: -mg_levels_ksp_chebyshev_esteig 0,0.05,0,1.05 [4]PETSC ERROR: -mg_levels_ksp_type chebyshev [4]PETSC ERROR: -mg_levels_pc_type jacobi [4]PETSC ERROR: -pc_gamg_aggressive_coarsening 1 [4]PETSC ERROR: -pc_gamg_coarse_eq_limit 100 [4]PETSC ERROR: -pc_gamg_coarse_grid_layout_type compact [4]PETSC ERROR: -pc_gamg_esteig_ksp_max_it 10 [4]PETSC ERROR: -pc_gamg_esteig_ksp_type cg [4]PETSC ERROR: -pc_gamg_process_eq_limit 400 [4]PETSC ERROR: -pc_gamg_repartition false [4]PETSC ERROR: -pc_gamg_reuse_interpolation true [4]PETSC ERROR: -pc_gamg_threshold 0.01 [4]PETSC ERROR: -pc_type gamg [4]PETSC ERROR: -petscpartitioner_simple_node_grid 2,2,2 [4]PETSC ERROR: -petscpartitioner_simple_process_grid 2,2,2 [54]PETSC ERROR: -ksp_max_it 200 [54]PETSC ERROR: -ksp_norm_type unpreconditioned [54]PETSC ERROR: -ksp_rtol 1.e-12 [54]PETSC ERROR: -ksp_type cg [4]PETSC ERROR: -petscpartitioner_type simple [4]PETSC ERROR: -potential_petscspace_degree 2 [4]PETSC ERROR: -snes_lag_jacobian -2 [4]PETSC ERROR: -snes_max_it 1 [4]PETSC ERROR: -snes_rtol 1.e-8 [4]PETSC ERROR: -snes_type ksponly [4]PETSC ERROR: -use_gpu_aware_mpi 0 [4]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-maint at mcs.anl.gov---------- [6]PETSC ERROR: -benchmark_it 10 [6]PETSC ERROR: -dm_distribute [6]PETSC ERROR: -dm_mat_type aijkokkos [6]PETSC ERROR: -dm_plex_box_faces 4,4,4 [6]PETSC ERROR: -dm_plex_box_lower 0,0,0 [6]PETSC ERROR: -dm_plex_box_upper 2,2,2 [6]PETSC ERROR: -dm_plex_dim 3 [6]PETSC ERROR: -dm_plex_simplex 0 [6]PETSC ERROR: -dm_refine 4 [6]PETSC ERROR: -dm_vec_type kokkos [6]PETSC ERROR: -dm_view [6]PETSC ERROR: -ksp_converged_reason [6]PETSC ERROR: 
-ksp_max_it 200 [6]PETSC ERROR: -ksp_norm_type unpreconditioned [6]PETSC ERROR: -ksp_rtol 1.e-12 [6]PETSC ERROR: -ksp_type cg [6]PETSC ERROR: -log_view [6]PETSC ERROR: -mat_type aijkokkos [6]PETSC ERROR: -mg_levels_ksp_chebyshev_esteig 0,0.05,0,1.05 [6]PETSC ERROR: -mg_levels_ksp_type chebyshev [6]PETSC ERROR: -mg_levels_pc_type jacobi [6]PETSC ERROR: -pc_gamg_aggressive_coarsening 1 [6]PETSC ERROR: -pc_gamg_coarse_eq_limit 100 [6]PETSC ERROR: -pc_gamg_coarse_grid_layout_type compact [6]PETSC ERROR: -pc_gamg_esteig_ksp_max_it 10 [6]PETSC ERROR: -pc_gamg_esteig_ksp_type cg [6]PETSC ERROR: -pc_gamg_process_eq_limit 400 [6]PETSC ERROR: -pc_gamg_repartition false [6]PETSC ERROR: -pc_gamg_reuse_interpolation true [6]PETSC ERROR: -pc_gamg_threshold 0.01 [6]PETSC ERROR: -pc_type gamg [6]PETSC ERROR: -petscpartitioner_simple_node_grid 2,2,2 [6]PETSC ERROR: -petscpartitioner_simple_process_grid 2,2,2 [6]PETSC ERROR: -petscpartitioner_type simple [6]PETSC ERROR: -potential_petscspace_degree 2 [6]PETSC ERROR: -snes_lag_jacobian -2 [6]PETSC ERROR: -snes_max_it 1 [6]PETSC ERROR: -snes_rtol 1.e-8 [6]PETSC ERROR: -snes_type ksponly [6]PETSC ERROR: -use_gpu_aware_mpi 0 [6]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-maint at mcs.anl.gov---------- MPICH ERROR [Rank 6] [job id 3522949.0] [Sun Oct 30 05:30:12 2022] [nid001116] - Abort(56) (rank 0 in comm 16): application called MPI_Abort(MPI_COMM_SELF, 56) - process 0 Kokkos::Cuda ERROR: Failed to call Kokkos::Cuda::finalize() MPICH ERROR [Rank 1] [job id 3522949.0] [Sun Oct 30 05:30:12 2022] [nid001116] - Abort(56) (rank 0 in comm 16): application called MPI_Abort(MPI_COMM_SELF, 56) - process 0 Kokkos::Cuda ERROR: Failed to call Kokkos::Cuda::finalize() MPICH ERROR [Rank 2] [job id 3522949.0] [Sun Oct 30 05:30:12 2022] [nid001116] - Abort(56) (rank 0 in comm 16): application called MPI_Abort(MPI_COMM_SELF, 56) - process 0 [3]PETSC ERROR: --------------------- Error Message 
-------------------------------------------------------------- [3]PETSC ERROR: No support for this operation for this object type [3]PETSC ERROR: No method productsymbolic for Mat of type (null) [3]PETSC ERROR: WARNING! There are option(s) set that were not used! Could be the program crashed before they were used or a spelling mistake, etc! [3]PETSC ERROR: Option left: name:-ksp_converged_reason (no value) [3]PETSC ERROR: Option left: name:-mg_levels_ksp_chebyshev_esteig value: 0,0.05,0,1.05 [3]PETSC ERROR: Option left: name:-mg_levels_ksp_type value: chebyshev [3]PETSC ERROR: Option left: name:-mg_levels_pc_type value: jacobi [3]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting. [3]PETSC ERROR: Petsc Development GIT revision: v3.18.1-122-g50ed725bd85 GIT Date: 2022-10-28 19:54:01 +0000 [3]PETSC ERROR: /global/u2/m/madams/petsc/src/snes/tests/data/../ex13 on a arch-perlmutter-opt-gcc-kokkos-cuda named nid001116 by madams Sun Oct 30 05:30:05 2022 [3]PETSC ERROR: Configure options --CFLAGS=" -g -DPETSC_HAVE_KOKKOS_KERNELS_GMRES" --CXXFLAGS=" -g -DPETSC_HAVE_KOKKOS_KERNELS_GMRES" --CUDAFLAGS="-g -Xcompiler -rdynamic" --with-cc=cc --with-cxx=CC --with-fc=ftn --download-hypre=1 --download-amgx --with-fortran-bindings=0 --COPTFLAGS=" -O" --CXXOPTFLAGS=" -O" --FOPTFLAGS=" -O" --with-debugging=0 --with-cuda=1 --with-cuda-arch=80 --with-mpiexec=srun --with-batch=0 --download-p4est=1 --with-zlib=1 --download-kokkos --download-kokkos-kernels --with-kokkos-kernels-tpl=0 --with-make-np=8 PETSC_ARCH=arch-perlmutter-opt-gcc-kokkos-cuda [3]PETSC ERROR: #1 MatProductSymbolic_MPIAIJKokkos_AB() at /global/u2/m/madams/petsc/src/mat/impls/aij/mpi/kokkos/mpiaijkok.kokkos.cxx:918 [3]PETSC ERROR: #2 MatProductSymbolic_MPIAIJKokkos() at /global/u2/m/madams/petsc/src/mat/impls/aij/mpi/kokkos/mpiaijkok.kokkos.cxx:1138 [3]PETSC ERROR: #3 MatProductSymbolic() at /global/u2/m/madams/petsc/src/mat/interface/matproduct.c:793 [3]PETSC ERROR: #4 MatProduct_Private() at 
/global/u2/m/madams/petsc/src/mat/interface/matrix.c:9820 [3]PETSC ERROR: #5 MatMatMult() at /global/u2/m/madams/petsc/src/mat/interface/matrix.c:9897 [3]PETSC ERROR: #6 PCGAMGOptProlongator_AGG() at /global/u2/m/madams/petsc/src/ksp/pc/impls/gamg/agg.c:769 [3]PETSC ERROR: #7 PCSetUp_GAMG() at /global/u2/m/madams/petsc/src/ksp/pc/impls/gamg/gamg.c:639 [3]PETSC ERROR: #8 PCSetUp() at /global/u2/m/madams/petsc/src/ksp/pc/interface/precon.c:994 [3]PETSC ERROR: #9 KSPSetUp() at /global/u2/m/madams/petsc/src/ksp/ksp/interface/itfunc.c:406 [3]PETSC ERROR: #10 KSPSolve_Private() at /global/u2/m/madams/petsc/src/ksp/ksp/interface/itfunc.c:825 [3]PETSC ERROR: #11 KSPSolve() at /global/u2/m/madams/petsc/src/ksp/ksp/interface/itfunc.c:1071 [3]PETSC ERROR: #12 SNESSolve_KSPONLY() at /global/u2/m/madams/petsc/src/snes/impls/ksponly/ksponly.c:48 [3]PETSC ERROR: #13 SNESSolve() at /global/u2/m/madams/petsc/src/snes/interface/snes.c:4689 [3]PETSC ERROR: #14 main() at ex13.c:178 [3]PETSC ERROR: PETSc Option Table entries: [3]PETSC ERROR: -benchmark_it 10 [3]PETSC ERROR: -dm_distribute [3]PETSC ERROR: -dm_mat_type aijkokkos [3]PETSC ERROR: -dm_plex_box_faces 4,4,4 [3]PETSC ERROR: -dm_plex_box_lower 0,0,0 [3]PETSC ERROR: -dm_plex_box_upper 2,2,2 [3]PETSC ERROR: -dm_plex_dim 3 [3]PETSC ERROR: -dm_plex_simplex 0 [3]PETSC ERROR: -dm_refine 4 [3]PETSC ERROR: -dm_vec_type kokkos [3]PETSC ERROR: -dm_view [3]PETSC ERROR: -ksp_converged_reason [57]PETSC ERROR: --------------------- Error Message -------------------------------------------------------------- [57]PETSC ERROR: No support for this operation for this object type [57]PETSC ERROR: No method productsymbolic for Mat of type (null) [57]PETSC ERROR: WARNING! There are option(s) set that were not used! Could be the program crashed before they were used or a spelling mistake, etc! 
[57]PETSC ERROR: Option left: name:-ksp_converged_reason (no value) [57]PETSC ERROR: Option left: name:-mg_levels_ksp_chebyshev_esteig value: 0,0.05,0,1.05 [57]PETSC ERROR: Option left: name:-mg_levels_ksp_type value: chebyshev [57]PETSC ERROR: Option left: name:-mg_levels_pc_type value: jacobi [57]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting. [57]PETSC ERROR: Petsc Development GIT revision: v3.18.1-122-g50ed725bd85 GIT Date: 2022-10-28 19:54:01 +0000 [9]PETSC ERROR: Petsc Development GIT revision: v3.18.1-122-g50ed725bd85 GIT Date: 2022-10-28 19:54:01 +0000 [9]PETSC ERROR: /global/u2/m/madams/petsc/src/snes/tests/data/../ex13 on a arch-perlmutter-opt-gcc-kokkos-cuda named nid001117 by madams Sun Oct 30 05:30:05 2022 [9]PETSC ERROR: Configure options --CFLAGS=" -g -DPETSC_HAVE_KOKKOS_KERNELS_GMRES" --CXXFLAGS=" -g -DPETSC_HAVE_KOKKOS_KERNELS_GMRES" --CUDAFLAGS="-g -Xcompiler -rdynamic" --with-cc=cc --with-cxx=CC --with-fc=ftn --download-hypre=1 --download-amgx --with-fortran-bindings=0 --COPTFLAGS=" -O" --CXXOPTFLAGS=" -O" --FOPTFLAGS=" -O" --with-debugging=0 --with-cuda=1 --with-cuda-arch=80 --with-mpiexec=srun --with-batch=0 --download-p4est=1 --with-zlib=1 --download-kokkos --download-kokkos-kernels --with-kokkos-kernels-tpl=0 --with-make-np=8 PETSC_ARCH=arch-perlmutter-opt-gcc-kokkos-cuda [9]PETSC ERROR: #1 MatProductSymbolic_MPIAIJKokkos_AB() at /global/u2/m/madams/petsc/src/mat/impls/aij/mpi/kokkos/mpiaijkok.kokkos.cxx:918 MPICH ERROR [Rank 4] [job id 3522949.0] [Sun Oct 30 05:30:12 2022] [nid001116] - Abort(56) (rank 0 in comm 16): application called MPI_Abort(MPI_COMM_SELF, 56) - process 0 [9]PETSC ERROR: #2 MatProductSymbolic_MPIAIJKokkos() at /global/u2/m/madams/petsc/src/mat/impls/aij/mpi/kokkos/mpiaijkok.kokkos.cxx:1138 [9]PETSC ERROR: #3 MatProductSymbolic() at /global/u2/m/madams/petsc/src/mat/interface/matproduct.c:793 [9]PETSC ERROR: #4 MatProduct_Private() at /global/u2/m/madams/petsc/src/mat/interface/matrix.c:9820 [9]PETSC 
[... identical error output from ranks 3, 7, 9, 10, 13, 14, 17, 23, 26, 30, 35, 36, 38, 44, 46, 49, 51, and 59 (nodes nid001116, nid001117, nid001121, nid001124, nid001128, nid001129) elided; one representative copy, from rank 5, follows ...]

[5]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[5]PETSC ERROR: No support for this operation for this object type
[5]PETSC ERROR: No method productsymbolic for Mat of type (null)
[5]PETSC ERROR: WARNING! There are option(s) set that were not used! Could be the program crashed before they were used or a spelling mistake, etc!
[5]PETSC ERROR: Option left: name:-ksp_converged_reason (no value)
[5]PETSC ERROR: Option left: name:-mg_levels_ksp_chebyshev_esteig value: 0,0.05,0,1.05
[5]PETSC ERROR: Option left: name:-mg_levels_ksp_type value: chebyshev
[5]PETSC ERROR: Option left: name:-mg_levels_pc_type value: jacobi
[5]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
[5]PETSC ERROR: Petsc Development GIT revision: v3.18.1-122-g50ed725bd85 GIT Date: 2022-10-28 19:54:01 +0000
[5]PETSC ERROR: /global/u2/m/madams/petsc/src/snes/tests/data/../ex13 on a arch-perlmutter-opt-gcc-kokkos-cuda named nid001116 by madams Sun Oct 30 05:30:05 2022
[5]PETSC ERROR: Configure options --CFLAGS=" -g -DPETSC_HAVE_KOKKOS_KERNELS_GMRES" --CXXFLAGS=" -g -DPETSC_HAVE_KOKKOS_KERNELS_GMRES" --CUDAFLAGS="-g -Xcompiler -rdynamic" --with-cc=cc --with-cxx=CC --with-fc=ftn --download-hypre=1 --download-amgx --with-fortran-bindings=0 --COPTFLAGS=" -O" --CXXOPTFLAGS=" -O" --FOPTFLAGS=" -O" --with-debugging=0 --with-cuda=1 --with-cuda-arch=80 --with-mpiexec=srun --with-batch=0 --download-p4est=1 --with-zlib=1 --download-kokkos --download-kokkos-kernels --with-kokkos-kernels-tpl=0 --with-make-np=8 PETSC_ARCH=arch-perlmutter-opt-gcc-kokkos-cuda
[5]PETSC ERROR: #1 MatProductSymbolic_MPIAIJKokkos_AB() at /global/u2/m/madams/petsc/src/mat/impls/aij/mpi/kokkos/mpiaijkok.kokkos.cxx:918
[5]PETSC ERROR: #2 MatProductSymbolic_MPIAIJKokkos() at /global/u2/m/madams/petsc/src/mat/impls/aij/mpi/kokkos/mpiaijkok.kokkos.cxx:1138
[5]PETSC ERROR: #3 MatProductSymbolic() at /global/u2/m/madams/petsc/src/mat/interface/matproduct.c:793
[5]PETSC ERROR: #4 MatProduct_Private() at /global/u2/m/madams/petsc/src/mat/interface/matrix.c:9820
[5]PETSC ERROR: #5 MatMatMult() at /global/u2/m/madams/petsc/src/mat/interface/matrix.c:9897
[5]PETSC ERROR: #6 PCGAMGOptProlongator_AGG() at /global/u2/m/madams/petsc/src/ksp/pc/impls/gamg/agg.c:769
[5]PETSC ERROR: #7 PCSetUp_GAMG() at /global/u2/m/madams/petsc/src/ksp/pc/impls/gamg/gamg.c:639
[5]PETSC ERROR: #8 PCSetUp() at /global/u2/m/madams/petsc/src/ksp/pc/interface/precon.c:994
[5]PETSC ERROR: #9 KSPSetUp() at /global/u2/m/madams/petsc/src/ksp/ksp/interface/itfunc.c:406
[5]PETSC ERROR: #10 KSPSolve_Private() at /global/u2/m/madams/petsc/src/ksp/ksp/interface/itfunc.c:825
[5]PETSC ERROR: #11 KSPSolve() at /global/u2/m/madams/petsc/src/ksp/ksp/interface/itfunc.c:1071
[5]PETSC ERROR: #12 SNESSolve_KSPONLY() at /global/u2/m/madams/petsc/src/snes/impls/ksponly/ksponly.c:48
[5]PETSC ERROR: #13 SNESSolve() at /global/u2/m/madams/petsc/src/snes/interface/snes.c:4689
[5]PETSC ERROR: #14 main() at ex13.c:178
[5]PETSC ERROR: PETSc Option Table entries:
[5]PETSC ERROR: -benchmark_it 10
[5]PETSC ERROR: -dm_distribute
[5]PETSC ERROR: -dm_mat_type aijkokkos
[5]PETSC ERROR: -dm_plex_box_faces 4,4,4
[5]PETSC ERROR: -dm_plex_box_lower 0,0,0
[5]PETSC ERROR: -dm_plex_box_upper 2,2,2
[5]PETSC ERROR: -dm_plex_dim 3
[5]PETSC ERROR: -dm_plex_simplex 0
[5]PETSC ERROR: -dm_refine 4
[5]PETSC ERROR: -dm_vec_type kokkos
[5]PETSC ERROR: -dm_view
[5]PETSC ERROR: -ksp_converged_reason
[5]PETSC ERROR: -ksp_max_it 200
[5]PETSC ERROR: -ksp_norm_type unpreconditioned
[5]PETSC ERROR: -ksp_rtol 1.e-12
[5]PETSC ERROR: -ksp_type cg
[5]PETSC ERROR: -log_view
[5]PETSC ERROR: -mat_type aijkokkos
[5]PETSC ERROR: -mg_levels_ksp_chebyshev_esteig 0,0.05,0,1.05
[5]PETSC ERROR: -mg_levels_ksp_type chebyshev
[5]PETSC ERROR: -mg_levels_pc_type jacobi
[5]PETSC ERROR: -pc_gamg_aggressive_coarsening 1
[5]PETSC ERROR: -pc_gamg_coarse_eq_limit 100
[5]PETSC ERROR: -pc_gamg_coarse_grid_layout_type compact
[5]PETSC ERROR: -pc_gamg_esteig_ksp_max_it 10
[5]PETSC ERROR: -pc_gamg_esteig_ksp_type cg
[5]PETSC ERROR: -pc_gamg_process_eq_limit 400
[5]PETSC ERROR: -pc_gamg_repartition false
[5]PETSC ERROR: -pc_gamg_reuse_interpolation true
[5]PETSC ERROR: -pc_gamg_threshold 0.01
[5]PETSC ERROR: -pc_type gamg
[5]PETSC ERROR: -petscpartitioner_simple_node_grid 2,2,2
[5]PETSC ERROR: -petscpartitioner_simple_process_grid 2,2,2
[5]PETSC ERROR: -petscpartitioner_type simple
[5]PETSC ERROR: -potential_petscspace_degree 2
[5]PETSC ERROR: -snes_lag_jacobian -2
[5]PETSC ERROR: -snes_max_it 1
[5]PETSC ERROR: -snes_rtol 1.e-8
[5]PETSC ERROR: -snes_type ksponly
[5]PETSC ERROR: -use_gpu_aware_mpi 0
[5]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-maint at mcs.anl.gov----------
MPICH ERROR [Rank 5] [job id 3522949.0] [Sun Oct 30 05:30:12 2022] [nid001116] - Abort(56) (rank 0 in comm 16): application called MPI_Abort(MPI_COMM_SELF, 56) - process 0
Kokkos::Cuda ERROR: Failed to call Kokkos::Cuda::finalize()

[... matching MPICH Abort(56) and Kokkos::Cuda finalize errors from the remaining ranks elided ...]
[46]PETSC ERROR: Petsc Development GIT revision: v3.18.1-122-g50ed725bd85 GIT Date: 2022-10-28 19:54:01 +0000 [52]PETSC ERROR: -dm_plex_box_faces 4,4,4 [52]PETSC ERROR: -dm_plex_box_lower 0,0,0 [52]PETSC ERROR: -dm_plex_box_upper 2,2,2 [52]PETSC ERROR: -dm_plex_dim 3 [52]PETSC ERROR: -dm_plex_simplex 0 [52]PETSC ERROR: -dm_refine 4 [52]PETSC ERROR: -dm_vec_type kokkos [52]PETSC ERROR: -dm_view [52]PETSC ERROR: -ksp_converged_reason [52]PETSC ERROR: -ksp_max_it 200 [52]PETSC ERROR: -ksp_norm_type unpreconditioned [52]PETSC ERROR: -ksp_rtol 1.e-12 [52]PETSC ERROR: -ksp_type cg [52]PETSC ERROR: -log_view [52]PETSC ERROR: -mat_type aijkokkos [52]PETSC ERROR: -mg_levels_ksp_chebyshev_esteig 0,0.05,0,1.05 [52]PETSC ERROR: -mg_levels_ksp_type chebyshev [52]PETSC ERROR: -mg_levels_pc_type jacobi [52]PETSC ERROR: -pc_gamg_aggressive_coarsening 1 [52]PETSC ERROR: -pc_gamg_coarse_eq_limit 100 [52]PETSC ERROR: -pc_gamg_coarse_grid_layout_type compact [52]PETSC ERROR: -pc_gamg_esteig_ksp_max_it 10 [52]PETSC ERROR: -pc_gamg_esteig_ksp_type cg [52]PETSC ERROR: -pc_gamg_process_eq_limit 400 [46]PETSC ERROR: /global/u2/m/madams/petsc/src/snes/tests/data/../ex13 on a arch-perlmutter-opt-gcc-kokkos-cuda named nid001125 by madams Sun Oct 30 05:30:05 2022 [46]PETSC ERROR: Configure options --CFLAGS=" -g -DPETSC_HAVE_KOKKOS_KERNELS_GMRES" --CXXFLAGS=" -g -DPETSC_HAVE_KOKKOS_KERNELS_GMRES" --CUDAFLAGS="-g -Xcompiler -rdynamic" --with-cc=cc --with-cxx=CC --with-fc=ftn --download-hypre=1 --download-amgx --with-fortran-bindings=0 --COPTFLAGS=" -O" --CXXOPTFLAGS=" -O" --FOPTFLAGS=" -O" --with-debugging=0 --with-cuda=1 --with-cuda-arch=80 --with-mpiexec=srun --with-batch=0 --download-p4est=1 --with-zlib=1 --download-kokkos --download-kokkos-kernels --with-kokkos-kernels-tpl=0 --with-make-np=8 PETSC_ARCH=arch-perlmutter-opt-gcc-kokkos-cuda [46]PETSC ERROR: #1 MatProductSymbolic_MPIAIJKokkos_AB() at /global/u2/m/madams/petsc/src/mat/impls/aij/mpi/kokkos/mpiaijkok.kokkos.cxx:918 [61]PETSC ERROR: 
--------------------- Error Message -------------------------------------------------------------- [61]PETSC ERROR: No support for this operation for this object type [61]PETSC ERROR: No method productsymbolic for Mat of type (null) [61]PETSC ERROR: WARNING! There are option(s) set that were not used! Could be the program crashed before they were used or a spelling mistake, etc! [61]PETSC ERROR: Option left: name:-ksp_converged_reason (no value) [61]PETSC ERROR: Option left: name:-mg_levels_ksp_chebyshev_esteig value: 0,0.05,0,1.05 [61]PETSC ERROR: Option left: name:-mg_levels_ksp_type value: chebyshev [61]PETSC ERROR: Option left: name:-mg_levels_pc_type value: jacobi [61]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting. [61]PETSC ERROR: Petsc Development GIT revision: v3.18.1-122-g50ed725bd85 GIT Date: 2022-10-28 19:54:01 +0000 [14]PETSC ERROR: #2 MatProductSymbolic_MPIAIJKokkos() at /global/u2/m/madams/petsc/src/mat/impls/aij/mpi/kokkos/mpiaijkok.kokkos.cxx:1138 [14]PETSC ERROR: #3 MatProductSymbolic() at /global/u2/m/madams/petsc/src/mat/interface/matproduct.c:793 [14]PETSC ERROR: #4 MatProduct_Private() at /global/u2/m/madams/petsc/src/mat/interface/matrix.c:9820 [14]PETSC ERROR: #5 MatMatMult() at /global/u2/m/madams/petsc/src/mat/interface/matrix.c:9897 [14]PETSC ERROR: #6 PCGAMGOptProlongator_AGG() at /global/u2/m/madams/petsc/src/ksp/pc/impls/gamg/agg.c:769 [14]PETSC ERROR: #7 PCSetUp_GAMG() at /global/u2/m/madams/petsc/src/ksp/pc/impls/gamg/gamg.c:639 [14]PETSC ERROR: #8 PCSetUp() at /global/u2/m/madams/petsc/src/ksp/pc/interface/precon.c:994 [14]PETSC ERROR: #9 KSPSetUp() at /global/u2/m/madams/petsc/src/ksp/ksp/interface/itfunc.c:406 [14]PETSC ERROR: #10 KSPSolve_Private() at /global/u2/m/madams/petsc/src/ksp/ksp/interface/itfunc.c:825 [14]PETSC ERROR: #11 KSPSolve() at /global/u2/m/madams/petsc/src/ksp/ksp/interface/itfunc.c:1071 [14]PETSC ERROR: #12 SNESSolve_KSPONLY() at 
/global/u2/m/madams/petsc/src/snes/impls/ksponly/ksponly.c:48 [14]PETSC ERROR: #13 SNESSolve() at /global/u2/m/madams/petsc/src/snes/interface/snes.c:4689 [14]PETSC ERROR: #14 main() at ex13.c:178 [14]PETSC ERROR: PETSc Option Table entries: [14]PETSC ERROR: -benchmark_it 10 [14]PETSC ERROR: -dm_distribute [14]PETSC ERROR: -dm_mat_type aijkokkos [14]PETSC ERROR: -dm_plex_box_faces 4,4,4 [14]PETSC ERROR: -dm_plex_box_lower 0,0,0 [14]PETSC ERROR: -dm_plex_box_upper 2,2,2 [14]PETSC ERROR: -dm_plex_dim 3 [14]PETSC ERROR: -dm_plex_simplex 0 [14]PETSC ERROR: -dm_refine 4 [14]PETSC ERROR: -dm_vec_type kokkos [14]PETSC ERROR: -dm_view [14]PETSC ERROR: -ksp_converged_reason [14]PETSC ERROR: -ksp_max_it 200 [14]PETSC ERROR: -ksp_norm_type unpreconditioned [14]PETSC ERROR: -ksp_rtol 1.e-12 [14]PETSC ERROR: -ksp_type cg [14]PETSC ERROR: -log_view [14]PETSC ERROR: -mat_type aijkokkos [14]PETSC ERROR: -mg_levels_ksp_chebyshev_esteig 0,0.05,0,1.05 [14]PETSC ERROR: -mg_levels_ksp_type chebyshev [14]PETSC ERROR: -mg_levels_pc_type jacobi [14]PETSC ERROR: -pc_gamg_aggressive_coarsening 1 [14]PETSC ERROR: -pc_gamg_coarse_eq_limit 100 [14]PETSC ERROR: -pc_gamg_coarse_grid_layout_type compact [14]PETSC ERROR: -pc_gamg_esteig_ksp_max_it 10 [8]PETSC ERROR: -pc_gamg_repartition false [8]PETSC ERROR: -pc_gamg_reuse_interpolation true [8]PETSC ERROR: -pc_gamg_threshold 0.01 [8]PETSC ERROR: -pc_type gamg [8]PETSC ERROR: -petscpartitioner_simple_node_grid 2,2,2 [8]PETSC ERROR: -petscpartitioner_simple_process_grid 2,2,2 [8]PETSC ERROR: -petscpartitioner_type simple [8]PETSC ERROR: -potential_petscspace_degree 2 [8]PETSC ERROR: -snes_lag_jacobian -2 [8]PETSC ERROR: -snes_max_it 1 [8]PETSC ERROR: -snes_rtol 1.e-8 [8]PETSC ERROR: -snes_type ksponly [8]PETSC ERROR: -use_gpu_aware_mpi 0 [8]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-maint at mcs.anl.gov---------- MPICH ERROR [Rank 8] [job id 3522949.0] [Sun Oct 30 05:30:12 2022] [nid001117] - 
Abort(56) (rank 0 in comm 16): application called MPI_Abort(MPI_COMM_SELF, 56) - process 0 Kokkos::Cuda ERROR: Failed to call Kokkos::Cuda::finalize() [9]PETSC ERROR: #14 main() at ex13.c:178 [9]PETSC ERROR: PETSc Option Table entries: [9]PETSC ERROR: -benchmark_it 10 [9]PETSC ERROR: -dm_distribute [9]PETSC ERROR: -dm_mat_type aijkokkos [9]PETSC ERROR: -dm_plex_box_faces 4,4,4 [9]PETSC ERROR: -dm_plex_box_lower 0,0,0 [9]PETSC ERROR: -dm_plex_box_upper 2,2,2 [9]PETSC ERROR: -dm_plex_dim 3 [9]PETSC ERROR: -dm_plex_simplex 0 [9]PETSC ERROR: -dm_refine 4 [9]PETSC ERROR: -dm_vec_type kokkos [9]PETSC ERROR: -dm_view [9]PETSC ERROR: -ksp_converged_reason [9]PETSC ERROR: -ksp_max_it 200 [9]PETSC ERROR: -ksp_norm_type unpreconditioned [9]PETSC ERROR: -ksp_rtol 1.e-12 [9]PETSC ERROR: -ksp_type cg [9]PETSC ERROR: -log_view [9]PETSC ERROR: -mat_type aijkokkos [9]PETSC ERROR: -mg_levels_ksp_chebyshev_esteig 0,0.05,0,1.05 [9]PETSC ERROR: -mg_levels_ksp_type chebyshev [9]PETSC ERROR: -mg_levels_pc_type jacobi [9]PETSC ERROR: -pc_gamg_aggressive_coarsening 1 [9]PETSC ERROR: -pc_gamg_coarse_eq_limit 100 [9]PETSC ERROR: -pc_gamg_coarse_grid_layout_type compact [17]PETSC ERROR: /global/u2/m/madams/petsc/src/snes/tests/data/../ex13 on a arch-perlmutter-opt-gcc-kokkos-cuda named nid001120 by madams Sun Oct 30 05:30:05 2022 [17]PETSC ERROR: Configure options --CFLAGS=" -g -DPETSC_HAVE_KOKKOS_KERNELS_GMRES" --CXXFLAGS=" -g -DPETSC_HAVE_KOKKOS_KERNELS_GMRES" --CUDAFLAGS="-g -Xcompiler -rdynamic" --with-cc=cc --with-cxx=CC --with-fc=ftn --download-hypre=1 --download-amgx --with-fortran-bindings=0 --COPTFLAGS=" -O" --CXXOPTFLAGS=" -O" --FOPTFLAGS=" -O" --with-debugging=0 --with-cuda=1 --with-cuda-arch=80 --with-mpiexec=srun --with-batch=0 --download-p4est=1 --with-zlib=1 --download-kokkos --download-kokkos-kernels --with-kokkos-kernels-tpl=0 --with-make-np=8 PETSC_ARCH=arch-perlmutter-opt-gcc-kokkos-cuda [17]PETSC ERROR: #1 MatProductSymbolic_MPIAIJKokkos_AB() at 
/global/u2/m/madams/petsc/src/mat/impls/aij/mpi/kokkos/mpiaijkok.kokkos.cxx:918 [17]PETSC ERROR: #2 MatProductSymbolic_MPIAIJKokkos() at /global/u2/m/madams/petsc/src/mat/impls/aij/mpi/kokkos/mpiaijkok.kokkos.cxx:1138 [17]PETSC ERROR: #3 MatProductSymbolic() at /global/u2/m/madams/petsc/src/mat/interface/matproduct.c:793 [39]PETSC ERROR: --------------------- Error Message -------------------------------------------------------------- [39]PETSC ERROR: No support for this operation for this object type [39]PETSC ERROR: No method productsymbolic for Mat of type (null) [39]PETSC ERROR: WARNING! There are option(s) set that were not used! Could be the program crashed before they were used or a spelling mistake, etc! [39]PETSC ERROR: Option left: name:-ksp_converged_reason (no value) [39]PETSC ERROR: Option left: name:-mg_levels_ksp_chebyshev_esteig value: 0,0.05,0,1.05 [39]PETSC ERROR: Option left: name:-mg_levels_ksp_type value: chebyshev [39]PETSC ERROR: Option left: name:-mg_levels_pc_type value: jacobi [39]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting. [39]PETSC ERROR: Petsc Development GIT revision: v3.18.1-122-g50ed725bd85 GIT Date: 2022-10-28 19:54:01 +0000 [19]PETSC ERROR: --------------------- Error Message -------------------------------------------------------------- [19]PETSC ERROR: No support for this operation for this object type [19]PETSC ERROR: No method productsymbolic for Mat of type (null) [19]PETSC ERROR: WARNING! There are option(s) set that were not used! Could be the program crashed before they were used or a spelling mistake, etc! [19]PETSC ERROR: Option left: name:-ksp_converged_reason (no value) [19]PETSC ERROR: Option left: name:-mg_levels_ksp_chebyshev_esteig value: 0,0.05,0,1.05 [19]PETSC ERROR: Option left: name:-mg_levels_ksp_type value: chebyshev [19]PETSC ERROR: Option left: name:-mg_levels_pc_type value: jacobi [19]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting. 
[19]PETSC ERROR: Petsc Development GIT revision: v3.18.1-122-g50ed725bd85 GIT Date: 2022-10-28 19:54:01 +0000 [39]PETSC ERROR: /global/u2/m/madams/petsc/src/snes/tests/data/../ex13 on a arch-perlmutter-opt-gcc-kokkos-cuda named nid001124 by madams Sun Oct 30 05:30:05 2022 [39]PETSC ERROR: Configure options --CFLAGS=" -g -DPETSC_HAVE_KOKKOS_KERNELS_GMRES" --CXXFLAGS=" -g -DPETSC_HAVE_KOKKOS_KERNELS_GMRES" --CUDAFLAGS="-g -Xcompiler -rdynamic" --with-cc=cc --with-cxx=CC --with-fc=ftn --download-hypre=1 --download-amgx --with-fortran-bindings=0 --COPTFLAGS=" -O" --CXXOPTFLAGS=" -O" --FOPTFLAGS=" -O" --with-debugging=0 --with-cuda=1 --with-cuda-arch=80 --with-mpiexec=srun --with-batch=0 --download-p4est=1 --with-zlib=1 --download-kokkos --download-kokkos-kernels --with-kokkos-kernels-tpl=0 --with-make-np=8 PETSC_ARCH=arch-perlmutter-opt-gcc-kokkos-cuda [39]PETSC ERROR: #1 MatProductSymbolic_MPIAIJKokkos_AB() at /global/u2/m/madams/petsc/src/mat/impls/aij/mpi/kokkos/mpiaijkok.kokkos.cxx:918 [19]PETSC ERROR: /global/u2/m/madams/petsc/src/snes/tests/data/../ex13 on a arch-perlmutter-opt-gcc-kokkos-cuda named nid001120 by madams Sun Oct 30 05:30:05 2022 [19]PETSC ERROR: Configure options --CFLAGS=" -g -DPETSC_HAVE_KOKKOS_KERNELS_GMRES" --CXXFLAGS=" -g -DPETSC_HAVE_KOKKOS_KERNELS_GMRES" --CUDAFLAGS="-g -Xcompiler -rdynamic" --with-cc=cc --with-cxx=CC --with-fc=ftn --download-hypre=1 --download-amgx --with-fortran-bindings=0 --COPTFLAGS=" -O" --CXXOPTFLAGS=" -O" --FOPTFLAGS=" -O" --with-debugging=0 --with-cuda=1 --with-cuda-arch=80 --with-mpiexec=srun --with-batch=0 --download-p4est=1 --with-zlib=1 --download-kokkos --download-kokkos-kernels --with-kokkos-kernels-tpl=0 --with-make-np=8 PETSC_ARCH=arch-perlmutter-opt-gcc-kokkos-cuda [19]PETSC ERROR: #1 MatProductSymbolic_MPIAIJKokkos_AB() at /global/u2/m/madams/petsc/src/mat/impls/aij/mpi/kokkos/mpiaijkok.kokkos.cxx:918 [39]PETSC ERROR: #2 MatProductSymbolic_MPIAIJKokkos() at 
/global/u2/m/madams/petsc/src/mat/impls/aij/mpi/kokkos/mpiaijkok.kokkos.cxx:1138 [39]PETSC ERROR: #3 MatProductSymbolic() at /global/u2/m/madams/petsc/src/mat/interface/matproduct.c:793 [39]PETSC ERROR: #4 MatProduct_Private() at /global/u2/m/madams/petsc/src/mat/interface/matrix.c:9820 [39]PETSC ERROR: #5 MatMatMult() at /global/u2/m/madams/petsc/src/mat/interface/matrix.c:9897 [39]PETSC ERROR: #6 PCGAMGOptProlongator_AGG() at /global/u2/m/madams/petsc/src/ksp/pc/impls/gamg/agg.c:769 [39]PETSC ERROR: #7 PCSetUp_GAMG() at /global/u2/m/madams/petsc/src/ksp/pc/impls/gamg/gamg.c:639 [39]PETSC ERROR: #8 PCSetUp() at /global/u2/m/madams/petsc/src/ksp/pc/interface/precon.c:994 [39]PETSC ERROR: #9 KSPSetUp() at /global/u2/m/madams/petsc/src/ksp/ksp/interface/itfunc.c:406 [39]PETSC ERROR: #10 KSPSolve_Private() at /global/u2/m/madams/petsc/src/ksp/ksp/interface/itfunc.c:825 [39]PETSC ERROR: #11 KSPSolve() at /global/u2/m/madams/petsc/src/ksp/ksp/interface/itfunc.c:1071 [39]PETSC ERROR: #12 SNESSolve_KSPONLY() at /global/u2/m/madams/petsc/src/snes/impls/ksponly/ksponly.c:48 [39]PETSC ERROR: #13 SNESSolve() at /global/u2/m/madams/petsc/src/snes/interface/snes.c:4689 [39]PETSC ERROR: #14 main() at ex13.c:178 [39]PETSC ERROR: PETSc Option Table entries: [39]PETSC ERROR: -benchmark_it 10 [39]PETSC ERROR: -dm_distribute [30]PETSC ERROR: #11 KSPSolve() at /global/u2/m/madams/petsc/src/ksp/ksp/interface/itfunc.c:1071 [30]PETSC ERROR: #12 SNESSolve_KSPONLY() at /global/u2/m/madams/petsc/src/snes/impls/ksponly/ksponly.c:48 [30]PETSC ERROR: #13 SNESSolve() at /global/u2/m/madams/petsc/src/snes/interface/snes.c:4689 [30]PETSC ERROR: #14 main() at ex13.c:178 [30]PETSC ERROR: PETSc Option Table entries: [30]PETSC ERROR: -benchmark_it 10 [52]PETSC ERROR: -pc_gamg_repartition false [52]PETSC ERROR: -pc_gamg_reuse_interpolation true [52]PETSC ERROR: -pc_gamg_threshold 0.01 [52]PETSC ERROR: -pc_type gamg [52]PETSC ERROR: -petscpartitioner_simple_node_grid 2,2,2 [52]PETSC ERROR: 
-petscpartitioner_simple_process_grid 2,2,2 [52]PETSC ERROR: -petscpartitioner_type simple [52]PETSC ERROR: -potential_petscspace_degree 2 [52]PETSC ERROR: -snes_lag_jacobian -2 [52]PETSC ERROR: -snes_max_it 1 [52]PETSC ERROR: -snes_rtol 1.e-8 [52]PETSC ERROR: -snes_type ksponly [52]PETSC ERROR: -use_gpu_aware_mpi 0 [52]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-maint at mcs.anl.gov---------- [24]PETSC ERROR: --------------------- Error Message -------------------------------------------------------------- [24]PETSC ERROR: No support for this operation for this object type [24]PETSC ERROR: No method productsymbolic for Mat of type (null) [24]PETSC ERROR: WARNING! There are option(s) set that were not used! Could be the program crashed before they were used or a spelling mistake, etc! [24]PETSC ERROR: Option left: name:-ksp_converged_reason (no value) [24]PETSC ERROR: Option left: name:-mg_levels_ksp_chebyshev_esteig value: 0,0.05,0,1.05 [24]PETSC ERROR: Option left: name:-mg_levels_ksp_type value: chebyshev [24]PETSC ERROR: Option left: name:-mg_levels_pc_type value: jacobi [24]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting. 
[24]PETSC ERROR: Petsc Development GIT revision: v3.18.1-122-g50ed725bd85 GIT Date: 2022-10-28 19:54:01 +0000 [61]PETSC ERROR: /global/u2/m/madams/petsc/src/snes/tests/data/../ex13 on a arch-perlmutter-opt-gcc-kokkos-cuda named nid001129 by madams Sun Oct 30 05:30:05 2022 [61]PETSC ERROR: Configure options --CFLAGS=" -g -DPETSC_HAVE_KOKKOS_KERNELS_GMRES" --CXXFLAGS=" -g -DPETSC_HAVE_KOKKOS_KERNELS_GMRES" --CUDAFLAGS="-g -Xcompiler -rdynamic" --with-cc=cc --with-cxx=CC --with-fc=ftn --download-hypre=1 --download-amgx --with-fortran-bindings=0 --COPTFLAGS=" -O" --CXXOPTFLAGS=" -O" --FOPTFLAGS=" -O" --with-debugging=0 --with-cuda=1 --with-cuda-arch=80 --with-mpiexec=srun --with-batch=0 --download-p4est=1 --with-zlib=1 --download-kokkos --download-kokkos-kernels --with-kokkos-kernels-tpl=0 --with-make-np=8 PETSC_ARCH=arch-perlmutter-opt-gcc-kokkos-cuda [61]PETSC ERROR: #1 MatProductSymbolic_MPIAIJKokkos_AB() at /global/u2/m/madams/petsc/src/mat/impls/aij/mpi/kokkos/mpiaijkok.kokkos.cxx:918 [9]PETSC ERROR: -pc_gamg_esteig_ksp_max_it 10 [9]PETSC ERROR: -pc_gamg_esteig_ksp_type cg [9]PETSC ERROR: -pc_gamg_process_eq_limit 400 [9]PETSC ERROR: -pc_gamg_repartition false [9]PETSC ERROR: -pc_gamg_reuse_interpolation true [9]PETSC ERROR: -pc_gamg_threshold 0.01 [9]PETSC ERROR: -pc_type gamg [9]PETSC ERROR: -petscpartitioner_simple_node_grid 2,2,2 [9]PETSC ERROR: -petscpartitioner_simple_process_grid 2,2,2 [9]PETSC ERROR: -petscpartitioner_type simple [9]PETSC ERROR: -potential_petscspace_degree 2 [9]PETSC ERROR: -snes_lag_jacobian -2 [9]PETSC ERROR: -snes_max_it 1 [9]PETSC ERROR: -snes_rtol 1.e-8 [9]PETSC ERROR: -snes_type ksponly [9]PETSC ERROR: -use_gpu_aware_mpi 0 [9]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-maint at mcs.anl.gov---------- MPICH ERROR [Rank 9] [job id 3522949.0] [Sun Oct 30 05:30:12 2022] [nid001117] - Abort(56) (rank 0 in comm 16): application called MPI_Abort(MPI_COMM_SELF, 56) - process 0 [61]PETSC ERROR: 
#2 MatProductSymbolic_MPIAIJKokkos() at /global/u2/m/madams/petsc/src/mat/impls/aij/mpi/kokkos/mpiaijkok.kokkos.cxx:1138 [61]PETSC ERROR: #3 MatProductSymbolic() at /global/u2/m/madams/petsc/src/mat/interface/matproduct.c:793 [10]PETSC ERROR: #11 KSPSolve() at /global/u2/m/madams/petsc/src/ksp/ksp/interface/itfunc.c:1071 [10]PETSC ERROR: #12 SNESSolve_KSPONLY() at /global/u2/m/madams/petsc/src/snes/impls/ksponly/ksponly.c:48 [10]PETSC ERROR: #13 SNESSolve() at /global/u2/m/madams/petsc/src/snes/interface/snes.c:4689 [10]PETSC ERROR: #14 main() at ex13.c:178 [10]PETSC ERROR: PETSc Option Table entries: [10]PETSC ERROR: -benchmark_it 10 [10]PETSC ERROR: -dm_distribute [10]PETSC ERROR: -dm_mat_type aijkokkos [10]PETSC ERROR: -dm_plex_box_faces 4,4,4 [10]PETSC ERROR: -dm_plex_box_lower 0,0,0 [10]PETSC ERROR: -dm_plex_box_upper 2,2,2 [10]PETSC ERROR: -dm_plex_dim 3 [10]PETSC ERROR: -dm_plex_simplex 0 [10]PETSC ERROR: -dm_refine 4 [10]PETSC ERROR: -dm_vec_type kokkos [10]PETSC ERROR: -dm_view [10]PETSC ERROR: -ksp_converged_reason [10]PETSC ERROR: -ksp_max_it 200 [10]PETSC ERROR: -ksp_norm_type unpreconditioned [10]PETSC ERROR: -ksp_rtol 1.e-12 [10]PETSC ERROR: -ksp_type cg [10]PETSC ERROR: -log_view [62]PETSC ERROR: -dm_distribute [62]PETSC ERROR: -dm_mat_type aijkokkos [62]PETSC ERROR: -dm_plex_box_faces 4,4,4 [62]PETSC ERROR: -dm_plex_box_lower 0,0,0 [62]PETSC ERROR: -dm_plex_box_upper 2,2,2 [62]PETSC ERROR: -dm_plex_dim 3 [62]PETSC ERROR: -dm_plex_simplex 0 [62]PETSC ERROR: -dm_refine 4 [62]PETSC ERROR: -dm_vec_type kokkos [62]PETSC ERROR: -dm_view [62]PETSC ERROR: -ksp_converged_reason [62]PETSC ERROR: -ksp_max_it 200 [62]PETSC ERROR: -ksp_norm_type unpreconditioned [62]PETSC ERROR: -ksp_rtol 1.e-12 [62]PETSC ERROR: -ksp_type cg [62]PETSC ERROR: -log_view [62]PETSC ERROR: -mat_type aijkokkos [62]PETSC ERROR: -mg_levels_ksp_chebyshev_esteig 0,0.05,0,1.05 [62]PETSC ERROR: -mg_levels_ksp_type chebyshev [62]PETSC ERROR: -mg_levels_pc_type jacobi [62]PETSC ERROR: 
-pc_gamg_aggressive_coarsening 1 [62]PETSC ERROR: -pc_gamg_coarse_eq_limit 100 [62]PETSC ERROR: -pc_gamg_coarse_grid_layout_type compact [62]PETSC ERROR: -pc_gamg_esteig_ksp_max_it 10 [62]PETSC ERROR: -pc_gamg_esteig_ksp_type cg [10]PETSC ERROR: -mat_type aijkokkos [10]PETSC ERROR: -mg_levels_ksp_chebyshev_esteig 0,0.05,0,1.05 [10]PETSC ERROR: -mg_levels_ksp_type chebyshev [10]PETSC ERROR: -mg_levels_pc_type jacobi [10]PETSC ERROR: -pc_gamg_aggressive_coarsening 1 [10]PETSC ERROR: -pc_gamg_coarse_eq_limit 100 [10]PETSC ERROR: -pc_gamg_coarse_grid_layout_type compact [10]PETSC ERROR: -pc_gamg_esteig_ksp_max_it 10 [10]PETSC ERROR: -pc_gamg_esteig_ksp_type cg [10]PETSC ERROR: -pc_gamg_process_eq_limit 400 [10]PETSC ERROR: -pc_gamg_repartition false [10]PETSC ERROR: -pc_gamg_reuse_interpolation true [10]PETSC ERROR: -pc_gamg_threshold 0.01 [10]PETSC ERROR: -pc_type gamg [10]PETSC ERROR: -petscpartitioner_simple_node_grid 2,2,2 [10]PETSC ERROR: -petscpartitioner_simple_process_grid 2,2,2 [10]PETSC ERROR: -petscpartitioner_type simple [10]PETSC ERROR: -potential_petscspace_degree 2 [10]PETSC ERROR: -snes_lag_jacobian -2 [10]PETSC ERROR: -snes_max_it 1 [10]PETSC ERROR: -snes_rtol 1.e-8 [10]PETSC ERROR: -snes_type ksponly [62]PETSC ERROR: -pc_gamg_process_eq_limit 400 [62]PETSC ERROR: -pc_gamg_repartition false [62]PETSC ERROR: -pc_gamg_reuse_interpolation true [62]PETSC ERROR: -pc_gamg_threshold 0.01 [62]PETSC ERROR: -pc_type gamg [62]PETSC ERROR: -petscpartitioner_simple_node_grid 2,2,2 [62]PETSC ERROR: -petscpartitioner_simple_process_grid 2,2,2 [62]PETSC ERROR: -petscpartitioner_type simple [62]PETSC ERROR: -potential_petscspace_degree 2 [62]PETSC ERROR: -snes_lag_jacobian -2 [62]PETSC ERROR: -snes_max_it 1 [62]PETSC ERROR: -snes_rtol 1.e-8 [62]PETSC ERROR: -snes_type ksponly [62]PETSC ERROR: -use_gpu_aware_mpi 0 [62]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-maint at mcs.anl.gov---------- MPICH ERROR [Rank 62] [job id 
3522949.0] [Sun Oct 30 05:30:12 2022] [nid001129] - Abort(56) (rank 0 in comm 16): application called MPI_Abort(MPI_COMM_SELF, 56) - process 0 [10]PETSC ERROR: -use_gpu_aware_mpi 0 [10]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-maint at mcs.anl.gov---------- MPICH ERROR [Rank 10] [job id 3522949.0] [Sun Oct 30 05:30:12 2022] [nid001117] - Abort(56) (rank 0 in comm 16): application called MPI_Abort(MPI_COMM_SELF, 56) - process 0 [11]PETSC ERROR: --------------------- Error Message -------------------------------------------------------------- [11]PETSC ERROR: No support for this operation for this object type [11]PETSC ERROR: No method productsymbolic for Mat of type (null) [11]PETSC ERROR: WARNING! There are option(s) set that were not used! Could be the program crashed before they were used or a spelling mistake, etc! [11]PETSC ERROR: Option left: name:-ksp_converged_reason (no value) [11]PETSC ERROR: Option left: name:-mg_levels_ksp_chebyshev_esteig value: 0,0.05,0,1.05 [11]PETSC ERROR: Option left: name:-mg_levels_ksp_type value: chebyshev [11]PETSC ERROR: Option left: name:-mg_levels_pc_type value: jacobi [11]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting. 
[11]PETSC ERROR: Petsc Development GIT revision: v3.18.1-122-g50ed725bd85 GIT Date: 2022-10-28 19:54:01 +0000 [19]PETSC ERROR: #2 MatProductSymbolic_MPIAIJKokkos() at /global/u2/m/madams/petsc/src/mat/impls/aij/mpi/kokkos/mpiaijkok.kokkos.cxx:1138 [19]PETSC ERROR: #3 MatProductSymbolic() at /global/u2/m/madams/petsc/src/mat/interface/matproduct.c:793 [19]PETSC ERROR: #4 MatProduct_Private() at /global/u2/m/madams/petsc/src/mat/interface/matrix.c:9820 [19]PETSC ERROR: #5 MatMatMult() at /global/u2/m/madams/petsc/src/mat/interface/matrix.c:9897 [19]PETSC ERROR: #6 PCGAMGOptProlongator_AGG() at /global/u2/m/madams/petsc/src/ksp/pc/impls/gamg/agg.c:769 [19]PETSC ERROR: #7 PCSetUp_GAMG() at /global/u2/m/madams/petsc/src/ksp/pc/impls/gamg/gamg.c:639 [19]PETSC ERROR: #8 PCSetUp() at /global/u2/m/madams/petsc/src/ksp/pc/interface/precon.c:994 [19]PETSC ERROR: #9 KSPSetUp() at /global/u2/m/madams/petsc/src/ksp/ksp/interface/itfunc.c:406 [19]PETSC ERROR: #10 KSPSolve_Private() at /global/u2/m/madams/petsc/src/ksp/ksp/interface/itfunc.c:825 [11]PETSC ERROR: /global/u2/m/madams/petsc/src/snes/tests/data/../ex13 on a arch-perlmutter-opt-gcc-kokkos-cuda named nid001117 by madams Sun Oct 30 05:30:05 2022 [11]PETSC ERROR: Configure options --CFLAGS=" -g -DPETSC_HAVE_KOKKOS_KERNELS_GMRES" --CXXFLAGS=" -g -DPETSC_HAVE_KOKKOS_KERNELS_GMRES" --CUDAFLAGS="-g -Xcompiler -rdynamic" --with-cc=cc --with-cxx=CC --with-fc=ftn --download-hypre=1 --download-amgx --with-fortran-bindings=0 --COPTFLAGS=" -O" --CXXOPTFLAGS=" -O" --FOPTFLAGS=" -O" --with-debugging=0 --with-cuda=1 --with-cuda-arch=80 --with-mpiexec=srun --with-batch=0 --download-p4est=1 --with-zlib=1 --download-kokkos --download-kokkos-kernels --with-kokkos-kernels-tpl=0 --with-make-np=8 PETSC_ARCH=arch-perlmutter-opt-gcc-kokkos-cuda [11]PETSC ERROR: #1 MatProductSymbolic_MPIAIJKokkos_AB() at /global/u2/m/madams/petsc/src/mat/impls/aij/mpi/kokkos/mpiaijkok.kokkos.cxx:918 [19]PETSC ERROR: #11 KSPSolve() at 
/global/u2/m/madams/petsc/src/ksp/ksp/interface/itfunc.c:1071 [19]PETSC ERROR: #12 SNESSolve_KSPONLY() at /global/u2/m/madams/petsc/src/snes/impls/ksponly/ksponly.c:48 [19]PETSC ERROR: #13 SNESSolve() at /global/u2/m/madams/petsc/src/snes/interface/snes.c:4689 [19]PETSC ERROR: #14 main() at ex13.c:178 [11]PETSC ERROR: #2 MatProductSymbolic_MPIAIJKokkos() at /global/u2/m/madams/petsc/src/mat/impls/aij/mpi/kokkos/mpiaijkok.kokkos.cxx:1138 [11]PETSC ERROR: #3 MatProductSymbolic() at /global/u2/m/madams/petsc/src/mat/interface/matproduct.c:793 [11]PETSC ERROR: #4 MatProduct_Private() at /global/u2/m/madams/petsc/src/mat/interface/matrix.c:9820 [11]PETSC ERROR: #5 MatMatMult() at /global/u2/m/madams/petsc/src/mat/interface/matrix.c:9897 [11]PETSC ERROR: #6 PCGAMGOptProlongator_AGG() at /global/u2/m/madams/petsc/src/ksp/pc/impls/gamg/agg.c:769 [11]PETSC ERROR: #7 PCSetUp_GAMG() at /global/u2/m/madams/petsc/src/ksp/pc/impls/gamg/gamg.c:639 [11]PETSC ERROR: #8 PCSetUp() at /global/u2/m/madams/petsc/src/ksp/pc/interface/precon.c:994 [11]PETSC ERROR: #9 KSPSetUp() at /global/u2/m/madams/petsc/src/ksp/ksp/interface/itfunc.c:406 [11]PETSC ERROR: #10 KSPSolve_Private() at /global/u2/m/madams/petsc/src/ksp/ksp/interface/itfunc.c:825 [20]PETSC ERROR: --------------------- Error Message -------------------------------------------------------------- [20]PETSC ERROR: No support for this operation for this object type [20]PETSC ERROR: No method productsymbolic for Mat of type (null) [20]PETSC ERROR: WARNING! There are option(s) set that were not used! Could be the program crashed before they were used or a spelling mistake, etc! 
[Identical error output was emitted interleaved by ranks 11-63 (on nodes nid001117, nid001120, nid001121, nid001125, nid001128, nid001129); a single representative copy from rank 53 follows.]

[53]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[53]PETSC ERROR: No support for this operation for this object type
[53]PETSC ERROR: No method productsymbolic for Mat of type (null)
[53]PETSC ERROR: WARNING! There are option(s) set that were not used! Could be the program crashed before they were used or a spelling mistake, etc!
[53]PETSC ERROR: Option left: name:-ksp_converged_reason (no value)
[53]PETSC ERROR: Option left: name:-mg_levels_ksp_chebyshev_esteig value: 0,0.05,0,1.05
[53]PETSC ERROR: Option left: name:-mg_levels_ksp_type value: chebyshev
[53]PETSC ERROR: Option left: name:-mg_levels_pc_type value: jacobi
[53]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
[53]PETSC ERROR: Petsc Development GIT revision: v3.18.1-122-g50ed725bd85  GIT Date: 2022-10-28 19:54:01 +0000
[53]PETSC ERROR: /global/u2/m/madams/petsc/src/snes/tests/data/../ex13 on a arch-perlmutter-opt-gcc-kokkos-cuda named nid001128 by madams Sun Oct 30 05:30:05 2022
[53]PETSC ERROR: Configure options --CFLAGS=" -g -DPETSC_HAVE_KOKKOS_KERNELS_GMRES" --CXXFLAGS=" -g -DPETSC_HAVE_KOKKOS_KERNELS_GMRES" --CUDAFLAGS="-g -Xcompiler -rdynamic" --with-cc=cc --with-cxx=CC --with-fc=ftn --download-hypre=1 --download-amgx --with-fortran-bindings=0 --COPTFLAGS=" -O" --CXXOPTFLAGS=" -O" --FOPTFLAGS=" -O" --with-debugging=0 --with-cuda=1 --with-cuda-arch=80 --with-mpiexec=srun --with-batch=0 --download-p4est=1 --with-zlib=1 --download-kokkos --download-kokkos-kernels --with-kokkos-kernels-tpl=0 --with-make-np=8 PETSC_ARCH=arch-perlmutter-opt-gcc-kokkos-cuda
[53]PETSC ERROR: #1 MatProductSymbolic_MPIAIJKokkos_AB() at /global/u2/m/madams/petsc/src/mat/impls/aij/mpi/kokkos/mpiaijkok.kokkos.cxx:918
[53]PETSC ERROR: #2 MatProductSymbolic_MPIAIJKokkos() at /global/u2/m/madams/petsc/src/mat/impls/aij/mpi/kokkos/mpiaijkok.kokkos.cxx:1138
[53]PETSC ERROR: #3 MatProductSymbolic() at /global/u2/m/madams/petsc/src/mat/interface/matproduct.c:793
[53]PETSC ERROR: #4 MatProduct_Private() at /global/u2/m/madams/petsc/src/mat/interface/matrix.c:9820
[53]PETSC ERROR: #5 MatMatMult() at /global/u2/m/madams/petsc/src/mat/interface/matrix.c:9897
[53]PETSC ERROR: #6 PCGAMGOptProlongator_AGG() at /global/u2/m/madams/petsc/src/ksp/pc/impls/gamg/agg.c:769
[53]PETSC ERROR: #7 PCSetUp_GAMG() at /global/u2/m/madams/petsc/src/ksp/pc/impls/gamg/gamg.c:639
[53]PETSC ERROR: #8 PCSetUp() at /global/u2/m/madams/petsc/src/ksp/pc/interface/precon.c:994
[53]PETSC ERROR: #9 KSPSetUp() at /global/u2/m/madams/petsc/src/ksp/ksp/interface/itfunc.c:406
[53]PETSC ERROR: #10 KSPSolve_Private() at /global/u2/m/madams/petsc/src/ksp/ksp/interface/itfunc.c:825
[53]PETSC ERROR: #11 KSPSolve() at /global/u2/m/madams/petsc/src/ksp/ksp/interface/itfunc.c:1071
[53]PETSC ERROR: #12 SNESSolve_KSPONLY() at /global/u2/m/madams/petsc/src/snes/impls/ksponly/ksponly.c:48
[53]PETSC ERROR: #13 SNESSolve() at /global/u2/m/madams/petsc/src/snes/interface/snes.c:4689
[53]PETSC ERROR: #14 main() at ex13.c:178
[53]PETSC ERROR: PETSc Option Table entries:
[53]PETSC ERROR: -benchmark_it 10
[53]PETSC ERROR: -dm_distribute
[53]PETSC ERROR: -dm_mat_type aijkokkos
[53]PETSC ERROR: -dm_plex_box_faces 4,4,4
[53]PETSC ERROR: -dm_plex_box_lower 0,0,0
[53]PETSC ERROR: -dm_plex_box_upper 2,2,2
[53]PETSC ERROR: -dm_plex_dim 3
[53]PETSC ERROR: -dm_plex_simplex 0
[53]PETSC ERROR: -dm_refine 4
[53]PETSC ERROR: -dm_vec_type kokkos
[53]PETSC ERROR: -dm_view
[53]PETSC ERROR: -ksp_converged_reason
[53]PETSC ERROR: -ksp_max_it 200
[53]PETSC ERROR: -ksp_norm_type unpreconditioned
[53]PETSC ERROR: -ksp_rtol 1.e-12
[53]PETSC ERROR: -ksp_type cg
[53]PETSC ERROR: -log_view
[53]PETSC ERROR: -mat_type aijkokkos
[53]PETSC ERROR: -mg_levels_ksp_chebyshev_esteig 0,0.05,0,1.05
[53]PETSC ERROR: -mg_levels_ksp_type chebyshev
[53]PETSC ERROR: -mg_levels_pc_type jacobi
[53]PETSC ERROR: -pc_gamg_aggressive_coarsening 1
[53]PETSC ERROR: -pc_gamg_coarse_eq_limit 100
[53]PETSC ERROR: -pc_gamg_coarse_grid_layout_type compact
[53]PETSC ERROR: -pc_gamg_esteig_ksp_max_it 10
[53]PETSC ERROR: -pc_gamg_esteig_ksp_type cg
[53]PETSC ERROR: -pc_gamg_process_eq_limit 400
[53]PETSC ERROR: -pc_gamg_repartition false
[53]PETSC ERROR: -pc_gamg_reuse_interpolation true
[53]PETSC ERROR: -pc_gamg_threshold 0.01
[53]PETSC ERROR: -pc_type gamg
[53]PETSC ERROR: -petscpartitioner_simple_node_grid 2,2,2
[53]PETSC ERROR: -petscpartitioner_simple_process_grid 2,2,2
[53]PETSC ERROR: -petscpartitioner_type simple
[53]PETSC ERROR: -potential_petscspace_degree 2
[53]PETSC ERROR: -snes_lag_jacobian -2
[53]PETSC ERROR: -snes_max_it 1
[53]PETSC ERROR: -snes_rtol 1.e-8
[53]PETSC ERROR: -snes_type ksponly
[53]PETSC ERROR: -use_gpu_aware_mpi 0
[53]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-maint at mcs.anl.gov----------
MPICH ERROR [Rank 54] [job id 3522949.0] [Sun Oct 30 05:30:12 2022] [nid001128] - Abort(56) (rank 0 in comm 16): application called MPI_Abort(MPI_COMM_SELF, 56) - process 0
Kokkos::Cuda ERROR: Failed to call Kokkos::Cuda::finalize()
-ksp_rtol 1.e-12 [55]PETSC ERROR: -ksp_type cg [55]PETSC ERROR: -log_view [41]PETSC ERROR: #14 main() at ex13.c:178 [41]PETSC ERROR: PETSc Option Table entries: [41]PETSC ERROR: -benchmark_it 10 [41]PETSC ERROR: -dm_distribute [41]PETSC ERROR: -dm_mat_type aijkokkos [41]PETSC ERROR: -dm_plex_box_faces 4,4,4 [41]PETSC ERROR: -dm_plex_box_lower 0,0,0 [41]PETSC ERROR: -dm_plex_box_upper 2,2,2 [41]PETSC ERROR: -dm_plex_dim 3 [41]PETSC ERROR: -dm_plex_simplex 0 [41]PETSC ERROR: -dm_refine 4 [41]PETSC ERROR: -dm_vec_type kokkos [41]PETSC ERROR: -dm_view [41]PETSC ERROR: -ksp_converged_reason [41]PETSC ERROR: -ksp_max_it 200 [41]PETSC ERROR: -ksp_norm_type unpreconditioned [41]PETSC ERROR: -ksp_rtol 1.e-12 [41]PETSC ERROR: -ksp_type cg [41]PETSC ERROR: -log_view [41]PETSC ERROR: -mat_type aijkokkos [41]PETSC ERROR: -mg_levels_ksp_chebyshev_esteig 0,0.05,0,1.05 [41]PETSC ERROR: -mg_levels_ksp_type chebyshev [41]PETSC ERROR: -mg_levels_pc_type jacobi [41]PETSC ERROR: -pc_gamg_aggressive_coarsening 1 [41]PETSC ERROR: -pc_gamg_coarse_eq_limit 100 [24]PETSC ERROR: #2 MatProductSymbolic_MPIAIJKokkos() at /global/u2/m/madams/petsc/src/mat/impls/aij/mpi/kokkos/mpiaijkok.kokkos.cxx:1138 [24]PETSC ERROR: #3 MatProductSymbolic() at /global/u2/m/madams/petsc/src/mat/interface/matproduct.c:793 [24]PETSC ERROR: #4 MatProduct_Private() at /global/u2/m/madams/petsc/src/mat/interface/matrix.c:9820 [24]PETSC ERROR: #5 MatMatMult() at /global/u2/m/madams/petsc/src/mat/interface/matrix.c:9897 [55]PETSC ERROR: -mat_type aijkokkos [55]PETSC ERROR: -mg_levels_ksp_chebyshev_esteig 0,0.05,0,1.05 [55]PETSC ERROR: -mg_levels_ksp_type chebyshev [55]PETSC ERROR: -mg_levels_pc_type jacobi [55]PETSC ERROR: -pc_gamg_aggressive_coarsening 1 [55]PETSC ERROR: -pc_gamg_coarse_eq_limit 100 [55]PETSC ERROR: -pc_gamg_coarse_grid_layout_type compact [55]PETSC ERROR: -pc_gamg_esteig_ksp_max_it 10 [41]PETSC ERROR: -pc_gamg_coarse_grid_layout_type compact [41]PETSC ERROR: -pc_gamg_esteig_ksp_max_it 10 [41]PETSC 
ERROR: -pc_gamg_esteig_ksp_type cg [41]PETSC ERROR: -pc_gamg_process_eq_limit 400 [41]PETSC ERROR: -pc_gamg_repartition false [41]PETSC ERROR: -pc_gamg_reuse_interpolation true [41]PETSC ERROR: -pc_gamg_threshold 0.01 [41]PETSC ERROR: -pc_type gamg [41]PETSC ERROR: -petscpartitioner_simple_node_grid 2,2,2 [41]PETSC ERROR: -petscpartitioner_simple_process_grid 2,2,2 [25]PETSC ERROR: Configure options --CFLAGS=" -g -DPETSC_HAVE_KOKKOS_KERNELS_GMRES" --CXXFLAGS=" -g -DPETSC_HAVE_KOKKOS_KERNELS_GMRES" --CUDAFLAGS="-g -Xcompiler -rdynamic" --with-cc=cc --with-cxx=CC --with-fc=ftn --download-hypre=1 --download-amgx --with-fortran-bindings=0 --COPTFLAGS=" -O" --CXXOPTFLAGS=" -O" --FOPTFLAGS=" -O" --with-debugging=0 --with-cuda=1 --with-cuda-arch=80 --with-mpiexec=srun --with-batch=0 --download-p4est=1 --with-zlib=1 --download-kokkos --download-kokkos-kernels --with-kokkos-kernels-tpl=0 --with-make-np=8 PETSC_ARCH=arch-perlmutter-opt-gcc-kokkos-cuda [25]PETSC ERROR: #1 MatProductSymbolic_MPIAIJKokkos_AB() at /global/u2/m/madams/petsc/src/mat/impls/aij/mpi/kokkos/mpiaijkok.kokkos.cxx:918 [25]PETSC ERROR: #2 MatProductSymbolic_MPIAIJKokkos() at /global/u2/m/madams/petsc/src/mat/impls/aij/mpi/kokkos/mpiaijkok.kokkos.cxx:1138 [25]PETSC ERROR: #3 MatProductSymbolic() at /global/u2/m/madams/petsc/src/mat/interface/matproduct.c:793 [42]PETSC ERROR: --------------------- Error Message -------------------------------------------------------------- [42]PETSC ERROR: No support for this operation for this object type [42]PETSC ERROR: No method productsymbolic for Mat of type (null) [42]PETSC ERROR: WARNING! There are option(s) set that were not used! Could be the program crashed before they were used or a spelling mistake, etc! 
[42]PETSC ERROR: Option left: name:-ksp_converged_reason (no value) [42]PETSC ERROR: Option left: name:-mg_levels_ksp_chebyshev_esteig value: 0,0.05,0,1.05 [42]PETSC ERROR: Option left: name:-mg_levels_ksp_type value: chebyshev [42]PETSC ERROR: Option left: name:-mg_levels_pc_type value: jacobi [42]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting. [42]PETSC ERROR: Petsc Development GIT revision: v3.18.1-122-g50ed725bd85 GIT Date: 2022-10-28 19:54:01 +0000 [33]PETSC ERROR: Option left: name:-mg_levels_pc_type value: jacobi [33]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting. [33]PETSC ERROR: Petsc Development GIT revision: v3.18.1-122-g50ed725bd85 GIT Date: 2022-10-28 19:54:01 +0000 [33]PETSC ERROR: /global/u2/m/madams/petsc/src/snes/tests/data/../ex13 on a arch-perlmutter-opt-gcc-kokkos-cuda named nid001124 by madams Sun Oct 30 05:30:05 2022 [33]PETSC ERROR: Configure options --CFLAGS=" -g -DPETSC_HAVE_KOKKOS_KERNELS_GMRES" --CXXFLAGS=" -g -DPETSC_HAVE_KOKKOS_KERNELS_GMRES" --CUDAFLAGS="-g -Xcompiler -rdynamic" --with-cc=cc --with-cxx=CC --with-fc=ftn --download-hypre=1 --download-amgx --with-fortran-bindings=0 --COPTFLAGS=" -O" --CXXOPTFLAGS=" -O" --FOPTFLAGS=" -O" --with-debugging=0 --with-cuda=1 --with-cuda-arch=80 --with-mpiexec=srun --with-batch=0 --download-p4est=1 --with-zlib=1 --download-kokkos --download-kokkos-kernels --with-kokkos-kernels-tpl=0 --with-make-np=8 PETSC_ARCH=arch-perlmutter-opt-gcc-kokkos-cuda [20]PETSC ERROR: #2 MatProductSymbolic_MPIAIJKokkos() at /global/u2/m/madams/petsc/src/mat/impls/aij/mpi/kokkos/mpiaijkok.kokkos.cxx:1138 [20]PETSC ERROR: #3 MatProductSymbolic() at /global/u2/m/madams/petsc/src/mat/interface/matproduct.c:793 [20]PETSC ERROR: #4 MatProduct_Private() at /global/u2/m/madams/petsc/src/mat/interface/matrix.c:9820 [20]PETSC ERROR: #5 MatMatMult() at /global/u2/m/madams/petsc/src/mat/interface/matrix.c:9897 [20]PETSC ERROR: #6 PCGAMGOptProlongator_AGG() at 
/global/u2/m/madams/petsc/src/ksp/pc/impls/gamg/agg.c:769 [20]PETSC ERROR: #7 PCSetUp_GAMG() at /global/u2/m/madams/petsc/src/ksp/pc/impls/gamg/gamg.c:639 [20]PETSC ERROR: #8 PCSetUp() at /global/u2/m/madams/petsc/src/ksp/pc/interface/precon.c:994 [20]PETSC ERROR: #9 KSPSetUp() at /global/u2/m/madams/petsc/src/ksp/ksp/interface/itfunc.c:406 [20]PETSC ERROR: #10 KSPSolve_Private() at /global/u2/m/madams/petsc/src/ksp/ksp/interface/itfunc.c:825 [20]PETSC ERROR: #11 KSPSolve() at /global/u2/m/madams/petsc/src/ksp/ksp/interface/itfunc.c:1071 [20]PETSC ERROR: #12 SNESSolve_KSPONLY() at /global/u2/m/madams/petsc/src/snes/impls/ksponly/ksponly.c:48 [20]PETSC ERROR: #13 SNESSolve() at /global/u2/m/madams/petsc/src/snes/interface/snes.c:4689 [20]PETSC ERROR: #14 main() at ex13.c:178 [20]PETSC ERROR: PETSc Option Table entries: [20]PETSC ERROR: -benchmark_it 10 [20]PETSC ERROR: -dm_distribute [20]PETSC ERROR: -dm_mat_type aijkokkos [20]PETSC ERROR: -dm_plex_box_faces 4,4,4 [20]PETSC ERROR: -dm_plex_box_lower 0,0,0 [20]PETSC ERROR: -dm_plex_box_upper 2,2,2 [20]PETSC ERROR: -dm_plex_dim 3 [20]PETSC ERROR: -dm_plex_simplex 0 [20]PETSC ERROR: -dm_refine 4 [20]PETSC ERROR: -dm_vec_type kokkos [20]PETSC ERROR: -dm_view [20]PETSC ERROR: -ksp_converged_reason [33]PETSC ERROR: #1 MatProductSymbolic_MPIAIJKokkos_AB() at /global/u2/m/madams/petsc/src/mat/impls/aij/mpi/kokkos/mpiaijkok.kokkos.cxx:918 [33]PETSC ERROR: #2 MatProductSymbolic_MPIAIJKokkos() at /global/u2/m/madams/petsc/src/mat/impls/aij/mpi/kokkos/mpiaijkok.kokkos.cxx:1138 [33]PETSC ERROR: #3 MatProductSymbolic() at /global/u2/m/madams/petsc/src/mat/interface/matproduct.c:793 [33]PETSC ERROR: #4 MatProduct_Private() at /global/u2/m/madams/petsc/src/mat/interface/matrix.c:9820 [33]PETSC ERROR: #5 MatMatMult() at /global/u2/m/madams/petsc/src/mat/interface/matrix.c:9897 [33]PETSC ERROR: #6 PCGAMGOptProlongator_AGG() at /global/u2/m/madams/petsc/src/ksp/pc/impls/gamg/agg.c:769 [33]PETSC ERROR: #7 PCSetUp_GAMG() at 
/global/u2/m/madams/petsc/src/ksp/pc/impls/gamg/gamg.c:639 [33]PETSC ERROR: #8 PCSetUp() at /global/u2/m/madams/petsc/src/ksp/pc/interface/precon.c:994 [33]PETSC ERROR: #9 KSPSetUp() at /global/u2/m/madams/petsc/src/ksp/ksp/interface/itfunc.c:406 [33]PETSC ERROR: #10 KSPSolve_Private() at /global/u2/m/madams/petsc/src/ksp/ksp/interface/itfunc.c:825 [33]PETSC ERROR: #11 KSPSolve() at /global/u2/m/madams/petsc/src/ksp/ksp/interface/itfunc.c:1071 [33]PETSC ERROR: #12 SNESSolve_KSPONLY() at /global/u2/m/madams/petsc/src/snes/impls/ksponly/ksponly.c:48 [33]PETSC ERROR: #13 SNESSolve() at /global/u2/m/madams/petsc/src/snes/interface/snes.c:4689 [33]PETSC ERROR: #14 main() at ex13.c:178 [33]PETSC ERROR: PETSc Option Table entries: [33]PETSC ERROR: -benchmark_it 10 [33]PETSC ERROR: -dm_distribute [33]PETSC ERROR: -dm_mat_type aijkokkos [33]PETSC ERROR: -dm_plex_box_faces 4,4,4 [33]PETSC ERROR: -dm_plex_box_lower 0,0,0 [33]PETSC ERROR: -dm_plex_box_upper 2,2,2 [33]PETSC ERROR: -dm_plex_dim 3 [33]PETSC ERROR: -dm_plex_simplex 0 [33]PETSC ERROR: -dm_refine 4 [33]PETSC ERROR: -dm_vec_type kokkos [33]PETSC ERROR: -dm_view [33]PETSC ERROR: -ksp_converged_reason [33]PETSC ERROR: -ksp_max_it 200 [35]PETSC ERROR: Configure options --CFLAGS=" -g -DPETSC_HAVE_KOKKOS_KERNELS_GMRES" --CXXFLAGS=" -g -DPETSC_HAVE_KOKKOS_KERNELS_GMRES" --CUDAFLAGS="-g -Xcompiler -rdynamic" --with-cc=cc --with-cxx=CC --with-fc=ftn --download-hypre=1 --download-amgx --with-fortran-bindings=0 --COPTFLAGS=" -O" --CXXOPTFLAGS=" -O" --FOPTFLAGS=" -O" --with-debugging=0 --with-cuda=1 --with-cuda-arch=80 --with-mpiexec=srun --with-batch=0 --download-p4est=1 --with-zlib=1 --download-kokkos --download-kokkos-kernels --with-kokkos-kernels-tpl=0 --with-make-np=8 PETSC_ARCH=arch-perlmutter-opt-gcc-kokkos-cuda [35]PETSC ERROR: #1 MatProductSymbolic_MPIAIJKokkos_AB() at /global/u2/m/madams/petsc/src/mat/impls/aij/mpi/kokkos/mpiaijkok.kokkos.cxx:918 [35]PETSC ERROR: #2 MatProductSymbolic_MPIAIJKokkos() at 
/global/u2/m/madams/petsc/src/mat/impls/aij/mpi/kokkos/mpiaijkok.kokkos.cxx:1138 [35]PETSC ERROR: #3 MatProductSymbolic() at /global/u2/m/madams/petsc/src/mat/interface/matproduct.c:793 [25]PETSC ERROR: #4 MatProduct_Private() at /global/u2/m/madams/petsc/src/mat/interface/matrix.c:9820 [25]PETSC ERROR: #5 MatMatMult() at /global/u2/m/madams/petsc/src/mat/interface/matrix.c:9897 [25]PETSC ERROR: #6 PCGAMGOptProlongator_AGG() at /global/u2/m/madams/petsc/src/ksp/pc/impls/gamg/agg.c:769 [25]PETSC ERROR: #7 PCSetUp_GAMG() at /global/u2/m/madams/petsc/src/ksp/pc/impls/gamg/gamg.c:639 [25]PETSC ERROR: #8 PCSetUp() at /global/u2/m/madams/petsc/src/ksp/pc/interface/precon.c:994 [25]PETSC ERROR: #9 KSPSetUp() at /global/u2/m/madams/petsc/src/ksp/ksp/interface/itfunc.c:406 [25]PETSC ERROR: #10 KSPSolve_Private() at /global/u2/m/madams/petsc/src/ksp/ksp/interface/itfunc.c:825 [25]PETSC ERROR: #11 KSPSolve() at /global/u2/m/madams/petsc/src/ksp/ksp/interface/itfunc.c:1071 [25]PETSC ERROR: #12 SNESSolve_KSPONLY() at /global/u2/m/madams/petsc/src/snes/impls/ksponly/ksponly.c:48 [25]PETSC ERROR: #13 SNESSolve() at /global/u2/m/madams/petsc/src/snes/interface/snes.c:4689 [49]PETSC ERROR: #2 MatProductSymbolic_MPIAIJKokkos() at /global/u2/m/madams/petsc/src/mat/impls/aij/mpi/kokkos/mpiaijkok.kokkos.cxx:1138 [49]PETSC ERROR: #3 MatProductSymbolic() at /global/u2/m/madams/petsc/src/mat/interface/matproduct.c:793 [49]PETSC ERROR: #4 MatProduct_Private() at /global/u2/m/madams/petsc/src/mat/interface/matrix.c:9820 [49]PETSC ERROR: #5 MatMatMult() at /global/u2/m/madams/petsc/src/mat/interface/matrix.c:9897 [49]PETSC ERROR: #6 PCGAMGOptProlongator_AGG() at /global/u2/m/madams/petsc/src/ksp/pc/impls/gamg/agg.c:769 [49]PETSC ERROR: #7 PCSetUp_GAMG() at /global/u2/m/madams/petsc/src/ksp/pc/impls/gamg/gamg.c:639 [49]PETSC ERROR: #8 PCSetUp() at /global/u2/m/madams/petsc/src/ksp/pc/interface/precon.c:994 [49]PETSC ERROR: #9 KSPSetUp() at 
/global/u2/m/madams/petsc/src/ksp/ksp/interface/itfunc.c:406 [49]PETSC ERROR: #10 KSPSolve_Private() at /global/u2/m/madams/petsc/src/ksp/ksp/interface/itfunc.c:825 [25]PETSC ERROR: #14 main() at ex13.c:178 [25]PETSC ERROR: PETSc Option Table entries: [25]PETSC ERROR: -benchmark_it 10 [25]PETSC ERROR: -dm_distribute [25]PETSC ERROR: -dm_mat_type aijkokkos [25]PETSC ERROR: -dm_plex_box_faces 4,4,4 [25]PETSC ERROR: -dm_plex_box_lower 0,0,0 [25]PETSC ERROR: -dm_plex_box_upper 2,2,2 [25]PETSC ERROR: -dm_plex_dim 3 [25]PETSC ERROR: -dm_plex_simplex 0 [25]PETSC ERROR: -dm_refine 4 [25]PETSC ERROR: -dm_vec_type kokkos [25]PETSC ERROR: -dm_view [25]PETSC ERROR: -ksp_converged_reason [25]PETSC ERROR: -ksp_max_it 200 [25]PETSC ERROR: -ksp_norm_type unpreconditioned [25]PETSC ERROR: -ksp_rtol 1.e-12 [25]PETSC ERROR: -ksp_type cg [25]PETSC ERROR: -log_view [25]PETSC ERROR: -mat_type aijkokkos [25]PETSC ERROR: -mg_levels_ksp_chebyshev_esteig 0,0.05,0,1.05 [25]PETSC ERROR: -mg_levels_ksp_type chebyshev [25]PETSC ERROR: -mg_levels_pc_type jacobi [25]PETSC ERROR: -pc_gamg_aggressive_coarsening 1 [25]PETSC ERROR: -pc_gamg_coarse_eq_limit 100 [49]PETSC ERROR: #11 KSPSolve() at /global/u2/m/madams/petsc/src/ksp/ksp/interface/itfunc.c:1071 [49]PETSC ERROR: #12 SNESSolve_KSPONLY() at /global/u2/m/madams/petsc/src/snes/impls/ksponly/ksponly.c:48 [49]PETSC ERROR: #13 SNESSolve() at /global/u2/m/madams/petsc/src/snes/interface/snes.c:4689 [49]PETSC ERROR: #14 main() at ex13.c:178 [49]PETSC ERROR: PETSc Option Table entries: [49]PETSC ERROR: -benchmark_it 10 [49]PETSC ERROR: -dm_distribute [49]PETSC ERROR: -dm_mat_type aijkokkos [49]PETSC ERROR: -dm_plex_box_faces 4,4,4 [49]PETSC ERROR: -dm_plex_box_lower 0,0,0 [49]PETSC ERROR: -dm_plex_box_upper 2,2,2 [49]PETSC ERROR: -dm_plex_dim 3 [49]PETSC ERROR: -dm_plex_simplex 0 [49]PETSC ERROR: -dm_refine 4 [49]PETSC ERROR: -dm_vec_type kokkos [49]PETSC ERROR: -dm_view [49]PETSC ERROR: -ksp_converged_reason [49]PETSC ERROR: -ksp_max_it 200 
[49]PETSC ERROR: -ksp_norm_type unpreconditioned [49]PETSC ERROR: -ksp_rtol 1.e-12 [49]PETSC ERROR: -ksp_type cg [49]PETSC ERROR: -log_view [42]PETSC ERROR: /global/u2/m/madams/petsc/src/snes/tests/data/../ex13 on a arch-perlmutter-opt-gcc-kokkos-cuda named nid001125 by madams Sun Oct 30 05:30:05 2022 [42]PETSC ERROR: Configure options --CFLAGS=" -g -DPETSC_HAVE_KOKKOS_KERNELS_GMRES" --CXXFLAGS=" -g -DPETSC_HAVE_KOKKOS_KERNELS_GMRES" --CUDAFLAGS="-g -Xcompiler -rdynamic" --with-cc=cc --with-cxx=CC --with-fc=ftn --download-hypre=1 --download-amgx --with-fortran-bindings=0 --COPTFLAGS=" -O" --CXXOPTFLAGS=" -O" --FOPTFLAGS=" -O" --with-debugging=0 --with-cuda=1 --with-cuda-arch=80 --with-mpiexec=srun --with-batch=0 --download-p4est=1 --with-zlib=1 --download-kokkos --download-kokkos-kernels --with-kokkos-kernels-tpl=0 --with-make-np=8 PETSC_ARCH=arch-perlmutter-opt-gcc-kokkos-cuda [42]PETSC ERROR: #1 MatProductSymbolic_MPIAIJKokkos_AB() at /global/u2/m/madams/petsc/src/mat/impls/aij/mpi/kokkos/mpiaijkok.kokkos.cxx:918 [25]PETSC ERROR: -pc_gamg_coarse_grid_layout_type compact [25]PETSC ERROR: -pc_gamg_esteig_ksp_max_it 10 [25]PETSC ERROR: -pc_gamg_esteig_ksp_type cg [25]PETSC ERROR: -pc_gamg_process_eq_limit 400 [25]PETSC ERROR: -pc_gamg_repartition false [25]PETSC ERROR: -pc_gamg_reuse_interpolation true [25]PETSC ERROR: -pc_gamg_threshold 0.01 [25]PETSC ERROR: -pc_type gamg [25]PETSC ERROR: -petscpartitioner_simple_node_grid 2,2,2 [25]PETSC ERROR: -petscpartitioner_simple_process_grid 2,2,2 [25]PETSC ERROR: -petscpartitioner_type simple [25]PETSC ERROR: -potential_petscspace_degree 2 [25]PETSC ERROR: -snes_lag_jacobian -2 [25]PETSC ERROR: -snes_max_it 1 [25]PETSC ERROR: -snes_rtol 1.e-8 [25]PETSC ERROR: -snes_type ksponly [25]PETSC ERROR: -use_gpu_aware_mpi 0 [49]PETSC ERROR: -mat_type aijkokkos [49]PETSC ERROR: -mg_levels_ksp_chebyshev_esteig 0,0.05,0,1.05 [49]PETSC ERROR: -mg_levels_ksp_type chebyshev [49]PETSC ERROR: -mg_levels_pc_type jacobi [49]PETSC ERROR: 
-pc_gamg_aggressive_coarsening 1 [49]PETSC ERROR: -pc_gamg_coarse_eq_limit 100 [49]PETSC ERROR: -pc_gamg_coarse_grid_layout_type compact [49]PETSC ERROR: -pc_gamg_esteig_ksp_max_it 10 [49]PETSC ERROR: -pc_gamg_esteig_ksp_type cg [49]PETSC ERROR: -pc_gamg_process_eq_limit 400 [49]PETSC ERROR: -pc_gamg_repartition false [49]PETSC ERROR: -pc_gamg_reuse_interpolation true [49]PETSC ERROR: -pc_gamg_threshold 0.01 [49]PETSC ERROR: -pc_type gamg [49]PETSC ERROR: -petscpartitioner_simple_node_grid 2,2,2 [49]PETSC ERROR: -petscpartitioner_simple_process_grid 2,2,2 [49]PETSC ERROR: -petscpartitioner_type simple [49]PETSC ERROR: -potential_petscspace_degree 2 [49]PETSC ERROR: -snes_lag_jacobian -2 [42]PETSC ERROR: #2 MatProductSymbolic_MPIAIJKokkos() at /global/u2/m/madams/petsc/src/mat/impls/aij/mpi/kokkos/mpiaijkok.kokkos.cxx:1138 [42]PETSC ERROR: #3 MatProductSymbolic() at /global/u2/m/madams/petsc/src/mat/interface/matproduct.c:793 [42]PETSC ERROR: #4 MatProduct_Private() at /global/u2/m/madams/petsc/src/mat/interface/matrix.c:9820 [42]PETSC ERROR: #5 MatMatMult() at /global/u2/m/madams/petsc/src/mat/interface/matrix.c:9897 [42]PETSC ERROR: #6 PCGAMGOptProlongator_AGG() at /global/u2/m/madams/petsc/src/ksp/pc/impls/gamg/agg.c:769 [42]PETSC ERROR: #7 PCSetUp_GAMG() at /global/u2/m/madams/petsc/src/ksp/pc/impls/gamg/gamg.c:639 [42]PETSC ERROR: #8 PCSetUp() at /global/u2/m/madams/petsc/src/ksp/pc/interface/precon.c:994 [42]PETSC ERROR: #9 KSPSetUp() at /global/u2/m/madams/petsc/src/ksp/ksp/interface/itfunc.c:406 [42]PETSC ERROR: #10 KSPSolve_Private() at /global/u2/m/madams/petsc/src/ksp/ksp/interface/itfunc.c:825 [50]PETSC ERROR: --------------------- Error Message -------------------------------------------------------------- [50]PETSC ERROR: No support for this operation for this object type [50]PETSC ERROR: No method productsymbolic for Mat of type (null) [50]PETSC ERROR: WARNING! There are option(s) set that were not used! 
Could be the program crashed before they were used or a spelling mistake, etc! [50]PETSC ERROR: Option left: name:-ksp_converged_reason (no value) [50]PETSC ERROR: Option left: name:-mg_levels_ksp_chebyshev_esteig value: 0,0.05,0,1.05 [50]PETSC ERROR: Option left: name:-mg_levels_ksp_type value: chebyshev [50]PETSC ERROR: Option left: name:-mg_levels_pc_type value: jacobi [50]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting. [50]PETSC ERROR: Petsc Development GIT revision: v3.18.1-122-g50ed725bd85 GIT Date: 2022-10-28 19:54:01 +0000 [42]PETSC ERROR: #11 KSPSolve() at /global/u2/m/madams/petsc/src/ksp/ksp/interface/itfunc.c:1071 [42]PETSC ERROR: #12 SNESSolve_KSPONLY() at /global/u2/m/madams/petsc/src/snes/impls/ksponly/ksponly.c:48 [42]PETSC ERROR: #13 SNESSolve() at /global/u2/m/madams/petsc/src/snes/interface/snes.c:4689 [42]PETSC ERROR: #14 main() at ex13.c:178 [42]PETSC ERROR: PETSc Option Table entries: [42]PETSC ERROR: -benchmark_it 10 [42]PETSC ERROR: -dm_distribute [42]PETSC ERROR: -dm_mat_type aijkokkos [42]PETSC ERROR: -dm_plex_box_faces 4,4,4 [42]PETSC ERROR: -dm_plex_box_lower 0,0,0 [42]PETSC ERROR: -dm_plex_box_upper 2,2,2 [42]PETSC ERROR: -dm_plex_dim 3 [42]PETSC ERROR: -dm_plex_simplex 0 [42]PETSC ERROR: -dm_refine 4 [42]PETSC ERROR: -dm_vec_type kokkos [50]PETSC ERROR: /global/u2/m/madams/petsc/src/snes/tests/data/../ex13 on a arch-perlmutter-opt-gcc-kokkos-cuda named nid001128 by madams Sun Oct 30 05:30:05 2022 [50]PETSC ERROR: Configure options --CFLAGS=" -g -DPETSC_HAVE_KOKKOS_KERNELS_GMRES" --CXXFLAGS=" -g -DPETSC_HAVE_KOKKOS_KERNELS_GMRES" --CUDAFLAGS="-g -Xcompiler -rdynamic" --with-cc=cc --with-cxx=CC --with-fc=ftn --download-hypre=1 --download-amgx --with-fortran-bindings=0 --COPTFLAGS=" -O" --CXXOPTFLAGS=" -O" --FOPTFLAGS=" -O" --with-debugging=0 --with-cuda=1 --with-cuda-arch=80 --with-mpiexec=srun --with-batch=0 --download-p4est=1 --with-zlib=1 --download-kokkos --download-kokkos-kernels --with-kokkos-kernels-tpl=0 
--with-make-np=8 PETSC_ARCH=arch-perlmutter-opt-gcc-kokkos-cuda [50]PETSC ERROR: #1 MatProductSymbolic_MPIAIJKokkos_AB() at /global/u2/m/madams/petsc/src/mat/impls/aij/mpi/kokkos/mpiaijkok.kokkos.cxx:918 [51]PETSC ERROR: #2 MatProductSymbolic_MPIAIJKokkos() at /global/u2/m/madams/petsc/src/mat/impls/aij/mpi/kokkos/mpiaijkok.kokkos.cxx:1138 [51]PETSC ERROR: #3 MatProductSymbolic() at /global/u2/m/madams/petsc/src/mat/interface/matproduct.c:793 [51]PETSC ERROR: #4 MatProduct_Private() at /global/u2/m/madams/petsc/src/mat/interface/matrix.c:9820 [51]PETSC ERROR: #5 MatMatMult() at /global/u2/m/madams/petsc/src/mat/interface/matrix.c:9897 [51]PETSC ERROR: #6 PCGAMGOptProlongator_AGG() at /global/u2/m/madams/petsc/src/ksp/pc/impls/gamg/agg.c:769 [51]PETSC ERROR: #7 PCSetUp_GAMG() at /global/u2/m/madams/petsc/src/ksp/pc/impls/gamg/gamg.c:639 [51]PETSC ERROR: #8 PCSetUp() at /global/u2/m/madams/petsc/src/ksp/pc/interface/precon.c:994 [51]PETSC ERROR: #9 KSPSetUp() at /global/u2/m/madams/petsc/src/ksp/ksp/interface/itfunc.c:406 [51]PETSC ERROR: #10 KSPSolve_Private() at /global/u2/m/madams/petsc/src/ksp/ksp/interface/itfunc.c:825 [51]PETSC ERROR: #11 KSPSolve() at /global/u2/m/madams/petsc/src/ksp/ksp/interface/itfunc.c:1071 [51]PETSC ERROR: #12 SNESSolve_KSPONLY() at /global/u2/m/madams/petsc/src/snes/impls/ksponly/ksponly.c:48 [51]PETSC ERROR: #13 SNESSolve() at /global/u2/m/madams/petsc/src/snes/interface/snes.c:4689 [51]PETSC ERROR: #14 main() at ex13.c:178 [51]PETSC ERROR: PETSc Option Table entries: [51]PETSC ERROR: -benchmark_it 10 [51]PETSC ERROR: -dm_distribute [51]PETSC ERROR: -dm_mat_type aijkokkos [51]PETSC ERROR: -dm_plex_box_faces 4,4,4 [51]PETSC ERROR: -dm_plex_box_lower 0,0,0 [51]PETSC ERROR: -dm_plex_box_upper 2,2,2 [51]PETSC ERROR: -dm_plex_dim 3 [51]PETSC ERROR: -dm_plex_simplex 0 [51]PETSC ERROR: -dm_refine 4 [51]PETSC ERROR: -dm_vec_type kokkos [51]PETSC ERROR: -dm_view [51]PETSC ERROR: -ksp_converged_reason [51]PETSC ERROR: -ksp_max_it 200 [51]PETSC 
ERROR: -ksp_norm_type unpreconditioned [51]PETSC ERROR: -ksp_rtol 1.e-12 [51]PETSC ERROR: -ksp_type cg [51]PETSC ERROR: -log_view [51]PETSC ERROR: -mat_type aijkokkos [51]PETSC ERROR: -mg_levels_ksp_chebyshev_esteig 0,0.05,0,1.05 [51]PETSC ERROR: -mg_levels_ksp_type chebyshev [51]PETSC ERROR: -mg_levels_pc_type jacobi [51]PETSC ERROR: -pc_gamg_aggressive_coarsening 1 [51]PETSC ERROR: -pc_gamg_coarse_eq_limit 100 [51]PETSC ERROR: -pc_gamg_coarse_grid_layout_type compact [51]PETSC ERROR: -pc_gamg_esteig_ksp_max_it 10 [51]PETSC ERROR: -pc_gamg_esteig_ksp_type cg [51]PETSC ERROR: -pc_gamg_process_eq_limit 400 [51]PETSC ERROR: -pc_gamg_repartition false [51]PETSC ERROR: -pc_gamg_reuse_interpolation true [51]PETSC ERROR: -pc_gamg_threshold 0.01 [51]PETSC ERROR: -pc_type gamg [51]PETSC ERROR: -petscpartitioner_simple_node_grid 2,2,2 [51]PETSC ERROR: -petscpartitioner_simple_process_grid 2,2,2 [51]PETSC ERROR: -petscpartitioner_type simple [51]PETSC ERROR: -potential_petscspace_degree 2 [51]PETSC ERROR: -snes_lag_jacobian -2 [51]PETSC ERROR: -snes_max_it 1 [51]PETSC ERROR: -snes_rtol 1.e-8 [51]PETSC ERROR: -snes_type ksponly [51]PETSC ERROR: -use_gpu_aware_mpi 0 [51]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-maint at mcs.anl.gov---------- [22]PETSC ERROR: -dm_plex_box_upper 2,2,2 [22]PETSC ERROR: -dm_plex_dim 3 [22]PETSC ERROR: -dm_plex_simplex 0 [22]PETSC ERROR: -dm_refine 4 [22]PETSC ERROR: -dm_vec_type kokkos [22]PETSC ERROR: -dm_view [22]PETSC ERROR: -ksp_converged_reason [22]PETSC ERROR: -ksp_max_it 200 [22]PETSC ERROR: -ksp_norm_type unpreconditioned [22]PETSC ERROR: -ksp_rtol 1.e-12 [22]PETSC ERROR: -ksp_type cg [22]PETSC ERROR: -log_view [22]PETSC ERROR: -mat_type aijkokkos [22]PETSC ERROR: -mg_levels_ksp_chebyshev_esteig 0,0.05,0,1.05 [22]PETSC ERROR: -mg_levels_ksp_type chebyshev [22]PETSC ERROR: -mg_levels_pc_type jacobi [22]PETSC ERROR: -pc_gamg_aggressive_coarsening 1 [22]PETSC ERROR: -pc_gamg_coarse_eq_limit 
100 [22]PETSC ERROR: -pc_gamg_coarse_grid_layout_type compact [22]PETSC ERROR: -pc_gamg_esteig_ksp_max_it 10 [22]PETSC ERROR: -pc_gamg_esteig_ksp_type cg [22]PETSC ERROR: -pc_gamg_process_eq_limit 400 [22]PETSC ERROR: -pc_gamg_repartition false [22]PETSC ERROR: -pc_gamg_reuse_interpolation true [22]PETSC ERROR: -pc_gamg_threshold 0.01 [22]PETSC ERROR: -pc_type gamg [22]PETSC ERROR: -petscpartitioner_simple_node_grid 2,2,2 [22]PETSC ERROR: -petscpartitioner_simple_process_grid 2,2,2 [22]PETSC ERROR: -petscpartitioner_type simple [22]PETSC ERROR: -potential_petscspace_degree 2 [22]PETSC ERROR: -snes_lag_jacobian -2 [22]PETSC ERROR: -snes_max_it 1 [22]PETSC ERROR: -snes_rtol 1.e-8 [22]PETSC ERROR: -snes_type ksponly [22]PETSC ERROR: -use_gpu_aware_mpi 0 [22]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-maint at mcs.anl.gov---------- MPICH ERROR [Rank 22] [job id 3522949.0] [Sun Oct 30 05:30:12 2022] [nid001120] - Abort(56) (rank 0 in comm 16): application called MPI_Abort(MPI_COMM_SELF, 56) - process 0 [23]PETSC ERROR: /global/u2/m/madams/petsc/src/snes/tests/data/../ex13 on a arch-perlmutter-opt-gcc-kokkos-cuda named nid001120 by madams Sun Oct 30 05:30:05 2022 [23]PETSC ERROR: Configure options --CFLAGS=" -g -DPETSC_HAVE_KOKKOS_KERNELS_GMRES" --CXXFLAGS=" -g -DPETSC_HAVE_KOKKOS_KERNELS_GMRES" --CUDAFLAGS="-g -Xcompiler -rdynamic" --with-cc=cc --with-cxx=CC --with-fc=ftn --download-hypre=1 --download-amgx --with-fortran-bindings=0 --COPTFLAGS=" -O" --CXXOPTFLAGS=" -O" --FOPTFLAGS=" -O" --with-debugging=0 --with-cuda=1 --with-cuda-arch=80 --with-mpiexec=srun --with-batch=0 --download-p4est=1 --with-zlib=1 --download-kokkos --download-kokkos-kernels --with-kokkos-kernels-tpl=0 --with-make-np=8 PETSC_ARCH=arch-perlmutter-opt-gcc-kokkos-cuda [23]PETSC ERROR: #1 MatProductSymbolic_MPIAIJKokkos_AB() at /global/u2/m/madams/petsc/src/mat/impls/aij/mpi/kokkos/mpiaijkok.kokkos.cxx:918 [23]PETSC ERROR: #2 
MatProductSymbolic_MPIAIJKokkos() at /global/u2/m/madams/petsc/src/mat/impls/aij/mpi/kokkos/mpiaijkok.kokkos.cxx:1138 [23]PETSC ERROR: #3 MatProductSymbolic() at /global/u2/m/madams/petsc/src/mat/interface/matproduct.c:793 [23]PETSC ERROR: #4 MatProduct_Private() at /global/u2/m/madams/petsc/src/mat/interface/matrix.c:9820 [23]PETSC ERROR: #5 MatMatMult() at /global/u2/m/madams/petsc/src/mat/interface/matrix.c:9897 [23]PETSC ERROR: #6 PCGAMGOptProlongator_AGG() at /global/u2/m/madams/petsc/src/ksp/pc/impls/gamg/agg.c:769 [23]PETSC ERROR: #7 PCSetUp_GAMG() at /global/u2/m/madams/petsc/src/ksp/pc/impls/gamg/gamg.c:639 [23]PETSC ERROR: #8 PCSetUp() at /global/u2/m/madams/petsc/src/ksp/pc/interface/precon.c:994 [23]PETSC ERROR: #9 KSPSetUp() at /global/u2/m/madams/petsc/src/ksp/ksp/interface/itfunc.c:406 [23]PETSC ERROR: #10 KSPSolve_Private() at /global/u2/m/madams/petsc/src/ksp/ksp/interface/itfunc.c:825 [12]PETSC ERROR: -mg_levels_ksp_chebyshev_esteig 0,0.05,0,1.05 [12]PETSC ERROR: -mg_levels_ksp_type chebyshev [12]PETSC ERROR: -mg_levels_pc_type jacobi [12]PETSC ERROR: -pc_gamg_aggressive_coarsening 1 [12]PETSC ERROR: -pc_gamg_coarse_eq_limit 100 [12]PETSC ERROR: -pc_gamg_coarse_grid_layout_type compact [12]PETSC ERROR: -pc_gamg_esteig_ksp_max_it 10 [12]PETSC ERROR: -pc_gamg_esteig_ksp_type cg [12]PETSC ERROR: -pc_gamg_process_eq_limit 400 [12]PETSC ERROR: -pc_gamg_repartition false [12]PETSC ERROR: -pc_gamg_reuse_interpolation true [12]PETSC ERROR: -pc_gamg_threshold 0.01 [12]PETSC ERROR: -pc_type gamg [12]PETSC ERROR: -petscpartitioner_simple_node_grid 2,2,2 [12]PETSC ERROR: -petscpartitioner_simple_process_grid 2,2,2 [12]PETSC ERROR: -petscpartitioner_type simple [12]PETSC ERROR: -potential_petscspace_degree 2 [12]PETSC ERROR: -snes_lag_jacobian -2 [12]PETSC ERROR: -snes_max_it 1 [12]PETSC ERROR: -snes_rtol 1.e-8 [12]PETSC ERROR: -snes_type ksponly [12]PETSC ERROR: -use_gpu_aware_mpi 0 [12]PETSC ERROR: ----------------End of Error Message -------send entire 
error message to petsc-maint at mcs.anl.gov---------- MPICH ERROR [Rank 12] [job id 3522949.0] [Sun Oct 30 05:30:12 2022] [nid001117] - Abort(56) (rank 0 in comm 16): application called MPI_Abort(MPI_COMM_SELF, 56) - process 0
[... every rank (12-63) aborted with the same error message, stack trace, and option table; the identical per-rank copies are snipped and one representative copy follows ...]
[56]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[56]PETSC ERROR: No support for this operation for this object type
[56]PETSC ERROR: No method productsymbolic for Mat of type (null)
[56]PETSC ERROR: WARNING! There are option(s) set that were not used! Could be the program crashed before they were used or a spelling mistake, etc!
[56]PETSC ERROR: Option left: name:-ksp_converged_reason (no value)
[56]PETSC ERROR: Option left: name:-mg_levels_ksp_chebyshev_esteig value: 0,0.05,0,1.05
[56]PETSC ERROR: Option left: name:-mg_levels_ksp_type value: chebyshev
[56]PETSC ERROR: Option left: name:-mg_levels_pc_type value: jacobi
[56]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
[56]PETSC ERROR: Petsc Development GIT revision: v3.18.1-122-g50ed725bd85 GIT Date: 2022-10-28 19:54:01 +0000
[56]PETSC ERROR: /global/u2/m/madams/petsc/src/snes/tests/data/../ex13 on a arch-perlmutter-opt-gcc-kokkos-cuda named nid001129 by madams Sun Oct 30 05:30:05 2022
[56]PETSC ERROR: Configure options --CFLAGS=" -g -DPETSC_HAVE_KOKKOS_KERNELS_GMRES" --CXXFLAGS=" -g -DPETSC_HAVE_KOKKOS_KERNELS_GMRES" --CUDAFLAGS="-g -Xcompiler -rdynamic" --with-cc=cc --with-cxx=CC --with-fc=ftn --download-hypre=1 --download-amgx --with-fortran-bindings=0 --COPTFLAGS=" -O" --CXXOPTFLAGS=" -O" --FOPTFLAGS=" -O" --with-debugging=0 --with-cuda=1 --with-cuda-arch=80 --with-mpiexec=srun --with-batch=0 --download-p4est=1 --with-zlib=1 --download-kokkos --download-kokkos-kernels --with-kokkos-kernels-tpl=0 --with-make-np=8 PETSC_ARCH=arch-perlmutter-opt-gcc-kokkos-cuda
[56]PETSC ERROR: #1 MatProductSymbolic_MPIAIJKokkos_AB() at /global/u2/m/madams/petsc/src/mat/impls/aij/mpi/kokkos/mpiaijkok.kokkos.cxx:918
[56]PETSC ERROR: #2 MatProductSymbolic_MPIAIJKokkos() at /global/u2/m/madams/petsc/src/mat/impls/aij/mpi/kokkos/mpiaijkok.kokkos.cxx:1138
[56]PETSC ERROR: #3 MatProductSymbolic() at /global/u2/m/madams/petsc/src/mat/interface/matproduct.c:793
[56]PETSC ERROR: #4 MatProduct_Private() at /global/u2/m/madams/petsc/src/mat/interface/matrix.c:9820
[56]PETSC ERROR: #5 MatMatMult() at /global/u2/m/madams/petsc/src/mat/interface/matrix.c:9897
[56]PETSC ERROR: #6 PCGAMGOptProlongator_AGG() at /global/u2/m/madams/petsc/src/ksp/pc/impls/gamg/agg.c:769
[56]PETSC ERROR: #7 PCSetUp_GAMG() at /global/u2/m/madams/petsc/src/ksp/pc/impls/gamg/gamg.c:639
[56]PETSC ERROR: #8 PCSetUp() at /global/u2/m/madams/petsc/src/ksp/pc/interface/precon.c:994
[56]PETSC ERROR: #9 KSPSetUp() at /global/u2/m/madams/petsc/src/ksp/ksp/interface/itfunc.c:406
[56]PETSC ERROR: #10 KSPSolve_Private() at /global/u2/m/madams/petsc/src/ksp/ksp/interface/itfunc.c:825
[56]PETSC ERROR: #11 KSPSolve() at /global/u2/m/madams/petsc/src/ksp/ksp/interface/itfunc.c:1071
[56]PETSC ERROR: #12 SNESSolve_KSPONLY() at /global/u2/m/madams/petsc/src/snes/impls/ksponly/ksponly.c:48
[56]PETSC ERROR: #13 SNESSolve() at /global/u2/m/madams/petsc/src/snes/interface/snes.c:4689
[56]PETSC ERROR: #14 main() at ex13.c:178
[56]PETSC ERROR: PETSc Option Table entries:
[56]PETSC ERROR: -benchmark_it 10
[56]PETSC ERROR: -dm_distribute
[56]PETSC ERROR: -dm_mat_type aijkokkos
[56]PETSC ERROR: -dm_plex_box_faces 4,4,4
[56]PETSC ERROR: -dm_plex_box_lower 0,0,0
[56]PETSC ERROR: -dm_plex_box_upper 2,2,2
[56]PETSC ERROR: -dm_plex_dim 3
[56]PETSC ERROR: -dm_plex_simplex 0
[56]PETSC ERROR: -dm_refine 4
[56]PETSC ERROR: -dm_vec_type kokkos
[56]PETSC ERROR: -dm_view
[56]PETSC ERROR: -ksp_converged_reason
[56]PETSC ERROR: -ksp_max_it 200
[56]PETSC ERROR: -ksp_norm_type unpreconditioned
[56]PETSC ERROR: -ksp_rtol 1.e-12
[56]PETSC ERROR: -ksp_type cg
[56]PETSC ERROR: -log_view
[56]PETSC ERROR: -mat_type aijkokkos
[56]PETSC ERROR: -mg_levels_ksp_chebyshev_esteig 0,0.05,0,1.05
[56]PETSC ERROR: -mg_levels_ksp_type chebyshev
[56]PETSC ERROR: -mg_levels_pc_type jacobi
[56]PETSC ERROR: -pc_gamg_aggressive_coarsening 1
[56]PETSC ERROR: -pc_gamg_coarse_eq_limit 100
[56]PETSC ERROR: -pc_gamg_coarse_grid_layout_type compact
[56]PETSC ERROR: -pc_gamg_esteig_ksp_max_it 10
[56]PETSC ERROR: -pc_gamg_esteig_ksp_type cg
[56]PETSC ERROR: -pc_gamg_process_eq_limit 400
[56]PETSC ERROR: -pc_gamg_repartition false
[56]PETSC ERROR: -pc_gamg_reuse_interpolation true
[56]PETSC ERROR: -pc_gamg_threshold 0.01
[56]PETSC ERROR: -pc_type gamg
[56]PETSC ERROR: -petscpartitioner_simple_node_grid 2,2,2
[56]PETSC ERROR: -petscpartitioner_simple_process_grid 2,2,2
[56]PETSC ERROR: -petscpartitioner_type simple
[56]PETSC ERROR: -potential_petscspace_degree 2
[56]PETSC ERROR: -snes_lag_jacobian -2
[56]PETSC ERROR: -snes_max_it 1
[56]PETSC ERROR: -snes_rtol 1.e-8
[56]PETSC ERROR: -snes_type ksponly
[56]PETSC ERROR: -use_gpu_aware_mpi 0
[56]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-maint at mcs.anl.gov----------
MPICH ERROR [Rank 56] [job id 3522949.0] [Sun Oct 30 05:30:12 2022] [nid001129] - Abort(56) (rank 0 in comm 16): application called MPI_Abort(MPI_COMM_SELF, 56) - process 0
Kokkos::Cuda ERROR: Failed to call Kokkos::Cuda::finalize()
[... identical output from the remaining ranks snipped ...]
[48]PETSC ERROR: #7 PCSetUp_GAMG() at 
/global/u2/m/madams/petsc/src/ksp/pc/impls/gamg/gamg.c:639 [48]PETSC ERROR: #8 PCSetUp() at /global/u2/m/madams/petsc/src/ksp/pc/interface/precon.c:994 [48]PETSC ERROR: #9 KSPSetUp() at /global/u2/m/madams/petsc/src/ksp/ksp/interface/itfunc.c:406 [48]PETSC ERROR: #10 KSPSolve_Private() at /global/u2/m/madams/petsc/src/ksp/ksp/interface/itfunc.c:825 [48]PETSC ERROR: #11 KSPSolve() at /global/u2/m/madams/petsc/src/ksp/ksp/interface/itfunc.c:1071 [48]PETSC ERROR: #12 SNESSolve_KSPONLY() at /global/u2/m/madams/petsc/src/snes/impls/ksponly/ksponly.c:48 [48]PETSC ERROR: #13 SNESSolve() at /global/u2/m/madams/petsc/src/snes/interface/snes.c:4689 [48]PETSC ERROR: #14 main() at ex13.c:178 [48]PETSC ERROR: PETSc Option Table entries: [48]PETSC ERROR: -benchmark_it 10 [49]PETSC ERROR: -snes_max_it 1 [49]PETSC ERROR: -snes_rtol 1.e-8 [49]PETSC ERROR: -snes_type ksponly [49]PETSC ERROR: -use_gpu_aware_mpi 0 [49]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-maint at mcs.anl.gov---------- MPICH ERROR [Rank 49] [job id 3522949.0] [Sun Oct 30 05:30:12 2022] [nid001128] - Abort(56) (rank 0 in comm 16): application called MPI_Abort(MPI_COMM_SELF, 56) - process 0 [50]PETSC ERROR: #2 MatProductSymbolic_MPIAIJKokkos() at /global/u2/m/madams/petsc/src/mat/impls/aij/mpi/kokkos/mpiaijkok.kokkos.cxx:1138 [50]PETSC ERROR: #3 MatProductSymbolic() at /global/u2/m/madams/petsc/src/mat/interface/matproduct.c:793 [50]PETSC ERROR: #4 MatProduct_Private() at /global/u2/m/madams/petsc/src/mat/interface/matrix.c:9820 [50]PETSC ERROR: #5 MatMatMult() at /global/u2/m/madams/petsc/src/mat/interface/matrix.c:9897 [50]PETSC ERROR: #6 PCGAMGOptProlongator_AGG() at /global/u2/m/madams/petsc/src/ksp/pc/impls/gamg/agg.c:769 [50]PETSC ERROR: #7 PCSetUp_GAMG() at /global/u2/m/madams/petsc/src/ksp/pc/impls/gamg/gamg.c:639 [50]PETSC ERROR: #8 PCSetUp() at /global/u2/m/madams/petsc/src/ksp/pc/interface/precon.c:994 [50]PETSC ERROR: #9 KSPSetUp() at 
/global/u2/m/madams/petsc/src/ksp/ksp/interface/itfunc.c:406 [50]PETSC ERROR: #10 KSPSolve_Private() at /global/u2/m/madams/petsc/src/ksp/ksp/interface/itfunc.c:825 [50]PETSC ERROR: #11 KSPSolve() at /global/u2/m/madams/petsc/src/ksp/ksp/interface/itfunc.c:1071 [50]PETSC ERROR: #12 SNESSolve_KSPONLY() at /global/u2/m/madams/petsc/src/snes/impls/ksponly/ksponly.c:48 [50]PETSC ERROR: #13 SNESSolve() at /global/u2/m/madams/petsc/src/snes/interface/snes.c:4689 [50]PETSC ERROR: #14 main() at ex13.c:178 [50]PETSC ERROR: PETSc Option Table entries: [50]PETSC ERROR: -benchmark_it 10 [50]PETSC ERROR: -dm_distribute [50]PETSC ERROR: -dm_mat_type aijkokkos [50]PETSC ERROR: -dm_plex_box_faces 4,4,4 [50]PETSC ERROR: -dm_plex_box_lower 0,0,0 [50]PETSC ERROR: -dm_plex_box_upper 2,2,2 [50]PETSC ERROR: -dm_plex_dim 3 [50]PETSC ERROR: -dm_plex_simplex 0 [50]PETSC ERROR: -dm_refine 4 [50]PETSC ERROR: -dm_vec_type kokkos [50]PETSC ERROR: -dm_view [50]PETSC ERROR: -ksp_converged_reason [50]PETSC ERROR: -ksp_max_it 200 [50]PETSC ERROR: -ksp_norm_type unpreconditioned [50]PETSC ERROR: -ksp_rtol 1.e-12 [50]PETSC ERROR: -ksp_type cg [50]PETSC ERROR: -log_view [50]PETSC ERROR: -mat_type aijkokkos [50]PETSC ERROR: -mg_levels_ksp_chebyshev_esteig 0,0.05,0,1.05 [50]PETSC ERROR: -mg_levels_ksp_type chebyshev [50]PETSC ERROR: -mg_levels_pc_type jacobi [50]PETSC ERROR: -pc_gamg_aggressive_coarsening 1 [50]PETSC ERROR: -pc_gamg_coarse_eq_limit 100 [50]PETSC ERROR: -pc_gamg_coarse_grid_layout_type compact MPICH ERROR [Rank 51] [job id 3522949.0] [Sun Oct 30 05:30:12 2022] [nid001128] - Abort(56) (rank 0 in comm 16): application called MPI_Abort(MPI_COMM_SELF, 56) - process 0 Kokkos::Cuda ERROR: Failed to call Kokkos::Cuda::finalize() Kokkos::Cuda ERROR: Failed to call Kokkos::Cuda::finalize() [48]PETSC ERROR: -dm_distribute [48]PETSC ERROR: -dm_mat_type aijkokkos [48]PETSC ERROR: -dm_plex_box_faces 4,4,4 [48]PETSC ERROR: -dm_plex_box_lower 0,0,0 [48]PETSC ERROR: -dm_plex_box_upper 2,2,2 [48]PETSC 
ERROR: -dm_plex_dim 3 [48]PETSC ERROR: -dm_plex_simplex 0 [48]PETSC ERROR: -dm_refine 4 [48]PETSC ERROR: -dm_vec_type kokkos [48]PETSC ERROR: -dm_view [48]PETSC ERROR: -ksp_converged_reason [48]PETSC ERROR: -ksp_max_it 200 [48]PETSC ERROR: -ksp_norm_type unpreconditioned [48]PETSC ERROR: -ksp_rtol 1.e-12 [48]PETSC ERROR: -ksp_type cg [48]PETSC ERROR: -log_view [48]PETSC ERROR: -mat_type aijkokkos [48]PETSC ERROR: -mg_levels_ksp_chebyshev_esteig 0,0.05,0,1.05 [48]PETSC ERROR: -mg_levels_ksp_type chebyshev [48]PETSC ERROR: -mg_levels_pc_type jacobi [48]PETSC ERROR: -pc_gamg_aggressive_coarsening 1 [48]PETSC ERROR: -pc_gamg_coarse_eq_limit 100 [48]PETSC ERROR: -pc_gamg_coarse_grid_layout_type compact [48]PETSC ERROR: -pc_gamg_esteig_ksp_max_it 10 [48]PETSC ERROR: -pc_gamg_esteig_ksp_type cg [16]PETSC ERROR: /global/u2/m/madams/petsc/src/snes/tests/data/../ex13 on a arch-perlmutter-opt-gcc-kokkos-cuda named nid001120 by madams Sun Oct 30 05:30:05 2022 [16]PETSC ERROR: Configure options --CFLAGS=" -g -DPETSC_HAVE_KOKKOS_KERNELS_GMRES" --CXXFLAGS=" -g -DPETSC_HAVE_KOKKOS_KERNELS_GMRES" --CUDAFLAGS="-g -Xcompiler -rdynamic" --with-cc=cc --with-cxx=CC --with-fc=ftn --download-hypre=1 --download-amgx --with-fortran-bindings=0 --COPTFLAGS=" -O" --CXXOPTFLAGS=" -O" --FOPTFLAGS=" -O" --with-debugging=0 --with-cuda=1 --with-cuda-arch=80 --with-mpiexec=srun --with-batch=0 --download-p4est=1 --with-zlib=1 --download-kokkos --download-kokkos-kernels --with-kokkos-kernels-tpl=0 --with-make-np=8 PETSC_ARCH=arch-perlmutter-opt-gcc-kokkos-cuda [16]PETSC ERROR: #1 MatProductSymbolic_MPIAIJKokkos_AB() at /global/u2/m/madams/petsc/src/mat/impls/aij/mpi/kokkos/mpiaijkok.kokkos.cxx:918 [17]PETSC ERROR: #4 MatProduct_Private() at /global/u2/m/madams/petsc/src/mat/interface/matrix.c:9820 [17]PETSC ERROR: #5 MatMatMult() at /global/u2/m/madams/petsc/src/mat/interface/matrix.c:9897 [17]PETSC ERROR: #6 PCGAMGOptProlongator_AGG() at /global/u2/m/madams/petsc/src/ksp/pc/impls/gamg/agg.c:769 
[17]PETSC ERROR: #7 PCSetUp_GAMG() at /global/u2/m/madams/petsc/src/ksp/pc/impls/gamg/gamg.c:639 [17]PETSC ERROR: #8 PCSetUp() at /global/u2/m/madams/petsc/src/ksp/pc/interface/precon.c:994 [17]PETSC ERROR: #9 KSPSetUp() at /global/u2/m/madams/petsc/src/ksp/ksp/interface/itfunc.c:406 [17]PETSC ERROR: #10 KSPSolve_Private() at /global/u2/m/madams/petsc/src/ksp/ksp/interface/itfunc.c:825 [17]PETSC ERROR: #11 KSPSolve() at /global/u2/m/madams/petsc/src/ksp/ksp/interface/itfunc.c:1071 [17]PETSC ERROR: #12 SNESSolve_KSPONLY() at /global/u2/m/madams/petsc/src/snes/impls/ksponly/ksponly.c:48 [17]PETSC ERROR: #13 SNESSolve() at /global/u2/m/madams/petsc/src/snes/interface/snes.c:4689 [33]PETSC ERROR: -snes_lag_jacobian -2 [33]PETSC ERROR: -snes_max_it 1 [33]PETSC ERROR: -snes_rtol 1.e-8 [33]PETSC ERROR: -snes_type ksponly [33]PETSC ERROR: -use_gpu_aware_mpi 0 [33]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-maint at mcs.anl.gov---------- MPICH ERROR [Rank 33] [job id 3522949.0] [Sun Oct 30 05:30:12 2022] [nid001124] - Abort(56) (rank 0 in comm 16): application called MPI_Abort(MPI_COMM_SELF, 56) - process 0 [34]PETSC ERROR: --------------------- Error Message -------------------------------------------------------------- [34]PETSC ERROR: No support for this operation for this object type [34]PETSC ERROR: No method productsymbolic for Mat of type (null) [34]PETSC ERROR: WARNING! There are option(s) set that were not used! Could be the program crashed before they were used or a spelling mistake, etc! [34]PETSC ERROR: Option left: name:-ksp_converged_reason (no value) [34]PETSC ERROR: Option left: name:-mg_levels_ksp_chebyshev_esteig value: 0,0.05,0,1.05 [34]PETSC ERROR: Option left: name:-mg_levels_ksp_type value: chebyshev [34]PETSC ERROR: Option left: name:-mg_levels_pc_type value: jacobi [34]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting. 
[34]PETSC ERROR: Petsc Development GIT revision: v3.18.1-122-g50ed725bd85 GIT Date: 2022-10-28 19:54:01 +0000 [34]PETSC ERROR: /global/u2/m/madams/petsc/src/snes/tests/data/../ex13 on a arch-perlmutter-opt-gcc-kokkos-cuda named nid001124 by madams Sun Oct 30 05:30:05 2022 [34]PETSC ERROR: Configure options --CFLAGS=" -g -DPETSC_HAVE_KOKKOS_KERNELS_GMRES" --CXXFLAGS=" -g -DPETSC_HAVE_KOKKOS_KERNELS_GMRES" --CUDAFLAGS="-g -Xcompiler -rdynamic" --with-cc=cc --with-cxx=CC --with-fc=ftn --download-hypre=1 --download-amgx --with-fortran-bindings=0 --COPTFLAGS=" -O" --CXXOPTFLAGS=" -O" --FOPTFLAGS=" -O" --with-debugging=0 --with-cuda=1 --with-cuda-arch=80 --with-mpiexec=srun --with-batch=0 --download-p4est=1 --with-zlib=1 --download-kokkos --download-kokkos-kernels --with-kokkos-kernels-tpl=0 --with-make-np=8 PETSC_ARCH=arch-perlmutter-opt-gcc-kokkos-cuda [34]PETSC ERROR: #1 MatProductSymbolic_MPIAIJKokkos_AB() at /global/u2/m/madams/petsc/src/mat/impls/aij/mpi/kokkos/mpiaijkok.kokkos.cxx:918 [34]PETSC ERROR: #2 MatProductSymbolic_MPIAIJKokkos() at /global/u2/m/madams/petsc/src/mat/impls/aij/mpi/kokkos/mpiaijkok.kokkos.cxx:1138 [34]PETSC ERROR: #3 MatProductSymbolic() at /global/u2/m/madams/petsc/src/mat/interface/matproduct.c:793 [34]PETSC ERROR: #4 MatProduct_Private() at /global/u2/m/madams/petsc/src/mat/interface/matrix.c:9820 [34]PETSC ERROR: #5 MatMatMult() at /global/u2/m/madams/petsc/src/mat/interface/matrix.c:9897 [34]PETSC ERROR: #6 PCGAMGOptProlongator_AGG() at /global/u2/m/madams/petsc/src/ksp/pc/impls/gamg/agg.c:769 [34]PETSC ERROR: #7 PCSetUp_GAMG() at /global/u2/m/madams/petsc/src/ksp/pc/impls/gamg/gamg.c:639 [34]PETSC ERROR: #8 PCSetUp() at /global/u2/m/madams/petsc/src/ksp/pc/interface/precon.c:994 [34]PETSC ERROR: #9 KSPSetUp() at /global/u2/m/madams/petsc/src/ksp/ksp/interface/itfunc.c:406 [34]PETSC ERROR: #10 KSPSolve_Private() at /global/u2/m/madams/petsc/src/ksp/ksp/interface/itfunc.c:825 [34]PETSC ERROR: #11 KSPSolve() at 
/global/u2/m/madams/petsc/src/ksp/ksp/interface/itfunc.c:1071 [34]PETSC ERROR: #12 SNESSolve_KSPONLY() at /global/u2/m/madams/petsc/src/snes/impls/ksponly/ksponly.c:48 [34]PETSC ERROR: #13 SNESSolve() at /global/u2/m/madams/petsc/src/snes/interface/snes.c:4689 [34]PETSC ERROR: #14 main() at ex13.c:178 [34]PETSC ERROR: PETSc Option Table entries: [34]PETSC ERROR: -benchmark_it 10 [34]PETSC ERROR: -dm_distribute [34]PETSC ERROR: -dm_mat_type aijkokkos [34]PETSC ERROR: -dm_plex_box_faces 4,4,4 [34]PETSC ERROR: -dm_plex_box_lower 0,0,0 [34]PETSC ERROR: -dm_plex_box_upper 2,2,2 [34]PETSC ERROR: -dm_plex_dim 3 [34]PETSC ERROR: -dm_plex_simplex 0 [34]PETSC ERROR: -dm_refine 4 [34]PETSC ERROR: -dm_vec_type kokkos [34]PETSC ERROR: -dm_view [34]PETSC ERROR: -ksp_converged_reason [34]PETSC ERROR: -ksp_max_it 200 [48]PETSC ERROR: -pc_gamg_process_eq_limit 400 [48]PETSC ERROR: -pc_gamg_repartition false [48]PETSC ERROR: -pc_gamg_reuse_interpolation true [48]PETSC ERROR: -pc_gamg_threshold 0.01 [48]PETSC ERROR: -pc_type gamg [48]PETSC ERROR: -petscpartitioner_simple_node_grid 2,2,2 [48]PETSC ERROR: -petscpartitioner_simple_process_grid 2,2,2 [48]PETSC ERROR: -petscpartitioner_type simple [48]PETSC ERROR: -potential_petscspace_degree 2 [44]PETSC ERROR: #12 SNESSolve_KSPONLY() at /global/u2/m/madams/petsc/src/snes/impls/ksponly/ksponly.c:48 [44]PETSC ERROR: #13 SNESSolve() at /global/u2/m/madams/petsc/src/snes/interface/snes.c:4689 [44]PETSC ERROR: #14 main() at ex13.c:178 [44]PETSC ERROR: PETSc Option Table entries: [44]PETSC ERROR: -benchmark_it 10 [44]PETSC ERROR: -dm_distribute [44]PETSC ERROR: -dm_mat_type aijkokkos [44]PETSC ERROR: -dm_plex_box_faces 4,4,4 [44]PETSC ERROR: -dm_plex_box_lower 0,0,0 [44]PETSC ERROR: -dm_plex_box_upper 2,2,2 [44]PETSC ERROR: -dm_plex_dim 3 [44]PETSC ERROR: -dm_plex_simplex 0 [44]PETSC ERROR: -dm_refine 4 [44]PETSC ERROR: -dm_vec_type kokkos [44]PETSC ERROR: -dm_view [44]PETSC ERROR: -ksp_converged_reason [44]PETSC ERROR: -ksp_max_it 200 
[44]PETSC ERROR: -ksp_norm_type unpreconditioned [44]PETSC ERROR: -ksp_rtol 1.e-12 [44]PETSC ERROR: -ksp_type cg [44]PETSC ERROR: -log_view [44]PETSC ERROR: -mat_type aijkokkos [44]PETSC ERROR: -mg_levels_ksp_chebyshev_esteig 0,0.05,0,1.05 Kokkos::Cuda ERROR: Failed to call Kokkos::Cuda::finalize() [44]PETSC ERROR: -mg_levels_ksp_type chebyshev [44]PETSC ERROR: -mg_levels_pc_type jacobi [44]PETSC ERROR: -pc_gamg_aggressive_coarsening 1 [44]PETSC ERROR: -pc_gamg_coarse_eq_limit 100 [44]PETSC ERROR: -pc_gamg_coarse_grid_layout_type compact [44]PETSC ERROR: -pc_gamg_esteig_ksp_max_it 10 [44]PETSC ERROR: -pc_gamg_esteig_ksp_type cg [44]PETSC ERROR: -pc_gamg_process_eq_limit 400 [44]PETSC ERROR: -pc_gamg_repartition false [44]PETSC ERROR: -pc_gamg_reuse_interpolation true [44]PETSC ERROR: -pc_gamg_threshold 0.01 [44]PETSC ERROR: -pc_type gamg [44]PETSC ERROR: -petscpartitioner_simple_node_grid 2,2,2 [44]PETSC ERROR: -petscpartitioner_simple_process_grid 2,2,2 [44]PETSC ERROR: -petscpartitioner_type simple [44]PETSC ERROR: -potential_petscspace_degree 2 [44]PETSC ERROR: -snes_lag_jacobian -2 [44]PETSC ERROR: -snes_max_it 1 [44]PETSC ERROR: -snes_rtol 1.e-8 [44]PETSC ERROR: -snes_type ksponly [44]PETSC ERROR: -use_gpu_aware_mpi 0 [50]PETSC ERROR: -pc_gamg_esteig_ksp_max_it 10 [50]PETSC ERROR: -pc_gamg_esteig_ksp_type cg [50]PETSC ERROR: -pc_gamg_process_eq_limit 400 [50]PETSC ERROR: -pc_gamg_repartition false [50]PETSC ERROR: -pc_gamg_reuse_interpolation true [50]PETSC ERROR: -pc_gamg_threshold 0.01 [50]PETSC ERROR: -pc_type gamg [50]PETSC ERROR: -petscpartitioner_simple_node_grid 2,2,2 [50]PETSC ERROR: -petscpartitioner_simple_process_grid 2,2,2 [50]PETSC ERROR: -petscpartitioner_type simple [50]PETSC ERROR: -potential_petscspace_degree 2 [50]PETSC ERROR: -snes_lag_jacobian -2 [50]PETSC ERROR: -snes_max_it 1 [50]PETSC ERROR: -snes_rtol 1.e-8 [50]PETSC ERROR: -snes_type ksponly [50]PETSC ERROR: -use_gpu_aware_mpi 0 [50]PETSC ERROR: ----------------End of Error Message 
-------send entire error message to petsc-maint at mcs.anl.gov---------- MPICH ERROR [Rank 50] [job id 3522949.0] [Sun Oct 30 05:30:12 2022] [nid001128] - Abort(56) (rank 0 in comm 16): application called MPI_Abort(MPI_COMM_SELF, 56) - process 0 [44]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-maint at mcs.anl.gov---------- MPICH ERROR [Rank 44] [job id 3522949.0] [Sun Oct 30 05:30:12 2022] [nid001125] - Abort(56) (rank 0 in comm 16): application called MPI_Abort(MPI_COMM_SELF, 56) - process 0 Kokkos::Cuda ERROR: Failed to call Kokkos::Cuda::finalize() [45]PETSC ERROR: --------------------- Error Message -------------------------------------------------------------- [45]PETSC ERROR: No support for this operation for this object type [45]PETSC ERROR: No method productsymbolic for Mat of type (null) [45]PETSC ERROR: WARNING! There are option(s) set that were not used! Could be the program crashed before they were used or a spelling mistake, etc! [45]PETSC ERROR: Option left: name:-ksp_converged_reason (no value) [45]PETSC ERROR: Option left: name:-mg_levels_ksp_chebyshev_esteig value: 0,0.05,0,1.05 [45]PETSC ERROR: Option left: name:-mg_levels_ksp_type value: chebyshev [45]PETSC ERROR: Option left: name:-mg_levels_pc_type value: jacobi [45]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting. 
[45]PETSC ERROR: Petsc Development GIT revision: v3.18.1-122-g50ed725bd85 GIT Date: 2022-10-28 19:54:01 +0000 [48]PETSC ERROR: -snes_lag_jacobian -2 [48]PETSC ERROR: -snes_max_it 1 [48]PETSC ERROR: -snes_rtol 1.e-8 [48]PETSC ERROR: -snes_type ksponly [48]PETSC ERROR: -use_gpu_aware_mpi 0 [48]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-maint at mcs.anl.gov---------- MPICH ERROR [Rank 48] [job id 3522949.0] [Sun Oct 30 05:30:12 2022] [nid001128] - Abort(56) (rank 0 in comm 16): application called MPI_Abort(MPI_COMM_SELF, 56) - process 0 Kokkos::Cuda ERROR: Failed to call Kokkos::Cuda::finalize() [58]PETSC ERROR: --------------------- Error Message -------------------------------------------------------------- [58]PETSC ERROR: No support for this operation for this object type [58]PETSC ERROR: No method productsymbolic for Mat of type (null) [58]PETSC ERROR: WARNING! There are option(s) set that were not used! Could be the program crashed before they were used or a spelling mistake, etc! [58]PETSC ERROR: Option left: name:-ksp_converged_reason (no value) [58]PETSC ERROR: Option left: name:-mg_levels_ksp_chebyshev_esteig value: 0,0.05,0,1.05 [58]PETSC ERROR: Option left: name:-mg_levels_ksp_type value: chebyshev [58]PETSC ERROR: Option left: name:-mg_levels_pc_type value: jacobi [58]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting. 
[58]PETSC ERROR: Petsc Development GIT revision: v3.18.1-122-g50ed725bd85 GIT Date: 2022-10-28 19:54:01 +0000 Kokkos::Cuda ERROR: Failed to call Kokkos::Cuda::finalize() [58]PETSC ERROR: /global/u2/m/madams/petsc/src/snes/tests/data/../ex13 on a arch-perlmutter-opt-gcc-kokkos-cuda named nid001129 by madams Sun Oct 30 05:30:05 2022 [58]PETSC ERROR: Configure options --CFLAGS=" -g -DPETSC_HAVE_KOKKOS_KERNELS_GMRES" --CXXFLAGS=" -g -DPETSC_HAVE_KOKKOS_KERNELS_GMRES" --CUDAFLAGS="-g -Xcompiler -rdynamic" --with-cc=cc --with-cxx=CC --with-fc=ftn --download-hypre=1 --download-amgx --with-fortran-bindings=0 --COPTFLAGS=" -O" --CXXOPTFLAGS=" -O" --FOPTFLAGS=" -O" --with-debugging=0 --with-cuda=1 --with-cuda-arch=80 --with-mpiexec=srun --with-batch=0 --download-p4est=1 --with-zlib=1 --download-kokkos --download-kokkos-kernels --with-kokkos-kernels-tpl=0 --with-make-np=8 PETSC_ARCH=arch-perlmutter-opt-gcc-kokkos-cuda [58]PETSC ERROR: #1 MatProductSymbolic_MPIAIJKokkos_AB() at /global/u2/m/madams/petsc/src/mat/impls/aij/mpi/kokkos/mpiaijkok.kokkos.cxx:918 [58]PETSC ERROR: #2 MatProductSymbolic_MPIAIJKokkos() at /global/u2/m/madams/petsc/src/mat/impls/aij/mpi/kokkos/mpiaijkok.kokkos.cxx:1138 [58]PETSC ERROR: #3 MatProductSymbolic() at /global/u2/m/madams/petsc/src/mat/interface/matproduct.c:793 [58]PETSC ERROR: #4 MatProduct_Private() at /global/u2/m/madams/petsc/src/mat/interface/matrix.c:9820 [58]PETSC ERROR: #5 MatMatMult() at /global/u2/m/madams/petsc/src/mat/interface/matrix.c:9897 [58]PETSC ERROR: #6 PCGAMGOptProlongator_AGG() at /global/u2/m/madams/petsc/src/ksp/pc/impls/gamg/agg.c:769 [58]PETSC ERROR: #7 PCSetUp_GAMG() at /global/u2/m/madams/petsc/src/ksp/pc/impls/gamg/gamg.c:639 [58]PETSC ERROR: #8 PCSetUp() at /global/u2/m/madams/petsc/src/ksp/pc/interface/precon.c:994 [58]PETSC ERROR: #9 KSPSetUp() at /global/u2/m/madams/petsc/src/ksp/ksp/interface/itfunc.c:406 [58]PETSC ERROR: #10 KSPSolve_Private() at /global/u2/m/madams/petsc/src/ksp/ksp/interface/itfunc.c:825 
[58]PETSC ERROR: #11 KSPSolve() at /global/u2/m/madams/petsc/src/ksp/ksp/interface/itfunc.c:1071 [58]PETSC ERROR: #12 SNESSolve_KSPONLY() at /global/u2/m/madams/petsc/src/snes/impls/ksponly/ksponly.c:48 [58]PETSC ERROR: #13 SNESSolve() at /global/u2/m/madams/petsc/src/snes/interface/snes.c:4689 [58]PETSC ERROR: #14 main() at ex13.c:178 [58]PETSC ERROR: PETSc Option Table entries: [58]PETSC ERROR: -benchmark_it 10 [58]PETSC ERROR: -dm_distribute [58]PETSC ERROR: -dm_mat_type aijkokkos [58]PETSC ERROR: -dm_plex_box_faces 4,4,4 [58]PETSC ERROR: -dm_plex_box_lower 0,0,0 [58]PETSC ERROR: -dm_plex_box_upper 2,2,2 [58]PETSC ERROR: -dm_plex_dim 3 [58]PETSC ERROR: -dm_plex_simplex 0 [58]PETSC ERROR: -dm_refine 4 [58]PETSC ERROR: -dm_vec_type kokkos [58]PETSC ERROR: -dm_view [58]PETSC ERROR: -ksp_converged_reason [58]PETSC ERROR: -ksp_max_it 200 [58]PETSC ERROR: -ksp_norm_type unpreconditioned [58]PETSC ERROR: -ksp_rtol 1.e-12 [58]PETSC ERROR: -ksp_type cg [58]PETSC ERROR: -log_view [58]PETSC ERROR: -mat_type aijkokkos [58]PETSC ERROR: -mg_levels_ksp_chebyshev_esteig 0,0.05,0,1.05 [58]PETSC ERROR: -mg_levels_ksp_type chebyshev [58]PETSC ERROR: -mg_levels_pc_type jacobi [58]PETSC ERROR: -pc_gamg_aggressive_coarsening 1 [58]PETSC ERROR: -pc_gamg_coarse_eq_limit 100 [58]PETSC ERROR: -pc_gamg_coarse_grid_layout_type compact [58]PETSC ERROR: -pc_gamg_esteig_ksp_max_it 10 [59]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-maint at mcs.anl.gov---------- MPICH ERROR [Rank 59] [job id 3522949.0] [Sun Oct 30 05:30:12 2022] [nid001129] - Abort(56) (rank 0 in comm 16): application called MPI_Abort(MPI_COMM_SELF, 56) - process 0 Kokkos::Cuda ERROR: Failed to call Kokkos::Cuda::finalize() [60]PETSC ERROR: --------------------- Error Message -------------------------------------------------------------- [60]PETSC ERROR: No support for this operation for this object type [60]PETSC ERROR: No method productsymbolic for Mat of type (null) [60]PETSC 
ERROR: WARNING! There are option(s) set that were not used! Could be the program crashed before they were used or a spelling mistake, etc! [60]PETSC ERROR: Option left: name:-ksp_converged_reason (no value) [60]PETSC ERROR: Option left: name:-mg_levels_ksp_chebyshev_esteig value: 0,0.05,0,1.05 [60]PETSC ERROR: Option left: name:-mg_levels_ksp_type value: chebyshev [60]PETSC ERROR: Option left: name:-mg_levels_pc_type value: jacobi [60]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting. [60]PETSC ERROR: Petsc Development GIT revision: v3.18.1-122-g50ed725bd85 GIT Date: 2022-10-28 19:54:01 +0000 [60]PETSC ERROR: /global/u2/m/madams/petsc/src/snes/tests/data/../ex13 on a arch-perlmutter-opt-gcc-kokkos-cuda named nid001129 by madams Sun Oct 30 05:30:05 2022 [60]PETSC ERROR: Configure options --CFLAGS=" -g -DPETSC_HAVE_KOKKOS_KERNELS_GMRES" --CXXFLAGS=" -g -DPETSC_HAVE_KOKKOS_KERNELS_GMRES" --CUDAFLAGS="-g -Xcompiler -rdynamic" --with-cc=cc --with-cxx=CC --with-fc=ftn --download-hypre=1 --download-amgx --with-fortran-bindings=0 --COPTFLAGS=" -O" --CXXOPTFLAGS=" -O" --FOPTFLAGS=" -O" --with-debugging=0 --with-cuda=1 --with-cuda-arch=80 --with-mpiexec=srun --with-batch=0 --download-p4est=1 --with-zlib=1 --download-kokkos --download-kokkos-kernels --with-kokkos-kernels-tpl=0 --with-make-np=8 PETSC_ARCH=arch-perlmutter-opt-gcc-kokkos-cuda [60]PETSC ERROR: #1 MatProductSymbolic_MPIAIJKokkos_AB() at /global/u2/m/madams/petsc/src/mat/impls/aij/mpi/kokkos/mpiaijkok.kokkos.cxx:918 [60]PETSC ERROR: #2 MatProductSymbolic_MPIAIJKokkos() at /global/u2/m/madams/petsc/src/mat/impls/aij/mpi/kokkos/mpiaijkok.kokkos.cxx:1138 [60]PETSC ERROR: #3 MatProductSymbolic() at /global/u2/m/madams/petsc/src/mat/interface/matproduct.c:793 [60]PETSC ERROR: #4 MatProduct_Private() at /global/u2/m/madams/petsc/src/mat/interface/matrix.c:9820 [60]PETSC ERROR: #5 MatMatMult() at /global/u2/m/madams/petsc/src/mat/interface/matrix.c:9897 [60]PETSC ERROR: #6 PCGAMGOptProlongator_AGG() at 
/global/u2/m/madams/petsc/src/ksp/pc/impls/gamg/agg.c:769 [60]PETSC ERROR: #7 PCSetUp_GAMG() at /global/u2/m/madams/petsc/src/ksp/pc/impls/gamg/gamg.c:639 [60]PETSC ERROR: #8 PCSetUp() at /global/u2/m/madams/petsc/src/ksp/pc/interface/precon.c:994 [60]PETSC ERROR: #9 KSPSetUp() at /global/u2/m/madams/petsc/src/ksp/ksp/interface/itfunc.c:406 [60]PETSC ERROR: #10 KSPSolve_Private() at /global/u2/m/madams/petsc/src/ksp/ksp/interface/itfunc.c:825 [60]PETSC ERROR: #11 KSPSolve() at /global/u2/m/madams/petsc/src/ksp/ksp/interface/itfunc.c:1071 [60]PETSC ERROR: #12 SNESSolve_KSPONLY() at /global/u2/m/madams/petsc/src/snes/impls/ksponly/ksponly.c:48 [60]PETSC ERROR: #13 SNESSolve() at /global/u2/m/madams/petsc/src/snes/interface/snes.c:4689 [60]PETSC ERROR: #14 main() at ex13.c:178 [60]PETSC ERROR: PETSc Option Table entries: [60]PETSC ERROR: -benchmark_it 10 [60]PETSC ERROR: -dm_distribute [60]PETSC ERROR: -dm_mat_type aijkokkos [60]PETSC ERROR: -dm_plex_box_faces 4,4,4 [60]PETSC ERROR: -dm_plex_box_lower 0,0,0 [60]PETSC ERROR: -dm_plex_box_upper 2,2,2 [60]PETSC ERROR: -dm_plex_dim 3 [60]PETSC ERROR: -dm_plex_simplex 0 [60]PETSC ERROR: -dm_refine 4 [60]PETSC ERROR: -dm_vec_type kokkos [60]PETSC ERROR: -dm_view [60]PETSC ERROR: -ksp_converged_reason [60]PETSC ERROR: -ksp_max_it 200 [60]PETSC ERROR: -ksp_norm_type unpreconditioned [60]PETSC ERROR: -ksp_rtol 1.e-12 [60]PETSC ERROR: -ksp_type cg [60]PETSC ERROR: -log_view [60]PETSC ERROR: -mat_type aijkokkos [60]PETSC ERROR: -mg_levels_ksp_chebyshev_esteig 0,0.05,0,1.05 [60]PETSC ERROR: -mg_levels_ksp_type chebyshev [60]PETSC ERROR: -mg_levels_pc_type jacobi [60]PETSC ERROR: -pc_gamg_aggressive_coarsening 1 [60]PETSC ERROR: -pc_gamg_coarse_eq_limit 100 [60]PETSC ERROR: -pc_gamg_coarse_grid_layout_type compact [60]PETSC ERROR: -pc_gamg_esteig_ksp_max_it 10 MPICH ERROR [Rank 61] [job id 3522949.0] [Sun Oct 30 05:30:12 2022] [nid001129] - Abort(56) (rank 0 in comm 16): application called MPI_Abort(MPI_COMM_SELF, 56) - process 0 
Kokkos::Cuda ERROR: Failed to call Kokkos::Cuda::finalize()
[31]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[31]PETSC ERROR: No support for this operation for this object type
[31]PETSC ERROR: No method productsymbolic for Mat of type (null)
[31]PETSC ERROR: WARNING! There are option(s) set that were not used! Could be the program crashed before they were used or a spelling mistake, etc!
[31]PETSC ERROR: Option left: name:-ksp_converged_reason (no value)
[31]PETSC ERROR: Option left: name:-mg_levels_ksp_chebyshev_esteig value: 0,0.05,0,1.05
[31]PETSC ERROR: Option left: name:-mg_levels_ksp_type value: chebyshev
[31]PETSC ERROR: Option left: name:-mg_levels_pc_type value: jacobi
[31]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
[31]PETSC ERROR: Petsc Development GIT revision: v3.18.1-122-g50ed725bd85  GIT Date: 2022-10-28 19:54:01 +0000
[31]PETSC ERROR: /global/u2/m/madams/petsc/src/snes/tests/data/../ex13 on a arch-perlmutter-opt-gcc-kokkos-cuda named nid001121 by madams Sun Oct 30 05:30:05 2022
[31]PETSC ERROR: Configure options --CFLAGS=" -g -DPETSC_HAVE_KOKKOS_KERNELS_GMRES" --CXXFLAGS=" -g -DPETSC_HAVE_KOKKOS_KERNELS_GMRES" --CUDAFLAGS="-g -Xcompiler -rdynamic" --with-cc=cc --with-cxx=CC --with-fc=ftn --download-hypre=1 --download-amgx --with-fortran-bindings=0 --COPTFLAGS=" -O" --CXXOPTFLAGS=" -O" --FOPTFLAGS=" -O" --with-debugging=0 --with-cuda=1 --with-cuda-arch=80 --with-mpiexec=srun --with-batch=0 --download-p4est=1 --with-zlib=1 --download-kokkos --download-kokkos-kernels --with-kokkos-kernels-tpl=0 --with-make-np=8 PETSC_ARCH=arch-perlmutter-opt-gcc-kokkos-cuda
[31]PETSC ERROR: #1 MatProductSymbolic_MPIAIJKokkos_AB() at /global/u2/m/madams/petsc/src/mat/impls/aij/mpi/kokkos/mpiaijkok.kokkos.cxx:918
[31]PETSC ERROR: #2 MatProductSymbolic_MPIAIJKokkos() at /global/u2/m/madams/petsc/src/mat/impls/aij/mpi/kokkos/mpiaijkok.kokkos.cxx:1138
[31]PETSC ERROR: #3 MatProductSymbolic() at /global/u2/m/madams/petsc/src/mat/interface/matproduct.c:793
[31]PETSC ERROR: #4 MatProduct_Private() at /global/u2/m/madams/petsc/src/mat/interface/matrix.c:9820
[31]PETSC ERROR: #5 MatMatMult() at /global/u2/m/madams/petsc/src/mat/interface/matrix.c:9897
[31]PETSC ERROR: #6 PCGAMGOptProlongator_AGG() at /global/u2/m/madams/petsc/src/ksp/pc/impls/gamg/agg.c:769
[31]PETSC ERROR: #7 PCSetUp_GAMG() at /global/u2/m/madams/petsc/src/ksp/pc/impls/gamg/gamg.c:639
[31]PETSC ERROR: #8 PCSetUp() at /global/u2/m/madams/petsc/src/ksp/pc/interface/precon.c:994
[31]PETSC ERROR: #9 KSPSetUp() at /global/u2/m/madams/petsc/src/ksp/ksp/interface/itfunc.c:406
[31]PETSC ERROR: #10 KSPSolve_Private() at /global/u2/m/madams/petsc/src/ksp/ksp/interface/itfunc.c:825
[31]PETSC ERROR: #11 KSPSolve() at /global/u2/m/madams/petsc/src/ksp/ksp/interface/itfunc.c:1071
[31]PETSC ERROR: #12 SNESSolve_KSPONLY() at /global/u2/m/madams/petsc/src/snes/impls/ksponly/ksponly.c:48
[31]PETSC ERROR: #13 SNESSolve() at /global/u2/m/madams/petsc/src/snes/interface/snes.c:4689
[31]PETSC ERROR: #14 main() at ex13.c:178
[31]PETSC ERROR: PETSc Option Table entries:
[31]PETSC ERROR: -benchmark_it 10
[31]PETSC ERROR: -dm_distribute
[31]PETSC ERROR: -dm_mat_type aijkokkos
[31]PETSC ERROR: -dm_plex_box_faces 4,4,4
[31]PETSC ERROR: -dm_plex_box_lower 0,0,0
[31]PETSC ERROR: -dm_plex_box_upper 2,2,2
[31]PETSC ERROR: -dm_plex_dim 3
[31]PETSC ERROR: -dm_plex_simplex 0
[31]PETSC ERROR: -dm_refine 4
[31]PETSC ERROR: -dm_vec_type kokkos
[31]PETSC ERROR: -dm_view
[31]PETSC ERROR: -ksp_converged_reason
[31]PETSC ERROR: -ksp_max_it 200
[31]PETSC ERROR: -ksp_norm_type unpreconditioned
[31]PETSC ERROR: -ksp_rtol 1.e-12
[31]PETSC ERROR: -ksp_type cg
[31]PETSC ERROR: -log_view
[31]PETSC ERROR: -mat_type aijkokkos
[31]PETSC ERROR: -mg_levels_ksp_chebyshev_esteig 0,0.05,0,1.05
[31]PETSC ERROR: -mg_levels_ksp_type chebyshev
[31]PETSC ERROR: -mg_levels_pc_type jacobi
[31]PETSC ERROR: -pc_gamg_aggressive_coarsening 1
[31]PETSC ERROR: -pc_gamg_coarse_eq_limit 100
[31]PETSC ERROR: -pc_gamg_coarse_grid_layout_type compact
[31]PETSC ERROR: -pc_gamg_esteig_ksp_max_it 10
[31]PETSC ERROR: -pc_gamg_esteig_ksp_type cg
[31]PETSC ERROR: -pc_gamg_process_eq_limit 400
[31]PETSC ERROR: -pc_gamg_repartition false
[31]PETSC ERROR: -pc_gamg_reuse_interpolation true
[31]PETSC ERROR: -pc_gamg_threshold 0.01
[31]PETSC ERROR: -pc_type gamg
[31]PETSC ERROR: -petscpartitioner_simple_node_grid 2,2,2
[31]PETSC ERROR: -petscpartitioner_simple_process_grid 2,2,2
[31]PETSC ERROR: -petscpartitioner_type simple
[31]PETSC ERROR: -potential_petscspace_degree 2
[31]PETSC ERROR: -snes_lag_jacobian -2
[31]PETSC ERROR: -snes_max_it 1
[31]PETSC ERROR: -snes_rtol 1.e-8
[31]PETSC ERROR: -snes_type ksponly
[31]PETSC ERROR: -use_gpu_aware_mpi 0
[31]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-maint at mcs.anl.gov----------
MPICH ERROR [Rank 31] [job id 3522949.0] [Sun Oct 30 05:30:12 2022] [nid001121] - Abort(56) (rank 0 in comm 16): application called MPI_Abort(MPI_COMM_SELF, 56) - process 0
Kokkos::Cuda ERROR: Failed to call Kokkos::Cuda::finalize()
[...snip: ranks 17, 18, 19, 20, 21, 28, 30, 35, 36, 40, 41, 42, 45, 46, 47, 56, 58, and 60 (nodes nid001120, nid001121, nid001124, nid001125, nid001129) emit the identical error message, stack trace, option table, and Abort(56)...]
ERROR: -use_gpu_aware_mpi 0 [36]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-maint at mcs.anl.gov---------- MPICH ERROR [Rank 36] [job id 3522949.0] [Sun Oct 30 05:30:12 2022] [nid001124] - Abort(56) (rank 0 in comm 16): application called MPI_Abort(MPI_COMM_SELF, 56) - process 0 Kokkos::Cuda ERROR: Failed to call Kokkos::Cuda::finalize() [21]PETSC ERROR: #2 MatProductSymbolic_MPIAIJKokkos() at /global/u2/m/madams/petsc/src/mat/impls/aij/mpi/kokkos/mpiaijkok.kokkos.cxx:1138 [21]PETSC ERROR: #3 MatProductSymbolic() at /global/u2/m/madams/petsc/src/mat/interface/matproduct.c:793 [21]PETSC ERROR: #4 MatProduct_Private() at /global/u2/m/madams/petsc/src/mat/interface/matrix.c:9820 [21]PETSC ERROR: #5 MatMatMult() at /global/u2/m/madams/petsc/src/mat/interface/matrix.c:9897 [21]PETSC ERROR: #6 PCGAMGOptProlongator_AGG() at /global/u2/m/madams/petsc/src/ksp/pc/impls/gamg/agg.c:769 [21]PETSC ERROR: #7 PCSetUp_GAMG() at /global/u2/m/madams/petsc/src/ksp/pc/impls/gamg/gamg.c:639 [21]PETSC ERROR: #8 PCSetUp() at /global/u2/m/madams/petsc/src/ksp/pc/interface/precon.c:994 [21]PETSC ERROR: #9 KSPSetUp() at /global/u2/m/madams/petsc/src/ksp/ksp/interface/itfunc.c:406 [21]PETSC ERROR: #10 KSPSolve_Private() at /global/u2/m/madams/petsc/src/ksp/ksp/interface/itfunc.c:825 [37]PETSC ERROR: --------------------- Error Message -------------------------------------------------------------- [37]PETSC ERROR: No support for this operation for this object type [37]PETSC ERROR: No method productsymbolic for Mat of type (null) [37]PETSC ERROR: WARNING! There are option(s) set that were not used! Could be the program crashed before they were used or a spelling mistake, etc! 
[37]PETSC ERROR: Option left: name:-ksp_converged_reason (no value) [37]PETSC ERROR: Option left: name:-mg_levels_ksp_chebyshev_esteig value: 0,0.05,0,1.05 [37]PETSC ERROR: Option left: name:-mg_levels_ksp_type value: chebyshev [37]PETSC ERROR: Option left: name:-mg_levels_pc_type value: jacobi [37]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting. [37]PETSC ERROR: Petsc Development GIT revision: v3.18.1-122-g50ed725bd85 GIT Date: 2022-10-28 19:54:01 +0000 [21]PETSC ERROR: #11 KSPSolve() at /global/u2/m/madams/petsc/src/ksp/ksp/interface/itfunc.c:1071 [21]PETSC ERROR: #12 SNESSolve_KSPONLY() at /global/u2/m/madams/petsc/src/snes/impls/ksponly/ksponly.c:48 [21]PETSC ERROR: #13 SNESSolve() at /global/u2/m/madams/petsc/src/snes/interface/snes.c:4689 [21]PETSC ERROR: #14 main() at ex13.c:178 [21]PETSC ERROR: PETSc Option Table entries: [21]PETSC ERROR: -benchmark_it 10 [21]PETSC ERROR: -dm_distribute [21]PETSC ERROR: -dm_mat_type aijkokkos [21]PETSC ERROR: -dm_plex_box_faces 4,4,4 [21]PETSC ERROR: -dm_plex_box_lower 0,0,0 [21]PETSC ERROR: -dm_plex_box_upper 2,2,2 [21]PETSC ERROR: -dm_plex_dim 3 [21]PETSC ERROR: -dm_plex_simplex 0 [21]PETSC ERROR: -dm_refine 4 [21]PETSC ERROR: -dm_vec_type kokkos [21]PETSC ERROR: -dm_view [21]PETSC ERROR: -ksp_converged_reason [21]PETSC ERROR: -ksp_max_it 200 [21]PETSC ERROR: -ksp_norm_type unpreconditioned [21]PETSC ERROR: -ksp_rtol 1.e-12 [21]PETSC ERROR: -ksp_type cg [21]PETSC ERROR: -log_view [37]PETSC ERROR: /global/u2/m/madams/petsc/src/snes/tests/data/../ex13 on a arch-perlmutter-opt-gcc-kokkos-cuda named nid001124 by madams Sun Oct 30 05:30:05 2022 [37]PETSC ERROR: Configure options --CFLAGS=" -g -DPETSC_HAVE_KOKKOS_KERNELS_GMRES" --CXXFLAGS=" -g -DPETSC_HAVE_KOKKOS_KERNELS_GMRES" --CUDAFLAGS="-g -Xcompiler -rdynamic" --with-cc=cc --with-cxx=CC --with-fc=ftn --download-hypre=1 --download-amgx --with-fortran-bindings=0 --COPTFLAGS=" -O" --CXXOPTFLAGS=" -O" --FOPTFLAGS=" -O" --with-debugging=0 --with-cuda=1 
--with-cuda-arch=80 --with-mpiexec=srun --with-batch=0 --download-p4est=1 --with-zlib=1 --download-kokkos --download-kokkos-kernels --with-kokkos-kernels-tpl=0 --with-make-np=8 PETSC_ARCH=arch-perlmutter-opt-gcc-kokkos-cuda [37]PETSC ERROR: #1 MatProductSymbolic_MPIAIJKokkos_AB() at /global/u2/m/madams/petsc/src/mat/impls/aij/mpi/kokkos/mpiaijkok.kokkos.cxx:918 [21]PETSC ERROR: -mat_type aijkokkos [21]PETSC ERROR: -mg_levels_ksp_chebyshev_esteig 0,0.05,0,1.05 [21]PETSC ERROR: -mg_levels_ksp_type chebyshev [21]PETSC ERROR: -mg_levels_pc_type jacobi [21]PETSC ERROR: -pc_gamg_aggressive_coarsening 1 [21]PETSC ERROR: -pc_gamg_coarse_eq_limit 100 [21]PETSC ERROR: -pc_gamg_coarse_grid_layout_type compact [21]PETSC ERROR: -pc_gamg_esteig_ksp_max_it 10 [37]PETSC ERROR: #2 MatProductSymbolic_MPIAIJKokkos() at /global/u2/m/madams/petsc/src/mat/impls/aij/mpi/kokkos/mpiaijkok.kokkos.cxx:1138 [37]PETSC ERROR: #3 MatProductSymbolic() at /global/u2/m/madams/petsc/src/mat/interface/matproduct.c:793 [37]PETSC ERROR: #4 MatProduct_Private() at /global/u2/m/madams/petsc/src/mat/interface/matrix.c:9820 [37]PETSC ERROR: #5 MatMatMult() at /global/u2/m/madams/petsc/src/mat/interface/matrix.c:9897 [37]PETSC ERROR: #6 PCGAMGOptProlongator_AGG() at /global/u2/m/madams/petsc/src/ksp/pc/impls/gamg/agg.c:769 [37]PETSC ERROR: #7 PCSetUp_GAMG() at /global/u2/m/madams/petsc/src/ksp/pc/impls/gamg/gamg.c:639 [37]PETSC ERROR: #8 PCSetUp() at /global/u2/m/madams/petsc/src/ksp/pc/interface/precon.c:994 [37]PETSC ERROR: #9 KSPSetUp() at /global/u2/m/madams/petsc/src/ksp/ksp/interface/itfunc.c:406 [37]PETSC ERROR: #10 KSPSolve_Private() at /global/u2/m/madams/petsc/src/ksp/ksp/interface/itfunc.c:825 Kokkos::Cuda ERROR: Failed to call Kokkos::Cuda::finalize() [37]PETSC ERROR: #11 KSPSolve() at /global/u2/m/madams/petsc/src/ksp/ksp/interface/itfunc.c:1071 [37]PETSC ERROR: #12 SNESSolve_KSPONLY() at /global/u2/m/madams/petsc/src/snes/impls/ksponly/ksponly.c:48 [37]PETSC ERROR: #13 SNESSolve() at 
/global/u2/m/madams/petsc/src/snes/interface/snes.c:4689 [37]PETSC ERROR: #14 main() at ex13.c:178 [37]PETSC ERROR: PETSc Option Table entries: [37]PETSC ERROR: -benchmark_it 10 [37]PETSC ERROR: -dm_distribute [37]PETSC ERROR: -dm_mat_type aijkokkos [37]PETSC ERROR: -dm_plex_box_faces 4,4,4 [37]PETSC ERROR: -dm_plex_box_lower 0,0,0 [37]PETSC ERROR: -dm_plex_box_upper 2,2,2 [37]PETSC ERROR: -dm_plex_dim 3 [37]PETSC ERROR: -dm_plex_simplex 0 [37]PETSC ERROR: -dm_refine 4 [37]PETSC ERROR: -dm_vec_type kokkos [37]PETSC ERROR: -dm_view [37]PETSC ERROR: -ksp_converged_reason [37]PETSC ERROR: -ksp_max_it 200 [37]PETSC ERROR: -ksp_norm_type unpreconditioned [37]PETSC ERROR: -ksp_rtol 1.e-12 [37]PETSC ERROR: -ksp_type cg [37]PETSC ERROR: -log_view [37]PETSC ERROR: -mat_type aijkokkos [37]PETSC ERROR: -mg_levels_ksp_chebyshev_esteig 0,0.05,0,1.05 [37]PETSC ERROR: -mg_levels_ksp_type chebyshev [37]PETSC ERROR: -mg_levels_pc_type jacobi [37]PETSC ERROR: -pc_gamg_aggressive_coarsening 1 [37]PETSC ERROR: -pc_gamg_coarse_eq_limit 100 [37]PETSC ERROR: -pc_gamg_coarse_grid_layout_type compact [37]PETSC ERROR: -pc_gamg_esteig_ksp_max_it 10 [32]PETSC ERROR: #7 PCSetUp_GAMG() at /global/u2/m/madams/petsc/src/ksp/pc/impls/gamg/gamg.c:639 [32]PETSC ERROR: #8 PCSetUp() at /global/u2/m/madams/petsc/src/ksp/pc/interface/precon.c:994 [32]PETSC ERROR: #9 KSPSetUp() at /global/u2/m/madams/petsc/src/ksp/ksp/interface/itfunc.c:406 [32]PETSC ERROR: #10 KSPSolve_Private() at /global/u2/m/madams/petsc/src/ksp/ksp/interface/itfunc.c:825 [32]PETSC ERROR: #11 KSPSolve() at /global/u2/m/madams/petsc/src/ksp/ksp/interface/itfunc.c:1071 [32]PETSC ERROR: #12 SNESSolve_KSPONLY() at /global/u2/m/madams/petsc/src/snes/impls/ksponly/ksponly.c:48 [32]PETSC ERROR: #13 SNESSolve() at /global/u2/m/madams/petsc/src/snes/interface/snes.c:4689 [32]PETSC ERROR: #14 main() at ex13.c:178 [32]PETSC ERROR: PETSc Option Table entries: [32]PETSC ERROR: -benchmark_it 10 [32]PETSC ERROR: -dm_distribute [32]PETSC ERROR: 
-dm_mat_type aijkokkos [32]PETSC ERROR: -dm_plex_box_faces 4,4,4 [32]PETSC ERROR: -dm_plex_box_lower 0,0,0 [32]PETSC ERROR: -dm_plex_box_upper 2,2,2 [32]PETSC ERROR: -dm_plex_dim 3 [32]PETSC ERROR: -dm_plex_simplex 0 [32]PETSC ERROR: -dm_refine 4 [32]PETSC ERROR: -dm_vec_type kokkos [32]PETSC ERROR: -dm_view [32]PETSC ERROR: -ksp_converged_reason [32]PETSC ERROR: -ksp_max_it 200 [32]PETSC ERROR: -ksp_norm_type unpreconditioned [32]PETSC ERROR: -ksp_rtol 1.e-12 [32]PETSC ERROR: -ksp_type cg [32]PETSC ERROR: -log_view [32]PETSC ERROR: -mat_type aijkokkos [32]PETSC ERROR: -mg_levels_ksp_chebyshev_esteig 0,0.05,0,1.05 [32]PETSC ERROR: -mg_levels_ksp_type chebyshev [32]PETSC ERROR: -mg_levels_pc_type jacobi [32]PETSC ERROR: -pc_gamg_aggressive_coarsening 1 [32]PETSC ERROR: -pc_gamg_coarse_eq_limit 100 [32]PETSC ERROR: -pc_gamg_coarse_grid_layout_type compact [32]PETSC ERROR: -pc_gamg_esteig_ksp_max_it 10 [32]PETSC ERROR: -pc_gamg_esteig_ksp_type cg [32]PETSC ERROR: -pc_gamg_process_eq_limit 400 [32]PETSC ERROR: -pc_gamg_repartition false [32]PETSC ERROR: -pc_gamg_reuse_interpolation true [32]PETSC ERROR: -pc_gamg_threshold 0.01 [32]PETSC ERROR: -pc_type gamg [32]PETSC ERROR: -petscpartitioner_simple_node_grid 2,2,2 [32]PETSC ERROR: -petscpartitioner_simple_process_grid 2,2,2 [32]PETSC ERROR: -petscpartitioner_type simple [32]PETSC ERROR: -potential_petscspace_degree 2 [32]PETSC ERROR: -snes_lag_jacobian -2 [32]PETSC ERROR: -snes_max_it 1 [32]PETSC ERROR: -snes_rtol 1.e-8 [32]PETSC ERROR: -snes_type ksponly [32]PETSC ERROR: -use_gpu_aware_mpi 0 [32]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-maint at mcs.anl.gov---------- MPICH ERROR [Rank 32] [job id 3522949.0] [Sun Oct 30 05:30:12 2022] [nid001124] - Abort(56) (rank 0 in comm 16): application called MPI_Abort(MPI_COMM_SELF, 56) - process 0 Kokkos::Cuda ERROR: Failed to call Kokkos::Cuda::finalize() [34]PETSC ERROR: -ksp_norm_type unpreconditioned [34]PETSC ERROR: 
-ksp_rtol 1.e-12 [34]PETSC ERROR: -ksp_type cg [34]PETSC ERROR: -log_view [34]PETSC ERROR: -mat_type aijkokkos [34]PETSC ERROR: -mg_levels_ksp_chebyshev_esteig 0,0.05,0,1.05 [34]PETSC ERROR: -mg_levels_ksp_type chebyshev [34]PETSC ERROR: -mg_levels_pc_type jacobi [34]PETSC ERROR: -pc_gamg_aggressive_coarsening 1 [34]PETSC ERROR: -pc_gamg_coarse_eq_limit 100 [34]PETSC ERROR: -pc_gamg_coarse_grid_layout_type compact [34]PETSC ERROR: -pc_gamg_esteig_ksp_max_it 10 [34]PETSC ERROR: -pc_gamg_esteig_ksp_type cg [34]PETSC ERROR: -pc_gamg_process_eq_limit 400 [34]PETSC ERROR: -pc_gamg_repartition false [34]PETSC ERROR: -pc_gamg_reuse_interpolation true [34]PETSC ERROR: -pc_gamg_threshold 0.01 [34]PETSC ERROR: -pc_type gamg [34]PETSC ERROR: -petscpartitioner_simple_node_grid 2,2,2 [34]PETSC ERROR: -petscpartitioner_simple_process_grid 2,2,2 [34]PETSC ERROR: -petscpartitioner_type simple [34]PETSC ERROR: -potential_petscspace_degree 2 [34]PETSC ERROR: -snes_lag_jacobian -2 [34]PETSC ERROR: -snes_max_it 1 [34]PETSC ERROR: -snes_rtol 1.e-8 [34]PETSC ERROR: -snes_type ksponly [34]PETSC ERROR: -use_gpu_aware_mpi 0 [34]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-maint at mcs.anl.gov---------- MPICH ERROR [Rank 34] [job id 3522949.0] [Sun Oct 30 05:30:12 2022] [nid001124] - Abort(56) (rank 0 in comm 16): application called MPI_Abort(MPI_COMM_SELF, 56) - process 0 MPICH ERROR [Rank 23] [job id 3522949.0] [Sun Oct 30 05:30:12 2022] [nid001120] - Abort(56) (rank 0 in comm 16): application called MPI_Abort(MPI_COMM_SELF, 56) - process 0 Kokkos::Cuda ERROR: Failed to call Kokkos::Cuda::finalize() Kokkos::Cuda ERROR: Failed to call Kokkos::Cuda::finalize() [16]PETSC ERROR: #2 MatProductSymbolic_MPIAIJKokkos() at /global/u2/m/madams/petsc/src/mat/impls/aij/mpi/kokkos/mpiaijkok.kokkos.cxx:1138 [16]PETSC ERROR: #3 MatProductSymbolic() at /global/u2/m/madams/petsc/src/mat/interface/matproduct.c:793 [16]PETSC ERROR: #4 MatProduct_Private() at 
/global/u2/m/madams/petsc/src/mat/interface/matrix.c:9820 [16]PETSC ERROR: #5 MatMatMult() at /global/u2/m/madams/petsc/src/mat/interface/matrix.c:9897 [16]PETSC ERROR: #6 PCGAMGOptProlongator_AGG() at /global/u2/m/madams/petsc/src/ksp/pc/impls/gamg/agg.c:769 [16]PETSC ERROR: #7 PCSetUp_GAMG() at /global/u2/m/madams/petsc/src/ksp/pc/impls/gamg/gamg.c:639 [16]PETSC ERROR: #8 PCSetUp() at /global/u2/m/madams/petsc/src/ksp/pc/interface/precon.c:994 [16]PETSC ERROR: #9 KSPSetUp() at /global/u2/m/madams/petsc/src/ksp/ksp/interface/itfunc.c:406 [16]PETSC ERROR: #10 KSPSolve_Private() at /global/u2/m/madams/petsc/src/ksp/ksp/interface/itfunc.c:825 [16]PETSC ERROR: #11 KSPSolve() at /global/u2/m/madams/petsc/src/ksp/ksp/interface/itfunc.c:1071 [16]PETSC ERROR: #12 SNESSolve_KSPONLY() at /global/u2/m/madams/petsc/src/snes/impls/ksponly/ksponly.c:48 [16]PETSC ERROR: #13 SNESSolve() at /global/u2/m/madams/petsc/src/snes/interface/snes.c:4689 [16]PETSC ERROR: #14 main() at ex13.c:178 [16]PETSC ERROR: PETSc Option Table entries: [16]PETSC ERROR: -benchmark_it 10 [16]PETSC ERROR: -dm_distribute [16]PETSC ERROR: -dm_mat_type aijkokkos [16]PETSC ERROR: -dm_plex_box_faces 4,4,4 [16]PETSC ERROR: -dm_plex_box_lower 0,0,0 [16]PETSC ERROR: -dm_plex_box_upper 2,2,2 [16]PETSC ERROR: -dm_plex_dim 3 [16]PETSC ERROR: -dm_plex_simplex 0 [16]PETSC ERROR: -dm_refine 4 [16]PETSC ERROR: -dm_vec_type kokkos [16]PETSC ERROR: -dm_view [16]PETSC ERROR: -ksp_converged_reason [16]PETSC ERROR: -ksp_max_it 200 [16]PETSC ERROR: -ksp_norm_type unpreconditioned [16]PETSC ERROR: -ksp_rtol 1.e-12 [16]PETSC ERROR: -ksp_type cg [16]PETSC ERROR: -log_view [16]PETSC ERROR: -mat_type aijkokkos [16]PETSC ERROR: -mg_levels_ksp_chebyshev_esteig 0,0.05,0,1.05 [16]PETSC ERROR: -mg_levels_ksp_type chebyshev [16]PETSC ERROR: -mg_levels_pc_type jacobi [16]PETSC ERROR: -pc_gamg_aggressive_coarsening 1 [16]PETSC ERROR: -pc_gamg_coarse_eq_limit 100 [16]PETSC ERROR: -pc_gamg_coarse_grid_layout_type compact [16]PETSC ERROR: 
-pc_gamg_esteig_ksp_max_it 10 [16]PETSC ERROR: -pc_gamg_esteig_ksp_type cg [16]PETSC ERROR: -pc_gamg_process_eq_limit 400 [16]PETSC ERROR: -pc_gamg_repartition false [16]PETSC ERROR: -pc_gamg_reuse_interpolation true [16]PETSC ERROR: -pc_gamg_threshold 0.01 [16]PETSC ERROR: -pc_type gamg [16]PETSC ERROR: -petscpartitioner_simple_node_grid 2,2,2 [16]PETSC ERROR: -petscpartitioner_simple_process_grid 2,2,2 [16]PETSC ERROR: -petscpartitioner_type simple [16]PETSC ERROR: -potential_petscspace_degree 2 [16]PETSC ERROR: -snes_lag_jacobian -2 [16]PETSC ERROR: -snes_max_it 1 [16]PETSC ERROR: -snes_rtol 1.e-8 [16]PETSC ERROR: -snes_type ksponly [31]PETSC ERROR: -mat_type aijkokkos [31]PETSC ERROR: -mg_levels_ksp_chebyshev_esteig 0,0.05,0,1.05 [31]PETSC ERROR: -mg_levels_ksp_type chebyshev [31]PETSC ERROR: -mg_levels_pc_type jacobi [31]PETSC ERROR: -pc_gamg_aggressive_coarsening 1 [31]PETSC ERROR: -pc_gamg_coarse_eq_limit 100 [31]PETSC ERROR: -pc_gamg_coarse_grid_layout_type compact [31]PETSC ERROR: -pc_gamg_esteig_ksp_max_it 10 [42]PETSC ERROR: -petscpartitioner_type simple [42]PETSC ERROR: -potential_petscspace_degree 2 [42]PETSC ERROR: -snes_lag_jacobian -2 [42]PETSC ERROR: -snes_max_it 1 [42]PETSC ERROR: -snes_rtol 1.e-8 [42]PETSC ERROR: -snes_type ksponly [42]PETSC ERROR: -use_gpu_aware_mpi 0 [42]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-maint at mcs.anl.gov---------- MPICH ERROR [Rank 42] [job id 3522949.0] [Sun Oct 30 05:30:12 2022] [nid001125] - Abort(56) (rank 0 in comm 16): application called MPI_Abort(MPI_COMM_SELF, 56) - process 0 Kokkos::Cuda ERROR: Failed to call Kokkos::Cuda::finalize() [16]PETSC ERROR: -use_gpu_aware_mpi 0 [16]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-maint at mcs.anl.gov---------- MPICH ERROR [Rank 16] [job id 3522949.0] [Sun Oct 30 05:30:12 2022] [nid001120] - Abort(56) (rank 0 in comm 16): application called MPI_Abort(MPI_COMM_SELF, 56) - 
process 0 [24]PETSC ERROR: #6 PCGAMGOptProlongator_AGG() at /global/u2/m/madams/petsc/src/ksp/pc/impls/gamg/agg.c:769 [24]PETSC ERROR: #7 PCSetUp_GAMG() at /global/u2/m/madams/petsc/src/ksp/pc/impls/gamg/gamg.c:639 [24]PETSC ERROR: #8 PCSetUp() at /global/u2/m/madams/petsc/src/ksp/pc/interface/precon.c:994 [24]PETSC ERROR: #9 KSPSetUp() at /global/u2/m/madams/petsc/src/ksp/ksp/interface/itfunc.c:406 [24]PETSC ERROR: #10 KSPSolve_Private() at /global/u2/m/madams/petsc/src/ksp/ksp/interface/itfunc.c:825 [24]PETSC ERROR: #11 KSPSolve() at /global/u2/m/madams/petsc/src/ksp/ksp/interface/itfunc.c:1071 [24]PETSC ERROR: #12 SNESSolve_KSPONLY() at /global/u2/m/madams/petsc/src/snes/impls/ksponly/ksponly.c:48 [24]PETSC ERROR: #13 SNESSolve() at /global/u2/m/madams/petsc/src/snes/interface/snes.c:4689 [24]PETSC ERROR: #14 main() at ex13.c:178 [24]PETSC ERROR: PETSc Option Table entries: [24]PETSC ERROR: -benchmark_it 10 [24]PETSC ERROR: -dm_distribute [24]PETSC ERROR: -dm_mat_type aijkokkos Kokkos::Cuda ERROR: Failed to call Kokkos::Cuda::finalize() Kokkos::Cuda ERROR: Failed to call Kokkos::Cuda::finalize() [24]PETSC ERROR: -dm_plex_box_faces 4,4,4 [24]PETSC ERROR: -dm_plex_box_lower 0,0,0 [24]PETSC ERROR: -dm_plex_box_upper 2,2,2 [24]PETSC ERROR: -dm_plex_dim 3 [24]PETSC ERROR: -dm_plex_simplex 0 [24]PETSC ERROR: -dm_refine 4 [24]PETSC ERROR: -dm_vec_type kokkos [24]PETSC ERROR: -dm_view [24]PETSC ERROR: -ksp_converged_reason [24]PETSC ERROR: -ksp_max_it 200 [24]PETSC ERROR: -ksp_norm_type unpreconditioned [24]PETSC ERROR: -ksp_rtol 1.e-12 [24]PETSC ERROR: -ksp_type cg [24]PETSC ERROR: -log_view [24]PETSC ERROR: -mat_type aijkokkos [24]PETSC ERROR: -mg_levels_ksp_chebyshev_esteig 0,0.05,0,1.05 [24]PETSC ERROR: -mg_levels_ksp_type chebyshev [24]PETSC ERROR: -mg_levels_pc_type jacobi [24]PETSC ERROR: -pc_gamg_aggressive_coarsening 1 [24]PETSC ERROR: -pc_gamg_coarse_eq_limit 100 [24]PETSC ERROR: -pc_gamg_coarse_grid_layout_type compact [24]PETSC ERROR: 
-pc_gamg_esteig_ksp_max_it 10 [24]PETSC ERROR: -pc_gamg_esteig_ksp_type cg [24]PETSC ERROR: -pc_gamg_process_eq_limit 400 Kokkos::Cuda ERROR: Failed to call Kokkos::Cuda::finalize() [18]PETSC ERROR: -dm_view [18]PETSC ERROR: -ksp_converged_reason [18]PETSC ERROR: -ksp_max_it 200 [18]PETSC ERROR: -ksp_norm_type unpreconditioned [18]PETSC ERROR: -ksp_rtol 1.e-12 [18]PETSC ERROR: -ksp_type cg [18]PETSC ERROR: -log_view [18]PETSC ERROR: -mat_type aijkokkos [18]PETSC ERROR: -mg_levels_ksp_chebyshev_esteig 0,0.05,0,1.05 [18]PETSC ERROR: -mg_levels_ksp_type chebyshev [18]PETSC ERROR: -mg_levels_pc_type jacobi [18]PETSC ERROR: -pc_gamg_aggressive_coarsening 1 [18]PETSC ERROR: -pc_gamg_coarse_eq_limit 100 [18]PETSC ERROR: -pc_gamg_coarse_grid_layout_type compact [18]PETSC ERROR: -pc_gamg_esteig_ksp_max_it 10 [18]PETSC ERROR: -pc_gamg_esteig_ksp_type cg [18]PETSC ERROR: -pc_gamg_process_eq_limit 400 [18]PETSC ERROR: -pc_gamg_repartition false [18]PETSC ERROR: -pc_gamg_reuse_interpolation true [18]PETSC ERROR: -pc_gamg_threshold 0.01 [18]PETSC ERROR: -pc_type gamg [18]PETSC ERROR: -petscpartitioner_simple_node_grid 2,2,2 [18]PETSC ERROR: -petscpartitioner_simple_process_grid 2,2,2 [24]PETSC ERROR: -pc_gamg_repartition false [24]PETSC ERROR: -pc_gamg_reuse_interpolation true [24]PETSC ERROR: -pc_gamg_threshold 0.01 [24]PETSC ERROR: -pc_type gamg [24]PETSC ERROR: -petscpartitioner_simple_node_grid 2,2,2 [24]PETSC ERROR: -petscpartitioner_simple_process_grid 2,2,2 [24]PETSC ERROR: -petscpartitioner_type simple [24]PETSC ERROR: -potential_petscspace_degree 2 [24]PETSC ERROR: -snes_lag_jacobian -2 [24]PETSC ERROR: -snes_max_it 1 [24]PETSC ERROR: -snes_rtol 1.e-8 [24]PETSC ERROR: -snes_type ksponly [24]PETSC ERROR: -use_gpu_aware_mpi 0 [24]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-maint at mcs.anl.gov---------- MPICH ERROR [Rank 24] [job id 3522949.0] [Sun Oct 30 05:30:12 2022] [nid001121] - Abort(56) (rank 0 in comm 16): 
application called MPI_Abort(MPI_COMM_SELF, 56) - process 0 Kokkos::Cuda ERROR: Failed to call Kokkos::Cuda::finalize() [18]PETSC ERROR: -petscpartitioner_type simple [18]PETSC ERROR: -potential_petscspace_degree 2 [18]PETSC ERROR: -snes_lag_jacobian -2 [18]PETSC ERROR: -snes_max_it 1 [18]PETSC ERROR: -snes_rtol 1.e-8 [18]PETSC ERROR: -snes_type ksponly [18]PETSC ERROR: -use_gpu_aware_mpi 0 [18]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-maint at mcs.anl.gov---------- MPICH ERROR [Rank 18] [job id 3522949.0] [Sun Oct 30 05:30:12 2022] [nid001120] - Abort(56) (rank 0 in comm 16): application called MPI_Abort(MPI_COMM_SELF, 56) - process 0 [25]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-maint at mcs.anl.gov---------- MPICH ERROR [Rank 25] [job id 3522949.0] [Sun Oct 30 05:30:12 2022] [nid001121] - Abort(56) (rank 0 in comm 16): application called MPI_Abort(MPI_COMM_SELF, 56) - process 0 Kokkos::Cuda ERROR: Failed to call Kokkos::Cuda::finalize() Kokkos::Cuda ERROR: Failed to call Kokkos::Cuda::finalize() Kokkos::Cuda ERROR: Failed to call Kokkos::Cuda::finalize() [27]PETSC ERROR: -pc_gamg_esteig_ksp_type cg [27]PETSC ERROR: -pc_gamg_process_eq_limit 400 [27]PETSC ERROR: -pc_gamg_repartition false [27]PETSC ERROR: -pc_gamg_reuse_interpolation true [27]PETSC ERROR: -pc_gamg_threshold 0.01 [27]PETSC ERROR: -pc_type gamg [27]PETSC ERROR: -petscpartitioner_simple_node_grid 2,2,2 [27]PETSC ERROR: -petscpartitioner_simple_process_grid 2,2,2 [27]PETSC ERROR: -petscpartitioner_type simple [27]PETSC ERROR: -potential_petscspace_degree 2 [27]PETSC ERROR: -snes_lag_jacobian -2 [27]PETSC ERROR: -snes_max_it 1 [27]PETSC ERROR: -snes_rtol 1.e-8 [27]PETSC ERROR: -snes_type ksponly [27]PETSC ERROR: -use_gpu_aware_mpi 0 [27]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-maint at mcs.anl.gov---------- MPICH ERROR [Rank 27] [job id 3522949.0] [Sun 
Oct 30 05:30:12 2022] [nid001121] - Abort(56) (rank 0 in comm 16): application called MPI_Abort(MPI_COMM_SELF, 56) - process 0 Kokkos::Cuda ERROR: Failed to call Kokkos::Cuda::finalize() [37]PETSC ERROR: -pc_gamg_esteig_ksp_type cg [37]PETSC ERROR: -pc_gamg_process_eq_limit 400 [37]PETSC ERROR: -pc_gamg_repartition false [37]PETSC ERROR: -pc_gamg_reuse_interpolation true [37]PETSC ERROR: -pc_gamg_threshold 0.01 [37]PETSC ERROR: -pc_type gamg [37]PETSC ERROR: -petscpartitioner_simple_node_grid 2,2,2 [37]PETSC ERROR: -petscpartitioner_simple_process_grid 2,2,2 [37]PETSC ERROR: -petscpartitioner_type simple [37]PETSC ERROR: -potential_petscspace_degree 2 [37]PETSC ERROR: -snes_lag_jacobian -2 [37]PETSC ERROR: -snes_max_it 1 [37]PETSC ERROR: -snes_rtol 1.e-8 [37]PETSC ERROR: -snes_type ksponly [37]PETSC ERROR: -use_gpu_aware_mpi 0 [37]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-maint at mcs.anl.gov---------- MPICH ERROR [Rank 37] [job id 3522949.0] [Sun Oct 30 05:30:12 2022] [nid001124] - Abort(56) (rank 0 in comm 16): application called MPI_Abort(MPI_COMM_SELF, 56) - process 0 [28]PETSC ERROR: -pc_gamg_esteig_ksp_type cg [28]PETSC ERROR: -pc_gamg_process_eq_limit 400 [28]PETSC ERROR: -pc_gamg_repartition false [28]PETSC ERROR: -pc_gamg_reuse_interpolation true [28]PETSC ERROR: -pc_gamg_threshold 0.01 [28]PETSC ERROR: -pc_type gamg [28]PETSC ERROR: -petscpartitioner_simple_node_grid 2,2,2 [28]PETSC ERROR: -petscpartitioner_simple_process_grid 2,2,2 [28]PETSC ERROR: -petscpartitioner_type simple [28]PETSC ERROR: -potential_petscspace_degree 2 [28]PETSC ERROR: -snes_lag_jacobian -2 [28]PETSC ERROR: -snes_max_it 1 [28]PETSC ERROR: -snes_rtol 1.e-8 [28]PETSC ERROR: -snes_type ksponly [28]PETSC ERROR: -use_gpu_aware_mpi 0 [28]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-maint at mcs.anl.gov---------- MPICH ERROR [Rank 28] [job id 3522949.0] [Sun Oct 30 05:30:12 2022] [nid001121] - 
Abort(56) (rank 0 in comm 16): application called MPI_Abort(MPI_COMM_SELF, 56) - process 0 Kokkos::Cuda ERROR: Failed to call Kokkos::Cuda::finalize() Kokkos::Cuda ERROR: Failed to call Kokkos::Cuda::finalize() [29]PETSC ERROR: --------------------- Error Message -------------------------------------------------------------- [29]PETSC ERROR: No support for this operation for this object type [29]PETSC ERROR: No method productsymbolic for Mat of type (null) [29]PETSC ERROR: WARNING! There are option(s) set that were not used! Could be the program crashed before they were used or a spelling mistake, etc! [29]PETSC ERROR: Option left: name:-ksp_converged_reason (no value) [29]PETSC ERROR: Option left: name:-mg_levels_ksp_chebyshev_esteig value: 0,0.05,0,1.05 [29]PETSC ERROR: Option left: name:-mg_levels_ksp_type value: chebyshev [29]PETSC ERROR: Option left: name:-mg_levels_pc_type value: jacobi [29]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting. [29]PETSC ERROR: Petsc Development GIT revision: v3.18.1-122-g50ed725bd85 GIT Date: 2022-10-28 19:54:01 +0000 Kokkos::Cuda ERROR: Failed to call Kokkos::Cuda::finalize() [29]PETSC ERROR: /global/u2/m/madams/petsc/src/snes/tests/data/../ex13 on a arch-perlmutter-opt-gcc-kokkos-cuda named nid001121 by madams Sun Oct 30 05:30:05 2022 [29]PETSC ERROR: Configure options --CFLAGS=" -g -DPETSC_HAVE_KOKKOS_KERNELS_GMRES" --CXXFLAGS=" -g -DPETSC_HAVE_KOKKOS_KERNELS_GMRES" --CUDAFLAGS="-g -Xcompiler -rdynamic" --with-cc=cc --with-cxx=CC --with-fc=ftn --download-hypre=1 --download-amgx --with-fortran-bindings=0 --COPTFLAGS=" -O" --CXXOPTFLAGS=" -O" --FOPTFLAGS=" -O" --with-debugging=0 --with-cuda=1 --with-cuda-arch=80 --with-mpiexec=srun --with-batch=0 --download-p4est=1 --with-zlib=1 --download-kokkos --download-kokkos-kernels --with-kokkos-kernels-tpl=0 --with-make-np=8 PETSC_ARCH=arch-perlmutter-opt-gcc-kokkos-cuda [29]PETSC ERROR: #1 MatProductSymbolic_MPIAIJKokkos_AB() at 
/global/u2/m/madams/petsc/src/mat/impls/aij/mpi/kokkos/mpiaijkok.kokkos.cxx:918 Kokkos::Cuda ERROR: Failed to call Kokkos::Cuda::finalize() [29]PETSC ERROR: #2 MatProductSymbolic_MPIAIJKokkos() at /global/u2/m/madams/petsc/src/mat/impls/aij/mpi/kokkos/mpiaijkok.kokkos.cxx:1138 [29]PETSC ERROR: #3 MatProductSymbolic() at /global/u2/m/madams/petsc/src/mat/interface/matproduct.c:793 [29]PETSC ERROR: #4 MatProduct_Private() at /global/u2/m/madams/petsc/src/mat/interface/matrix.c:9820 [29]PETSC ERROR: #5 MatMatMult() at /global/u2/m/madams/petsc/src/mat/interface/matrix.c:9897 [29]PETSC ERROR: #6 PCGAMGOptProlongator_AGG() at /global/u2/m/madams/petsc/src/ksp/pc/impls/gamg/agg.c:769 [29]PETSC ERROR: #7 PCSetUp_GAMG() at /global/u2/m/madams/petsc/src/ksp/pc/impls/gamg/gamg.c:639 [29]PETSC ERROR: #8 PCSetUp() at /global/u2/m/madams/petsc/src/ksp/pc/interface/precon.c:994 [29]PETSC ERROR: #9 KSPSetUp() at /global/u2/m/madams/petsc/src/ksp/ksp/interface/itfunc.c:406 [29]PETSC ERROR: #10 KSPSolve_Private() at /global/u2/m/madams/petsc/src/ksp/ksp/interface/itfunc.c:825 Kokkos::Cuda ERROR: Failed to call Kokkos::Cuda::finalize() [29]PETSC ERROR: #11 KSPSolve() at /global/u2/m/madams/petsc/src/ksp/ksp/interface/itfunc.c:1071 [29]PETSC ERROR: #12 SNESSolve_KSPONLY() at /global/u2/m/madams/petsc/src/snes/impls/ksponly/ksponly.c:48 [29]PETSC ERROR: #13 SNESSolve() at /global/u2/m/madams/petsc/src/snes/interface/snes.c:4689 [29]PETSC ERROR: #14 main() at ex13.c:178 [29]PETSC ERROR: PETSc Option Table entries: [29]PETSC ERROR: -benchmark_it 10 [29]PETSC ERROR: -dm_distribute [29]PETSC ERROR: -dm_mat_type aijkokkos [29]PETSC ERROR: -dm_plex_box_faces 4,4,4 [29]PETSC ERROR: -dm_plex_box_lower 0,0,0 [29]PETSC ERROR: -dm_plex_box_upper 2,2,2 [29]PETSC ERROR: -dm_plex_dim 3 [29]PETSC ERROR: -dm_plex_simplex 0 [29]PETSC ERROR: -dm_refine 4 [29]PETSC ERROR: -dm_vec_type kokkos [29]PETSC ERROR: -dm_view [29]PETSC ERROR: -ksp_converged_reason [29]PETSC ERROR: -ksp_max_it 200 [29]PETSC 
ERROR: -ksp_norm_type unpreconditioned [29]PETSC ERROR: -ksp_rtol 1.e-12 [29]PETSC ERROR: -ksp_type cg [29]PETSC ERROR: -log_view [29]PETSC ERROR: -mat_type aijkokkos [29]PETSC ERROR: -mg_levels_ksp_chebyshev_esteig 0,0.05,0,1.05 [29]PETSC ERROR: -mg_levels_ksp_type chebyshev [29]PETSC ERROR: -mg_levels_pc_type jacobi [29]PETSC ERROR: -pc_gamg_aggressive_coarsening 1 [29]PETSC ERROR: -pc_gamg_coarse_eq_limit 100 [29]PETSC ERROR: -pc_gamg_coarse_grid_layout_type compact [29]PETSC ERROR: -pc_gamg_esteig_ksp_max_it 10 Kokkos::Cuda ERROR: Failed to call Kokkos::Cuda::finalize() [31]PETSC ERROR: -pc_gamg_esteig_ksp_type cg [31]PETSC ERROR: -pc_gamg_process_eq_limit 400 [31]PETSC ERROR: -pc_gamg_repartition false [31]PETSC ERROR: -pc_gamg_reuse_interpolation true [31]PETSC ERROR: -pc_gamg_threshold 0.01 [31]PETSC ERROR: -pc_type gamg [31]PETSC ERROR: -petscpartitioner_simple_node_grid 2,2,2 [31]PETSC ERROR: -petscpartitioner_simple_process_grid 2,2,2 [31]PETSC ERROR: -petscpartitioner_type simple [31]PETSC ERROR: -potential_petscspace_degree 2 [31]PETSC ERROR: -snes_lag_jacobian -2 [31]PETSC ERROR: -snes_max_it 1 [31]PETSC ERROR: -snes_rtol 1.e-8 [31]PETSC ERROR: -snes_type ksponly [31]PETSC ERROR: -use_gpu_aware_mpi 0 [31]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-maint at mcs.anl.gov---------- MPICH ERROR [Rank 31] [job id 3522949.0] [Sun Oct 30 05:30:12 2022] [nid001121] - Abort(56) (rank 0 in comm 16): application called MPI_Abort(MPI_COMM_SELF, 56) - process 0 Kokkos::Cuda ERROR: Failed to call Kokkos::Cuda::finalize() [29]PETSC ERROR: -pc_gamg_esteig_ksp_type cg [29]PETSC ERROR: -pc_gamg_process_eq_limit 400 [29]PETSC ERROR: -pc_gamg_repartition false [29]PETSC ERROR: -pc_gamg_reuse_interpolation true [29]PETSC ERROR: -pc_gamg_threshold 0.01 [29]PETSC ERROR: -pc_type gamg [29]PETSC ERROR: -petscpartitioner_simple_node_grid 2,2,2 [29]PETSC ERROR: -petscpartitioner_simple_process_grid 2,2,2 [29]PETSC ERROR: 
-petscpartitioner_type simple [29]PETSC ERROR: -potential_petscspace_degree 2 [29]PETSC ERROR: -snes_lag_jacobian -2 [29]PETSC ERROR: -snes_max_it 1 [29]PETSC ERROR: -snes_rtol 1.e-8 [29]PETSC ERROR: -snes_type ksponly [29]PETSC ERROR: -use_gpu_aware_mpi 0 [29]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-maint at mcs.anl.gov---------- MPICH ERROR [Rank 29] [job id 3522949.0] [Sun Oct 30 05:30:12 2022] [nid001121] - Abort(56) (rank 0 in comm 16): application called MPI_Abort(MPI_COMM_SELF, 56) - process 0 Kokkos::Cuda ERROR: Failed to call Kokkos::Cuda::finalize() [21]PETSC ERROR: -pc_gamg_esteig_ksp_type cg [21]PETSC ERROR: -pc_gamg_process_eq_limit 400 [21]PETSC ERROR: -pc_gamg_repartition false [21]PETSC ERROR: -pc_gamg_reuse_interpolation true [21]PETSC ERROR: -pc_gamg_threshold 0.01 [21]PETSC ERROR: -pc_type gamg [21]PETSC ERROR: -petscpartitioner_simple_node_grid 2,2,2 [21]PETSC ERROR: -petscpartitioner_simple_process_grid 2,2,2 [21]PETSC ERROR: -petscpartitioner_type simple [21]PETSC ERROR: -potential_petscspace_degree 2 [21]PETSC ERROR: -snes_lag_jacobian -2 [21]PETSC ERROR: -snes_max_it 1 [21]PETSC ERROR: -snes_rtol 1.e-8 [21]PETSC ERROR: -snes_type ksponly [21]PETSC ERROR: -use_gpu_aware_mpi 0 [21]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-maint at mcs.anl.gov---------- MPICH ERROR [Rank 21] [job id 3522949.0] [Sun Oct 30 05:30:12 2022] [nid001120] - Abort(56) (rank 0 in comm 16): application called MPI_Abort(MPI_COMM_SELF, 56) - process 0 Kokkos::Cuda ERROR: Failed to call Kokkos::Cuda::finalize() Kokkos::Cuda ERROR: Failed to call Kokkos::Cuda::finalize() Kokkos::Cuda ERROR: Failed to call Kokkos::Cuda::finalize() [45]PETSC ERROR: -pc_gamg_esteig_ksp_type cg [45]PETSC ERROR: -pc_gamg_process_eq_limit 400 [45]PETSC ERROR: -pc_gamg_repartition false [45]PETSC ERROR: -pc_gamg_reuse_interpolation true [45]PETSC ERROR: -pc_gamg_threshold 0.01 [45]PETSC ERROR: 
-pc_type gamg [45]PETSC ERROR: -petscpartitioner_simple_node_grid 2,2,2 [45]PETSC ERROR: -petscpartitioner_simple_process_grid 2,2,2 [45]PETSC ERROR: -petscpartitioner_type simple [45]PETSC ERROR: -potential_petscspace_degree 2 [45]PETSC ERROR: -snes_lag_jacobian -2 [45]PETSC ERROR: -snes_max_it 1 [45]PETSC ERROR: -snes_rtol 1.e-8 [45]PETSC ERROR: -snes_type ksponly [45]PETSC ERROR: -use_gpu_aware_mpi 0 [45]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-maint at mcs.anl.gov---------- MPICH ERROR [Rank 45] [job id 3522949.0] [Sun Oct 30 05:30:12 2022] [nid001125] - Abort(56) (rank 0 in comm 16): application called MPI_Abort(MPI_COMM_SELF, 56) - process 0 Kokkos::Cuda ERROR: Failed to call Kokkos::Cuda::finalize() Kokkos::Cuda ERROR: Failed to call Kokkos::Cuda::finalize() [47]PETSC ERROR: -pc_gamg_esteig_ksp_type cg [47]PETSC ERROR: -pc_gamg_process_eq_limit 400 [47]PETSC ERROR: -pc_gamg_repartition false [47]PETSC ERROR: -pc_gamg_reuse_interpolation true [47]PETSC ERROR: -pc_gamg_threshold 0.01 [47]PETSC ERROR: -pc_type gamg [47]PETSC ERROR: -petscpartitioner_simple_node_grid 2,2,2 [47]PETSC ERROR: -petscpartitioner_simple_process_grid 2,2,2 [47]PETSC ERROR: -petscpartitioner_type simple [47]PETSC ERROR: -potential_petscspace_degree 2 [47]PETSC ERROR: -snes_lag_jacobian -2 [47]PETSC ERROR: -snes_max_it 1 [47]PETSC ERROR: -snes_rtol 1.e-8 [47]PETSC ERROR: -snes_type ksponly [47]PETSC ERROR: -use_gpu_aware_mpi 0 [47]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-maint at mcs.anl.gov---------- MPICH ERROR [Rank 47] [job id 3522949.0] [Sun Oct 30 05:30:12 2022] [nid001125] - Abort(56) (rank 0 in comm 16): application called MPI_Abort(MPI_COMM_SELF, 56) - process 0 Kokkos::Cuda ERROR: Failed to call Kokkos::Cuda::finalize() srun: error: nid001124: task 36: Exited with exit code 56 srun: launch/slurm: _step_signal: Terminating StepId=3522949.0 slurmstepd: error: *** STEP 3522949.0 
ON nid001116 CANCELLED AT 2022-10-30T12:30:12 *** srun: error: nid001125: tasks 45-46: Exited with exit code 56 srun: error: nid001128: task 54: Exited with exit code 56 srun: error: nid001121: tasks 25,27-28: Exited with exit code 56 srun: error: nid001120: task 18: Exited with exit code 56 srun: error: nid001116: tasks 2,4: Exited with exit code 56 srun: error: nid001129: task 63: Exited with exit code 56 srun: error: nid001117: tasks 9,11,13: Exited with exit code 56 srun: error: nid001124: tasks 35,37-38: Exited with exit code 56 srun: error: nid001128: tasks 50,55: Exited with exit code 56 srun: error: nid001120: tasks 16,19,22: Exited with exit code 56 srun: error: nid001129: tasks 61-62: Exited with exit code 56 srun: error: nid001125: tasks 41,44: Exited with exit code 56 srun: error: nid001116: tasks 1,5-6: Exited with exit code 56 srun: error: nid001128: task 52: Exited with exit code 56 srun: error: nid001121: task 29: Exited with exit code 56 srun: error: nid001129: tasks 57,59: Exited with exit code 56 srun: error: nid001124: task 33: Exited with exit code 56 srun: error: nid001117: tasks 12,15: Exited with exit code 56 srun: error: nid001125: tasks 43,47: Exited with exit code 56 srun: error: nid001128: tasks 51,53: Exited with exit code 56 srun: error: nid001121: tasks 26,31: Exited with exit code 56 srun: error: nid001120: tasks 17,20: Exited with exit code 56 srun: error: nid001124: tasks 32,39: Exited with exit code 56 srun: error: nid001116: tasks 0,3: Exited with exit code 56 srun: error: nid001129: tasks 56,60: Exited with exit code 56 srun: error: nid001120: task 23: Exited with exit code 56 srun: error: nid001128: task 48: Exited with exit code 56 srun: error: nid001121: task 30: Exited with exit code 56 srun: error: nid001125: task 42: Exited with exit code 56 srun: error: nid001117: tasks 8,14: Exited with exit code 56 srun: error: nid001125: task 40: Exited with exit code 56 srun: error: nid001120: task 21: Exited with exit code 56 srun: 
error: nid001128: task 49: Exited with exit code 56 srun: error: nid001124: task 34: Exited with exit code 56 srun: error: nid001129: task 58: Exited with exit code 56 srun: error: nid001117: task 10: Exited with exit code 56 srun: error: nid001121: task 24: Exited with exit code 56 srun: error: nid001116: task 7: Exited with exit code 56 + date Sun 30 Oct 2022 05:30:13 AM PDT -------------- next part -------------- A non-text attachment was scrubbed... Name: configure.log Type: application/octet-stream Size: 905439 bytes Desc: not available URL: From jed at jedbrown.org Wed Nov 16 07:57:44 2022 From: jed at jedbrown.org (Jed Brown) Date: Wed, 16 Nov 2022 06:57:44 -0700 Subject: [petsc-users] AMD vs Intel mobile CPU performance In-Reply-To: References: Message-ID: <87iljf80iv.fsf@jedbrown.org> If you're using iterative solvers, compare memory bandwidth first, then cache. Flops aren't very important unless you use sparse direct solvers or have SNES residual/Jacobian evaluation that is expensive and has been written for vectorization. If you can get the 6650U with LPDDR5-6400, it'll probably be faster. My laptop is the previous generation, 5900HS. "D.J. Nolte" writes: > Hi all, > I'm looking for a small laptop which I'll be using (also) for small scale > PETSc (KSP & SNES) simulations. For this setting performance is not that > important, but still, I wonder if the community has any experience with AMD > Ryzen CPUs (specifically 5 Pro 6650U) CPUs compared to Intel i7 12th gen. > Do I have to expect significant performance differences? > > Thanks! > > David From facklerpw at ornl.gov Wed Nov 16 13:38:15 2022 From: facklerpw at ornl.gov (Fackler, Philip) Date: Wed, 16 Nov 2022 19:38:15 +0000 Subject: [petsc-users] [EXTERNAL] Re: Kokkos backend for Mat and Vec diverging when running on CUDA device. 
In-Reply-To: References: Message-ID: ------------------------------------------------------------------ PETSc Performance Summary: ------------------------------------------------------------------ Unknown Name on a named PC0115427 with 1 processor, by 4pf Wed Nov 16 14:36:46 2022 Using Petsc Development GIT revision: v3.18.1-115-gdca010e0e9a GIT Date: 2022-10-28 14:39:41 +0000 Max Max/Min Avg Total Time (sec): 6.023e+00 1.000 6.023e+00 Objects: 1.020e+02 1.000 1.020e+02 Flops: 1.080e+09 1.000 1.080e+09 1.080e+09 Flops/sec: 1.793e+08 1.000 1.793e+08 1.793e+08 MPI Msg Count: 0.000e+00 0.000 0.000e+00 0.000e+00 MPI Msg Len (bytes): 0.000e+00 0.000 0.000e+00 0.000e+00 MPI Reductions: 0.000e+00 0.000 Flop counting convention: 1 flop = 1 real number operation of type (multiply/divide/add/subtract) e.g., VecAXPY() for real vectors of length N --> 2N flops and VecAXPY() for complex vectors of length N --> 8N flops Summary of Stages: ----- Time ------ ----- Flop ------ --- Messages --- -- Message Lengths -- -- Reductions -- Avg %Total Avg %Total Count %Total Avg %Total Count %Total 0: Main Stage: 6.0226e+00 100.0% 1.0799e+09 100.0% 0.000e+00 0.0% 0.000e+00 0.0% 0.000e+00 0.0% ------------------------------------------------------------------------------------------------------------------------ See the 'Profiling' chapter of the users' manual for details on interpreting output. Phase summary info: Count: number of times phase was executed Time and Flop: Max - maximum over all processors Ratio - ratio of maximum to minimum over all processors Mess: number of messages sent AvgLen: average message length (bytes) Reduct: number of global reductions Global: entire computation Stage: stages of a computation. Set stages with PetscLogStagePush() and PetscLogStagePop(). 
%T - percent time in this phase %F - percent flop in this phase %M - percent messages in this phase %L - percent message lengths in this phase %R - percent reductions in this phase Total Mflop/s: 10e-6 * (sum of flop over all processors)/(max time over all processors) GPU Mflop/s: 10e-6 * (sum of flop on GPU over all processors)/(max GPU time over all processors) CpuToGpu Count: total number of CPU to GPU copies per processor CpuToGpu Size (Mbytes): 10e-6 * (total size of CPU to GPU copies per processor) GpuToCpu Count: total number of GPU to CPU copies per processor GpuToCpu Size (Mbytes): 10e-6 * (total size of GPU to CPU copies per processor) GPU %F: percent flops on GPU in this event ------------------------------------------------------------------------------------------------------------------------ Event Count Time (sec) Flop --- Global --- --- Stage ---- Total GPU - CpuToGpu - - GpuToCpu - GPU Max Ratio Max Ratio Max Ratio Mess AvgLen Reduct %T %F %M %L %R %T %F %M %L %R Mflop/s Mflop/s Count Size Count Size %F ------------------------------------------------------------------------------------------------------------------------ --------------------------------------- --- Event Stage 0: Main Stage BuildTwoSided 3 1.0 nan nan 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 -nan -nan 0 0.00e+00 0 0.00e+00 0 DMCreateMat 1 1.0 nan nan 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 1 0 0 0 0 1 0 0 0 0 -nan -nan 0 0.00e+00 0 0.00e+00 0 SFSetGraph 3 1.0 nan nan 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 -nan -nan 0 0.00e+00 0 0.00e+00 0 SFSetUp 3 1.0 nan nan 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 -nan -nan 0 0.00e+00 0 0.00e+00 0 SFPack 4647 1.0 nan nan 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 -nan -nan 0 0.00e+00 0 0.00e+00 0 SFUnpack 4647 1.0 nan nan 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 -nan -nan 0 0.00e+00 0 0.00e+00 0 VecDot 190 1.0 nan nan 2.11e+06 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 
0 0 -nan -nan 0 0.00e+00 0 0.00e+00 100 VecMDot 775 1.0 nan nan 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 -nan -nan 0 0.00e+00 0 0.00e+00 0 VecNorm 1728 1.0 nan nan 1.92e+07 1.0 0.0e+00 0.0e+00 0.0e+00 0 2 0 0 0 0 2 0 0 0 -nan -nan 0 0.00e+00 0 0.00e+00 100 VecScale 1983 1.0 nan nan 6.24e+06 1.0 0.0e+00 0.0e+00 0.0e+00 0 1 0 0 0 0 1 0 0 0 -nan -nan 0 0.00e+00 0 0.00e+00 100 VecCopy 780 1.0 nan nan 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 -nan -nan 0 0.00e+00 0 0.00e+00 0 VecSet 4955 1.0 nan nan 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 2 0 0 0 0 2 0 0 0 0 -nan -nan 0 0.00e+00 0 0.00e+00 0 VecAXPY 190 1.0 nan nan 2.11e+06 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 -nan -nan 0 0.00e+00 0 0.00e+00 100 VecAYPX 597 1.0 nan nan 6.64e+06 1.0 0.0e+00 0.0e+00 0.0e+00 0 1 0 0 0 0 1 0 0 0 -nan -nan 0 0.00e+00 0 0.00e+00 100 VecAXPBYCZ 643 1.0 nan nan 1.79e+07 1.0 0.0e+00 0.0e+00 0.0e+00 0 2 0 0 0 0 2 0 0 0 -nan -nan 0 0.00e+00 0 0.00e+00 100 VecWAXPY 502 1.0 nan nan 5.58e+06 1.0 0.0e+00 0.0e+00 0.0e+00 0 1 0 0 0 0 1 0 0 0 -nan -nan 0 0.00e+00 0 0.00e+00 100 VecMAXPY 1159 1.0 nan nan 3.68e+07 1.0 0.0e+00 0.0e+00 0.0e+00 0 3 0 0 0 0 3 0 0 0 -nan -nan 0 0.00e+00 0 0.00e+00 100 VecScatterBegin 4647 1.0 nan nan 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 1 0 0 0 0 1 0 0 0 0 -nan -nan 2 5.14e-03 0 0.00e+00 0 VecScatterEnd 4647 1.0 nan nan 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 -nan -nan 0 0.00e+00 0 0.00e+00 0 VecReduceArith 380 1.0 nan nan 4.23e+06 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 -nan -nan 0 0.00e+00 0 0.00e+00 100 VecReduceComm 190 1.0 nan nan 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 -nan -nan 0 0.00e+00 0 0.00e+00 0 VecNormalize 965 1.0 nan nan 1.61e+07 1.0 0.0e+00 0.0e+00 0.0e+00 0 1 0 0 0 0 1 0 0 0 -nan -nan 0 0.00e+00 0 0.00e+00 100 TSStep 20 1.0 5.8699e+00 1.0 1.08e+09 1.0 0.0e+00 0.0e+00 0.0e+00 97100 0 0 0 97100 0 0 0 184 -nan 2 5.14e-03 0 0.00e+00 54 TSFunctionEval 597 1.0 nan nan 6.64e+06 1.0 
0.0e+00 0.0e+00 0.0e+00 63 1 0 0 0 63 1 0 0 0 -nan -nan 1 3.36e-04 0 0.00e+00 100 TSJacobianEval 190 1.0 nan nan 3.37e+07 1.0 0.0e+00 0.0e+00 0.0e+00 24 3 0 0 0 24 3 0 0 0 -nan -nan 0 0.00e+00 0 0.00e+00 97 MatMult 1930 1.0 nan nan 4.46e+08 1.0 0.0e+00 0.0e+00 0.0e+00 1 41 0 0 0 1 41 0 0 0 -nan -nan 0 0.00e+00 0 0.00e+00 100 MatMultTranspose 1 1.0 nan nan 3.44e+05 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 -nan -nan 0 0.00e+00 0 0.00e+00 100 MatSolve 965 1.0 nan nan 5.04e+07 1.0 0.0e+00 0.0e+00 0.0e+00 1 5 0 0 0 1 5 0 0 0 -nan -nan 0 0.00e+00 0 0.00e+00 0 MatSOR 965 1.0 nan nan 3.33e+08 1.0 0.0e+00 0.0e+00 0.0e+00 4 31 0 0 0 4 31 0 0 0 -nan -nan 0 0.00e+00 0 0.00e+00 0 MatLUFactorSym 1 1.0 nan nan 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 -nan -nan 0 0.00e+00 0 0.00e+00 0 MatLUFactorNum 190 1.0 nan nan 1.16e+08 1.0 0.0e+00 0.0e+00 0.0e+00 1 11 0 0 0 1 11 0 0 0 -nan -nan 0 0.00e+00 0 0.00e+00 0 MatScale 190 1.0 nan nan 3.26e+07 1.0 0.0e+00 0.0e+00 0.0e+00 0 3 0 0 0 0 3 0 0 0 -nan -nan 0 0.00e+00 0 0.00e+00 100 MatAssemblyBegin 761 1.0 nan nan 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 -nan -nan 0 0.00e+00 0 0.00e+00 0 MatAssemblyEnd 761 1.0 nan nan 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 -nan -nan 0 0.00e+00 0 0.00e+00 0 MatGetRowIJ 1 1.0 nan nan 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 -nan -nan 0 0.00e+00 0 0.00e+00 0 MatCreateSubMats 380 1.0 nan nan 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 1 0 0 0 0 1 0 0 0 0 -nan -nan 0 0.00e+00 0 0.00e+00 0 MatGetOrdering 1 1.0 nan nan 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 -nan -nan 0 0.00e+00 0 0.00e+00 0 MatZeroEntries 379 1.0 nan nan 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 -nan -nan 0 0.00e+00 0 0.00e+00 0 MatSetPreallCOO 1 1.0 nan nan 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 -nan -nan 0 0.00e+00 0 0.00e+00 0 MatSetValuesCOO 190 1.0 nan nan 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 1 0 0 0 0 1 0 0 0 0 -nan -nan 0 
0.00e+00 0 0.00e+00 0 KSPSetUp 760 1.0 nan nan 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 -nan -nan 0 0.00e+00 0 0.00e+00 0 KSPSolve 190 1.0 5.8052e-01 1.0 9.30e+08 1.0 0.0e+00 0.0e+00 0.0e+00 10 86 0 0 0 10 86 0 0 0 1602 -nan 1 4.80e-03 0 0.00e+00 46 KSPGMRESOrthog 775 1.0 nan nan 2.27e+07 1.0 0.0e+00 0.0e+00 0.0e+00 1 2 0 0 0 1 2 0 0 0 -nan -nan 0 0.00e+00 0 0.00e+00 100 SNESSolve 71 1.0 5.7117e+00 1.0 1.07e+09 1.0 0.0e+00 0.0e+00 0.0e+00 95 99 0 0 0 95 99 0 0 0 188 -nan 1 4.80e-03 0 0.00e+00 53 SNESSetUp 1 1.0 nan nan 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 -nan -nan 0 0.00e+00 0 0.00e+00 0 SNESFunctionEval 573 1.0 nan nan 2.23e+07 1.0 0.0e+00 0.0e+00 0.0e+00 60 2 0 0 0 60 2 0 0 0 -nan -nan 0 0.00e+00 0 0.00e+00 100 SNESJacobianEval 190 1.0 nan nan 3.37e+07 1.0 0.0e+00 0.0e+00 0.0e+00 24 3 0 0 0 24 3 0 0 0 -nan -nan 0 0.00e+00 0 0.00e+00 97 SNESLineSearch 190 1.0 nan nan 1.05e+08 1.0 0.0e+00 0.0e+00 0.0e+00 53 10 0 0 0 53 10 0 0 0 -nan -nan 0 0.00e+00 0 0.00e+00 100 PCSetUp 570 1.0 nan nan 1.16e+08 1.0 0.0e+00 0.0e+00 0.0e+00 2 11 0 0 0 2 11 0 0 0 -nan -nan 0 0.00e+00 0 0.00e+00 0 PCApply 965 1.0 nan nan 6.14e+08 1.0 0.0e+00 0.0e+00 0.0e+00 8 57 0 0 0 8 57 0 0 0 -nan -nan 1 4.80e-03 0 0.00e+00 19 KSPSolve_FS_0 965 1.0 nan nan 3.33e+08 1.0 0.0e+00 0.0e+00 0.0e+00 4 31 0 0 0 4 31 0 0 0 -nan -nan 0 0.00e+00 0 0.00e+00 0 KSPSolve_FS_1 965 1.0 nan nan 1.66e+08 1.0 0.0e+00 0.0e+00 0.0e+00 2 15 0 0 0 2 15 0 0 0 -nan -nan 0 0.00e+00 0 0.00e+00 0 --- Event Stage 1: Unknown ------------------------------------------------------------------------------------------------------------------------ --------------------------------------- Object Type Creations Destructions. Reports information only for process 0. 
--- Event Stage 0: Main Stage Container 5 5 Distributed Mesh 2 2 Index Set 11 11 IS L to G Mapping 1 1 Star Forest Graph 7 7 Discrete System 2 2 Weak Form 2 2 Vector 49 49 TSAdapt 1 1 TS 1 1 DMTS 1 1 SNES 1 1 DMSNES 3 3 SNESLineSearch 1 1 Krylov Solver 4 4 DMKSP interface 1 1 Matrix 4 4 Preconditioner 4 4 Viewer 2 1 --- Event Stage 1: Unknown ======================================================================================================================== Average time to get PetscTime(): 3.14e-08 #PETSc Option Table entries: -log_view -log_view_gpu_times #End of PETSc Option Table entries Compiled without FORTRAN kernels Compiled with 64 bit PetscInt Compiled with full precision matrices (default) sizeof(short) 2 sizeof(int) 4 sizeof(long) 8 sizeof(void*) 8 sizeof(PetscScalar) 8 sizeof(PetscInt) 8 Configure options: PETSC_DIR=/home/4pf/repos/petsc PETSC_ARCH=arch-kokkos-cuda-no-tpls --with-cc=mpicc --with-cxx=mpicxx --with-fc=0 --with-cuda --with-debugging=0 --with-shared-libraries --prefix=/home/4pf/build/petsc/cuda-no-tpls/install --with-64-bit-indices --COPTFLAGS=-O3 --CXXOPTFLAGS=-O3 --CUDAOPTFLAGS=-O3 --with-kokkos-dir=/home/4pf/build/kokkos/cuda/install --with-kokkos-kernels-dir=/home/4pf/build/kokkos-kernels/cuda-no-tpls/install ----------------------------------------- Libraries compiled on 2022-11-01 21:01:08 on PC0115427 Machine characteristics: Linux-5.15.0-52-generic-x86_64-with-glibc2.35 Using PETSc directory: /home/4pf/build/petsc/cuda-no-tpls/install Using PETSc arch: ----------------------------------------- Using C compiler: mpicc -fPIC -Wall -Wwrite-strings -Wno-unknown-pragmas -Wno-lto-type-mismatch -Wno-stringop-overflow -fstack-protector -fvisibility=hidden -O3 ----------------------------------------- Using include paths: -I/home/4pf/build/petsc/cuda-no-tpls/install/include -I/home/4pf/build/kokkos-kernels/cuda-no-tpls/install/include -I/home/4pf/build/kokkos/cuda/install/include -I/usr/local/cuda-11.8/include 
----------------------------------------- Using C linker: mpicc Using libraries: -Wl,-rpath,/home/4pf/build/petsc/cuda-no-tpls/install/lib -L/home/4pf/build/petsc/cuda-no-tpls/install/lib -lpetsc -Wl,-rpath,/home/4pf/build/kokkos-kernels/cuda-no-tpls/install/lib -L/home/4pf/build/kokkos-kernels/cuda-no-tpls/install/lib -Wl,-rpath,/home/4pf/build/kokkos/cuda/install/lib -L/home/4pf/build/kokkos/cuda/install/lib -Wl,-rpath,/usr/local/cuda-11.8/lib64 -L/usr/local/cuda-11.8/lib64 -L/usr/local/cuda-11.8/lib64/stubs -lkokkoskernels -lkokkoscontainers -lkokkoscore -llapack -lblas -lm -lcudart -lnvToolsExt -lcufft -lcublas -lcusparse -lcusolver -lcurand -lcuda -lquadmath -lstdc++ -ldl ----------------------------------------- Philip Fackler Research Software Engineer, Application Engineering Group Advanced Computing Systems Research Section Computer Science and Mathematics Division Oak Ridge National Laboratory ________________________________ From: Junchao Zhang Sent: Tuesday, November 15, 2022 13:03 To: Fackler, Philip Cc: xolotl-psi-development at lists.sourceforge.net ; petsc-users at mcs.anl.gov ; Blondel, Sophie ; Roth, Philip Subject: Re: [EXTERNAL] Re: [petsc-users] Kokkos backend for Mat and Vec diverging when running on CUDA device. Can you paste -log_view result so I can see what functions are used? --Junchao Zhang On Tue, Nov 15, 2022 at 10:24 AM Fackler, Philip > wrote: Yes, most (but not all) of our system test cases fail with the kokkos/cuda or cuda backends. All of them pass with the CPU-only kokkos backend. 
Philip Fackler Research Software Engineer, Application Engineering Group Advanced Computing Systems Research Section Computer Science and Mathematics Division Oak Ridge National Laboratory ________________________________ From: Junchao Zhang > Sent: Monday, November 14, 2022 19:34 To: Fackler, Philip > Cc: xolotl-psi-development at lists.sourceforge.net >; petsc-users at mcs.anl.gov >; Blondel, Sophie >; Zhang, Junchao >; Roth, Philip > Subject: [EXTERNAL] Re: [petsc-users] Kokkos backend for Mat and Vec diverging when running on CUDA device. Hi, Philip, Sorry to hear that. It seems you could run the same code on CPUs but not on GPUs (with either the petsc/Kokkos backend or the petsc/cuda backend); is that right? --Junchao Zhang On Mon, Nov 14, 2022 at 12:13 PM Fackler, Philip via petsc-users > wrote: This is an issue I've brought up before (and discussed in person with Richard). I wanted to bring it up again because I'm hitting the limits of what I know to do, and I need help figuring this out. The problem can be reproduced using Xolotl's "develop" branch built against a petsc build with kokkos and kokkos-kernels enabled. Then, either add the relevant kokkos options to the "petscArgs=" line in the system test parameter file(s), or just replace the system test parameter files with the ones from the "feature-petsc-kokkos" branch. See here the files that begin with "params_system_". Note that those files use the "kokkos" options, but the problem is similar using the corresponding cuda/cusparse options. I've already tried building kokkos-kernels with no TPLs and got slightly different results, but the same problem. Any help would be appreciated. Thanks, Philip Fackler Research Software Engineer, Application Engineering Group Advanced Computing Systems Research Section Computer Science and Mathematics Division Oak Ridge National Laboratory -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From mfadams at lbl.gov Wed Nov 16 13:51:03 2022 From: mfadams at lbl.gov (Mark Adams) Date: Wed, 16 Nov 2022 14:51:03 -0500 Subject: [petsc-users] [EXTERNAL] Re: Using multiple MPI ranks with COO interface crashes in some cases In-Reply-To: References: Message-ID: I am able to reproduce this on Crusher with 8 processors. Junchao, did you want me to use --download-kokkos-commit=origin/develop? On Wed, Nov 16, 2022 at 8:05 AM Mark Adams wrote: > I cannot build right now on Crusher or Perlmutter, but I saw this on both. > > Here is example output from src/snes/tests/ex13.c using the appended > .petscrc > This uses 64 processors; the 8 processor case worked. This has been > semi-nondeterministic for me. > > (and I have attached my current Perlmutter problem) > > Hope this helps, > Mark > > -dm_plex_simplex 0 > -dm_plex_dim 3 > -dm_plex_box_lower 0,0,0 > -dm_plex_box_upper 1,1,1 > -petscpartitioner_simple_process_grid 2,2,2 > -potential_petscspace_degree 2 > -snes_max_it 1 > -ksp_max_it 200 > -ksp_type cg > -ksp_rtol 1.e-12 > -ksp_norm_type unpreconditioned > -snes_rtol 1.e-8 > #-pc_type gamg > #-pc_gamg_type agg > #-pc_gamg_agg_nsmooths 1 > -pc_gamg_coarse_eq_limit 100 > -pc_gamg_process_eq_limit 400 > -pc_gamg_reuse_interpolation true > #-snes_monitor > #-ksp_monitor_short > -ksp_converged_reason > #-ksp_view > #-snes_converged_reason > #-mg_levels_ksp_max_it 2 > -mg_levels_ksp_type chebyshev > #-mg_levels_ksp_type richardson > #-mg_levels_ksp_richardson_scale 0.8 > -mg_levels_pc_type jacobi > -pc_gamg_esteig_ksp_type cg > -pc_gamg_esteig_ksp_max_it 10 > -mg_levels_ksp_chebyshev_esteig 0,0.05,0,1.05 > -dm_distribute > -petscpartitioner_type simple > -pc_gamg_repartition false > -pc_gamg_coarse_grid_layout_type compact > -pc_gamg_threshold 0.01 > #-pc_gamg_threshold_scale .5 > -pc_gamg_aggressive_coarsening 1 > #-check_pointer_intensity 0 > -snes_type ksponly > #-mg_coarse_sub_pc_factor_mat_solver_type cusparse > #-info :pc > #-use_gpu_aware_mpi 1 > 
-options_left > #-malloc_debug > -benchmark_it 10 > #-pc_gamg_use_parallel_coarse_grid_solver > #-mg_coarse_pc_type jacobi > #-mg_coarse_ksp_type cg > #-mg_coarse_ksp_rtol 1.e-2 > #-mat_cusparse_transgen > -snes_lag_jacobian -2 > > > On Tue, Nov 15, 2022 at 3:42 PM Junchao Zhang > wrote: > >> Mark, >> Do you have a reproducer using petsc examples? >> >> On Tue, Nov 15, 2022, 12:49 PM Mark Adams wrote: >> >>> Junchao, this is the same problem that I have been having right? >>> >>> On Tue, Nov 15, 2022 at 11:56 AM Fackler, Philip via petsc-users < >>> petsc-users at mcs.anl.gov> wrote: >>> >>>> I built petsc with: >>>> >>>> $ ./configure PETSC_DIR=$PWD PETSC_ARCH=arch-kokkos-serial-debug >>>> --with-cc=mpicc --with-cxx=mpicxx --with-fc=0 --with-debugging=0 >>>> --prefix=$HOME/build/petsc/debug/install --with-64-bit-indices >>>> --with-shared-libraries --COPTFLAGS=-O3 --CXXOPTFLAGS=-O3 --download-kokkos >>>> --download-kokkos-kernels >>>> >>>> $ make PETSC_DIR=$PWD PETSC_ARCH=arch-kokkos-serial-debug all >>>> >>>> $ make PETSC_DIR=$PWD PETSC_ARCH=arch-kokkos-serial-debug install >>>> >>>> >>>> Then I build xolotl in a separate build directory (after checking out >>>> the "feature-petsc-kokkos" branch) with: >>>> >>>> $ cmake -DCMAKE_BUILD_TYPE=Debug >>>> -DKokkos_DIR=$HOME/build/petsc/debug/install >>>> -DPETSC_DIR=$HOME/build/petsc/debug/install >>>> >>>> $ make -j4 SystemTester >>>> >>>> >>>> Then, from the xolotl build directory, run (for example): >>>> >>>> $ mpirun -n 2 ./test/system/SystemTester -t System/NE_4 -- -v >>>> >>>> Note that this test case will use the parameter file >>>> '/benchmarks/params_system_NE_4.txt' which has the command-line >>>> arguments for petsc in its "petscArgs=..." line. If you look at >>>> '/test/system/SystemTester.cpp' all the system test cases >>>> follow the same naming convention with their corresponding parameter files >>>> under '/benchmarks'. 
>>>> >>>> The failure happens with the NE_4 case (which is 2D) and the PSI_3 case >>>> (which is 1D). >>>> >>>> Let me know if this is still unclear. >>>> >>>> Thanks, >>>> >>>> >>>> *Philip Fackler * >>>> Research Software Engineer, Application Engineering Group >>>> Advanced Computing Systems Research Section >>>> Computer Science and Mathematics Division >>>> *Oak Ridge National Laboratory* >>>> ------------------------------ >>>> *From:* Junchao Zhang >>>> *Sent:* Tuesday, November 15, 2022 00:16 >>>> *To:* Fackler, Philip >>>> *Cc:* petsc-users at mcs.anl.gov ; Blondel, >>>> Sophie >>>> *Subject:* [EXTERNAL] Re: [petsc-users] Using multiple MPI ranks with >>>> COO interface crashes in some cases >>>> >>>> Hi, Philip, >>>> Can you tell me instructions to build Xolotl to reproduce the error? >>>> --Junchao Zhang >>>> >>>> >>>> On Mon, Nov 14, 2022 at 12:24 PM Fackler, Philip via petsc-users < >>>> petsc-users at mcs.anl.gov> wrote: >>>> >>>> In Xolotl's "feature-petsc-kokkos" branch, I have moved our code to use >>>> the COO interface for preallocating and setting values in the Jacobian >>>> matrix. I have found that with some of our test cases, using more than one >>>> MPI rank results in a crash. Way down in the preconditioner code in petsc a >>>> Mat gets computed that has "null" for the "productsymbolic" member of its >>>> "ops". It's pretty far removed from where we compute the Jacobian entries, >>>> so I haven't been able (so far) to track it back to an error in my code. >>>> I'd appreciate some help with this from someone who is more familiar with >>>> the petsc guts so we can figure out what I'm doing wrong. (I'm assuming >>>> it's a bug in Xolotl.) >>>> >>>> Note that this is using the kokkos backend for Mat and Vec in petsc, >>>> but with a serial-only build of kokkos and kokkos-kernels. So, it's a >>>> CPU-only multiple MPI rank run. 
>>>> >>>> Here's a paste of the error output showing the relevant parts of the >>>> call stack: >>>> >>>> [ERROR] [0]PETSC ERROR: >>>> [ERROR] --------------------- Error Message >>>> -------------------------------------------------------------- >>>> [ERROR] [1]PETSC ERROR: >>>> [ERROR] --------------------- Error Message >>>> -------------------------------------------------------------- >>>> [ERROR] [1]PETSC ERROR: >>>> [ERROR] [0]PETSC ERROR: >>>> [ERROR] No support for this operation for this object type >>>> [ERROR] [1]PETSC ERROR: >>>> [ERROR] No support for this operation for this object type >>>> [ERROR] [0]PETSC ERROR: >>>> [ERROR] No method productsymbolic for Mat of type (null) >>>> [ERROR] No method productsymbolic for Mat of type (null) >>>> [ERROR] [0]PETSC ERROR: >>>> [ERROR] [1]PETSC ERROR: >>>> [ERROR] See hxxps://petsc.org/release/faq/ for trouble shooting. >>>> [ERROR] See hxxps://petsc.org/release/faq/ for trouble shooting. >>>> [ERROR] [0]PETSC ERROR: >>>> [ERROR] [1]PETSC ERROR: >>>> [ERROR] Petsc Development GIT revision: v3.18.1-115-gdca010e0e9a GIT >>>> Date: 2022-10-28 14:39:41 +0000 >>>> [ERROR] Petsc Development GIT revision: v3.18.1-115-gdca010e0e9a GIT >>>> Date: 2022-10-28 14:39:41 +0000 >>>> [ERROR] [1]PETSC ERROR: >>>> [ERROR] [0]PETSC ERROR: >>>> [ERROR] Unknown Name on a named PC0115427 by 4pf Mon Nov 14 13:22:01 >>>> 2022 >>>> [ERROR] Unknown Name on a named PC0115427 by 4pf Mon Nov 14 13:22:01 >>>> 2022 >>>> [ERROR] [1]PETSC ERROR: >>>> [ERROR] [0]PETSC ERROR: >>>> [ERROR] Configure options PETSC_DIR=/home/4pf/repos/petsc >>>> PETSC_ARCH=arch-kokkos-serial-debug --with-debugging=1 --with-cc=mpicc >>>> --with-cxx=mpicxx --with-fc=0 --with-cudac=0 >>>> --prefix=/home/4pf/build/petsc/serial-debug/install --with-64-bit-indices >>>> --with-shared-libraries >>>> --with-kokkos-dir=/home/4pf/build/kokkos/serial/install >>>> --with-kokkos-kernels-dir=/home/4pf/build/kokkos-kernels/serial/install >>>> [ERROR] Configure options 
PETSC_DIR=/home/4pf/repos/petsc >>>> PETSC_ARCH=arch-kokkos-serial-debug --with-debugging=1 --with-cc=mpicc >>>> --with-cxx=mpicxx --with-fc=0 --with-cudac=0 >>>> --prefix=/home/4pf/build/petsc/serial-debug/install --with-64-bit-indices >>>> --with-shared-libraries >>>> --with-kokkos-dir=/home/4pf/build/kokkos/serial/install >>>> --with-kokkos-kernels-dir=/home/4pf/build/kokkos-kernels/serial/install >>>> [ERROR] [1]PETSC ERROR: >>>> [ERROR] [0]PETSC ERROR: >>>> [ERROR] #1 MatProductSymbolic_MPIAIJKokkos_AB() at >>>> /home/4pf/repos/petsc/src/mat/impls/aij/mpi/kokkos/mpiaijkok.kokkos.cxx:918 >>>> [ERROR] #1 MatProductSymbolic_MPIAIJKokkos_AB() at >>>> /home/4pf/repos/petsc/src/mat/impls/aij/mpi/kokkos/mpiaijkok.kokkos.cxx:918 >>>> [ERROR] [1]PETSC ERROR: >>>> [ERROR] [0]PETSC ERROR: >>>> [ERROR] #2 MatProductSymbolic_MPIAIJKokkos() at >>>> /home/4pf/repos/petsc/src/mat/impls/aij/mpi/kokkos/mpiaijkok.kokkos.cxx:1138 >>>> [ERROR] #2 MatProductSymbolic_MPIAIJKokkos() at >>>> /home/4pf/repos/petsc/src/mat/impls/aij/mpi/kokkos/mpiaijkok.kokkos.cxx:1138 >>>> [ERROR] [1]PETSC ERROR: >>>> [ERROR] [0]PETSC ERROR: >>>> [ERROR] #3 MatProductSymbolic() at >>>> /home/4pf/repos/petsc/src/mat/interface/matproduct.c:793 >>>> [ERROR] #3 MatProductSymbolic() at >>>> /home/4pf/repos/petsc/src/mat/interface/matproduct.c:793 >>>> [ERROR] [1]PETSC ERROR: >>>> [ERROR] [0]PETSC ERROR: >>>> [ERROR] #4 MatProduct_Private() at >>>> /home/4pf/repos/petsc/src/mat/interface/matrix.c:9820 >>>> [ERROR] #4 MatProduct_Private() at >>>> /home/4pf/repos/petsc/src/mat/interface/matrix.c:9820 >>>> [ERROR] [0]PETSC ERROR: >>>> [ERROR] [1]PETSC ERROR: >>>> [ERROR] #5 MatMatMult() at >>>> /home/4pf/repos/petsc/src/mat/interface/matrix.c:9897 >>>> [ERROR] #5 MatMatMult() at >>>> /home/4pf/repos/petsc/src/mat/interface/matrix.c:9897 >>>> [ERROR] [0]PETSC ERROR: >>>> [ERROR] [1]PETSC ERROR: >>>> [ERROR] #6 PCGAMGOptProlongator_AGG() at >>>> /home/4pf/repos/petsc/src/ksp/pc/impls/gamg/agg.c:769 >>>> [ERROR] 
#6 PCGAMGOptProlongator_AGG() at >>>> /home/4pf/repos/petsc/src/ksp/pc/impls/gamg/agg.c:769 >>>> [ERROR] [0]PETSC ERROR: >>>> [ERROR] [1]PETSC ERROR: >>>> [ERROR] #7 PCSetUp_GAMG() at >>>> /home/4pf/repos/petsc/src/ksp/pc/impls/gamg/gamg.c:639 >>>> [ERROR] #7 PCSetUp_GAMG() at >>>> /home/4pf/repos/petsc/src/ksp/pc/impls/gamg/gamg.c:639 >>>> [ERROR] [1]PETSC ERROR: >>>> [ERROR] [0]PETSC ERROR: >>>> [ERROR] #8 PCSetUp() at >>>> /home/4pf/repos/petsc/src/ksp/pc/interface/precon.c:994 >>>> [ERROR] #8 PCSetUp() at >>>> /home/4pf/repos/petsc/src/ksp/pc/interface/precon.c:994 >>>> [ERROR] [1]PETSC ERROR: >>>> [ERROR] [0]PETSC ERROR: >>>> [ERROR] #9 KSPSetUp() at >>>> /home/4pf/repos/petsc/src/ksp/ksp/interface/itfunc.c:406 >>>> [ERROR] #9 KSPSetUp() at >>>> /home/4pf/repos/petsc/src/ksp/ksp/interface/itfunc.c:406 >>>> [ERROR] [1]PETSC ERROR: >>>> [ERROR] [0]PETSC ERROR: >>>> [ERROR] #10 KSPSolve_Private() at >>>> /home/4pf/repos/petsc/src/ksp/ksp/interface/itfunc.c:825 >>>> [ERROR] #10 KSPSolve_Private() at >>>> /home/4pf/repos/petsc/src/ksp/ksp/interface/itfunc.c:825 >>>> [ERROR] [0]PETSC ERROR: >>>> [ERROR] [1]PETSC ERROR: >>>> [ERROR] #11 KSPSolve() at >>>> /home/4pf/repos/petsc/src/ksp/ksp/interface/itfunc.c:1071 >>>> [ERROR] #11 KSPSolve() at >>>> /home/4pf/repos/petsc/src/ksp/ksp/interface/itfunc.c:1071 >>>> [ERROR] [1]PETSC ERROR: >>>> [ERROR] [0]PETSC ERROR: >>>> [ERROR] #12 PCApply_FieldSplit() at >>>> /home/4pf/repos/petsc/src/ksp/pc/impls/fieldsplit/fieldsplit.c:1246 >>>> [ERROR] #12 PCApply_FieldSplit() at >>>> /home/4pf/repos/petsc/src/ksp/pc/impls/fieldsplit/fieldsplit.c:1246 >>>> [ERROR] [1]PETSC ERROR: >>>> [ERROR] [0]PETSC ERROR: >>>> [ERROR] #13 PCApply() at >>>> /home/4pf/repos/petsc/src/ksp/pc/interface/precon.c:441 >>>> [ERROR] #13 PCApply() at >>>> /home/4pf/repos/petsc/src/ksp/pc/interface/precon.c:441 >>>> [ERROR] [1]PETSC ERROR: >>>> [ERROR] [0]PETSC ERROR: >>>> [ERROR] #14 KSP_PCApply() at >>>> 
/home/4pf/repos/petsc/include/petsc/private/kspimpl.h:380 >>>> [ERROR] #14 KSP_PCApply() at >>>> /home/4pf/repos/petsc/include/petsc/private/kspimpl.h:380 >>>> [ERROR] [1]PETSC ERROR: >>>> [ERROR] [0]PETSC ERROR: >>>> [ERROR] #15 KSPFGMRESCycle() at >>>> /home/4pf/repos/petsc/src/ksp/ksp/impls/gmres/fgmres/fgmres.c:152 >>>> [ERROR] #15 KSPFGMRESCycle() at >>>> /home/4pf/repos/petsc/src/ksp/ksp/impls/gmres/fgmres/fgmres.c:152 >>>> [ERROR] [1]PETSC ERROR: >>>> [ERROR] [0]PETSC ERROR: >>>> [ERROR] #16 KSPSolve_FGMRES() at >>>> /home/4pf/repos/petsc/src/ksp/ksp/impls/gmres/fgmres/fgmres.c:273 >>>> [ERROR] #16 KSPSolve_FGMRES() at >>>> /home/4pf/repos/petsc/src/ksp/ksp/impls/gmres/fgmres/fgmres.c:273 >>>> [ERROR] [1]PETSC ERROR: >>>> [ERROR] [0]PETSC ERROR: >>>> [ERROR] #17 KSPSolve_Private() at >>>> /home/4pf/repos/petsc/src/ksp/ksp/interface/itfunc.c:899 >>>> [ERROR] #17 KSPSolve_Private() at >>>> /home/4pf/repos/petsc/src/ksp/ksp/interface/itfunc.c:899 >>>> [ERROR] [0]PETSC ERROR: >>>> [ERROR] [1]PETSC ERROR: >>>> [ERROR] #18 KSPSolve() at >>>> /home/4pf/repos/petsc/src/ksp/ksp/interface/itfunc.c:1071 >>>> [ERROR] #18 KSPSolve() at >>>> /home/4pf/repos/petsc/src/ksp/ksp/interface/itfunc.c:1071 >>>> [ERROR] [0]PETSC ERROR: >>>> [ERROR] [1]PETSC ERROR: >>>> [ERROR] #19 SNESSolve_NEWTONLS() at >>>> /home/4pf/repos/petsc/src/snes/impls/ls/ls.c:210 >>>> [ERROR] #19 SNESSolve_NEWTONLS() at >>>> /home/4pf/repos/petsc/src/snes/impls/ls/ls.c:210 >>>> [ERROR] [1]PETSC ERROR: >>>> [ERROR] [0]PETSC ERROR: >>>> [ERROR] #20 SNESSolve() at >>>> /home/4pf/repos/petsc/src/snes/interface/snes.c:4689 >>>> [ERROR] #20 SNESSolve() at >>>> /home/4pf/repos/petsc/src/snes/interface/snes.c:4689 >>>> [ERROR] [1]PETSC ERROR: >>>> [ERROR] [0]PETSC ERROR: >>>> [ERROR] #21 TSStep_ARKIMEX() at >>>> /home/4pf/repos/petsc/src/ts/impls/arkimex/arkimex.c:791 >>>> [ERROR] #21 TSStep_ARKIMEX() at >>>> /home/4pf/repos/petsc/src/ts/impls/arkimex/arkimex.c:791 >>>> [ERROR] [1]PETSC ERROR: >>>> [ERROR] 
[0]PETSC ERROR: >>>> [ERROR] #22 TSStep() at /home/4pf/repos/petsc/src/ts/interface/ts.c:3445 >>>> [ERROR] #22 TSStep() at /home/4pf/repos/petsc/src/ts/interface/ts.c:3445 >>>> [ERROR] [1]PETSC ERROR: >>>> [ERROR] [0]PETSC ERROR: >>>> [ERROR] #23 TSSolve() at >>>> /home/4pf/repos/petsc/src/ts/interface/ts.c:3836 >>>> [ERROR] #23 TSSolve() at >>>> /home/4pf/repos/petsc/src/ts/interface/ts.c:3836 >>>> [ERROR] PetscSolver::solve: TSSolve failed. >>>> [ERROR] PetscSolver::solve: TSSolve failed. >>>> Aborting. >>>> Aborting. >>>> >>>> >>>> >>>> Thanks for the help, >>>> >>>> >>>> *Philip Fackler * >>>> Research Software Engineer, Application Engineering Group >>>> Advanced Computing Systems Research Section >>>> Computer Science and Mathematics Division >>>> *Oak Ridge National Laboratory* >>>> >>>> -------------- next part -------------- An HTML attachment was scrubbed... URL: From mfadams at lbl.gov Wed Nov 16 17:32:09 2022 From: mfadams at lbl.gov (Mark Adams) Date: Wed, 16 Nov 2022 18:32:09 -0500 Subject: [petsc-users] [EXTERNAL] Re: Using multiple MPI ranks with COO interface crashes in some cases In-Reply-To: References: Message-ID: Junchao, I tried with the Kokkos dev branch and get this with 8 processes (and the .petscrc file that I sent/appended): srun -n8 -N1 --gpus-per-task=1 --gpu-bind=closest ../ex13 -dm_plex_box_faces 2,2,2 -petscpartitioner_simple_process_grid 2,2,2 -dm_plex_box_upper 1,1,1 -petscpartitioner_simple_node_grid 1,1,1 -dm_refine 4 -dm_view -log_tracexxx -log_view -dm_mat_type aijkokkos -dm_vec_type kokkos DM Object: box 8 MPI processes type: plex box in 3 dimensions: Number of 0-cells per rank: 4913 4913 4913 4913 4913 4913 4913 4913 Number of 1-cells per rank: 13872 13872 13872 13872 13872 13872 13872 13872 Number of 2-cells per rank: 13056 13056 13056 13056 13056 13056 13056 13056 Number of 3-cells per rank: 4096 4096 4096 4096 4096 4096 4096 4096 Labels: celltype: 4 strata with value/size (0 (4913), 1 (13872), 4 (13056), 7 (4096)) depth: 4 
strata with value/size (0 (4913), 1 (13872), 2 (13056), 3 (4096)) marker: 1 strata with value/size (1 (3169)) Face Sets: 3 strata with value/size (1 (961), 3 (961), 6 (961)) Number equations N = 250047 [5]PETSC ERROR: ------------------------------------------------------------------------ [5]PETSC ERROR: Caught signal number 7 BUS: Bus Error, possibly illegal memory access [5]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger [5]PETSC ERROR: or see https://petsc.org/release/faq/#valgrind and https://petsc.org/release/faq/ [5]PETSC ERROR: --------------------- Stack Frames ------------------------------------ [5]PETSC ERROR: The line numbers in the error traceback are not always exact. [5]PETSC ERROR: #1 MPI function [5]PETSC ERROR: #2 PetscSFLinkWaitRequests_MPI() at /gpfs/alpine/csc314/scratch/adams/petsc/src/vec/is/sf/impls/basic/sfmpi.c:53 [5]PETSC ERROR: #3 PetscSFLinkFinishCommunication() at /gpfs/alpine/csc314/scratch/adams/petsc/include/../src/vec/is/sf/impls/basic/sfpack.h:277 [5]PETSC ERROR: #4 PetscSFBcastEnd_Basic() at /gpfs/alpine/csc314/scratch/adams/petsc/src/vec/is/sf/impls/basic/sfbasic.c:205 [5]PETSC ERROR: #5 PetscSFBcastEnd() at /gpfs/alpine/csc314/scratch/adams/petsc/src/vec/is/sf/interface/sf.c:1477 [5]PETSC ERROR: #6 DMGlobalToLocalEnd() at /gpfs/alpine/csc314/scratch/adams/petsc/src/dm/interface/dm.c:2849 [5]PETSC ERROR: #7 SNESComputeFunction_DMLocal() at /gpfs/alpine/csc314/scratch/adams/petsc/src/snes/utils/dmlocalsnes.c:65 [5]PETSC ERROR: #8 SNES callback function [5]PETSC ERROR: #9 SNESComputeFunction() at /gpfs/alpine/csc314/scratch/adams/petsc/src/snes/interface/snes.c:2436 [5]PETSC ERROR: #10 SNESSolve_KSPONLY() at /gpfs/alpine/csc314/scratch/adams/petsc/src/snes/impls/ksponly/ksponly.c:27 [5]PETSC ERROR: #11 SNESSolve() at /gpfs/alpine/csc314/scratch/adams/petsc/src/snes/interface/snes.c:4690 [5]PETSC ERROR: #12 main() at ex13.c:178 MPICH ERROR [Rank 5] [job id 213404.0] [Wed Nov 16 18:30:49 2022] [crusher011] - 
Abort(59) (rank 5 in comm 0): application called MPI_Abort(MPI_COMM_WORLD, 59) - process 5 On Wed, Nov 16, 2022 at 2:51 PM Mark Adams wrote: > I am able to reproduce this on Crusher with 8 processors. > > Junchao, did you want me to use --download-kokkos-commit=origin/develop ? > > On Wed, Nov 16, 2022 at 8:05 AM Mark Adams wrote: > >> I can not build right now on Crusher or Perlmutter but I saw this on both. >> >> Here is an example output using src/snes/tests/ex13.c using the appended >> .petscrc >> This uses 64 processors and the 8 processor case worked. This has been >> semi-nondertminisitc for me. >> >> (and I have attached my current Perlmutter problem) >> >> Hope this helps, >> Mark >> >> -dm_plex_simplex 0 >> -dm_plex_dim 3 >> -dm_plex_box_lower 0,0,0 >> -dm_plex_box_upper 1,1,1 >> -petscpartitioner_simple_process_grid 2,2,2 >> -potential_petscspace_degree 2 >> -snes_max_it 1 >> -ksp_max_it 200 >> -ksp_type cg >> -ksp_rtol 1.e-12 >> -ksp_norm_type unpreconditioned >> -snes_rtol 1.e-8 >> #-pc_type gamg >> #-pc_gamg_type agg >> #-pc_gamg_agg_nsmooths 1 >> -pc_gamg_coarse_eq_limit 100 >> -pc_gamg_process_eq_limit 400 >> -pc_gamg_reuse_interpolation true >> #-snes_monitor >> #-ksp_monitor_short >> -ksp_converged_reason >> #-ksp_view >> #-snes_converged_reason >> #-mg_levels_ksp_max_it 2 >> -mg_levels_ksp_type chebyshev >> #-mg_levels_ksp_type richardson >> #-mg_levels_ksp_richardson_scale 0.8 >> -mg_levels_pc_type jacobi >> -pc_gamg_esteig_ksp_type cg >> -pc_gamg_esteig_ksp_max_it 10 >> -mg_levels_ksp_chebyshev_esteig 0,0.05,0,1.05 >> -dm_distribute >> -petscpartitioner_type simple >> -pc_gamg_repartition false >> -pc_gamg_coarse_grid_layout_type compact >> -pc_gamg_threshold 0.01 >> #-pc_gamg_threshold_scale .5 >> -pc_gamg_aggressive_coarsening 1 >> #-check_pointer_intensity 0 >> -snes_type ksponly >> #-mg_coarse_sub_pc_factor_mat_solver_type cusparse >> #-info :pc >> #-use_gpu_aware_mpi 1 >> -options_left >> #-malloc_debug >> -benchmark_it 10 >> 
#-pc_gamg_use_parallel_coarse_grid_solver >> #-mg_coarse_pc_type jacobi >> #-mg_coarse_ksp_type cg >> #-mg_coarse_ksp_rtol 1.e-2 >> #-mat_cusparse_transgen >> -snes_lag_jacobian -2 >> >> >> On Tue, Nov 15, 2022 at 3:42 PM Junchao Zhang >> wrote: >> >>> Mark, >>> Do you have a reproducer using petsc examples? >>> >>> On Tue, Nov 15, 2022, 12:49 PM Mark Adams wrote: >>> >>>> Junchao, this is the same problem that I have been having right? >>>> >>>> On Tue, Nov 15, 2022 at 11:56 AM Fackler, Philip via petsc-users < >>>> petsc-users at mcs.anl.gov> wrote: >>>> >>>>> I built petsc with: >>>>> >>>>> $ ./configure PETSC_DIR=$PWD PETSC_ARCH=arch-kokkos-serial-debug >>>>> --with-cc=mpicc --with-cxx=mpicxx --with-fc=0 --with-debugging=0 >>>>> --prefix=$HOME/build/petsc/debug/install --with-64-bit-indices >>>>> --with-shared-libraries --COPTFLAGS=-O3 --CXXOPTFLAGS=-O3 --download-kokkos >>>>> --download-kokkos-kernels >>>>> >>>>> $ make PETSC_DIR=$PWD PETSC_ARCH=arch-kokkos-serial-debug all >>>>> >>>>> $ make PETSC_DIR=$PWD PETSC_ARCH=arch-kokkos-serial-debug install >>>>> >>>>> >>>>> Then I build xolotl in a separate build directory (after checking out >>>>> the "feature-petsc-kokkos" branch) with: >>>>> >>>>> $ cmake -DCMAKE_BUILD_TYPE=Debug >>>>> -DKokkos_DIR=$HOME/build/petsc/debug/install >>>>> -DPETSC_DIR=$HOME/build/petsc/debug/install >>>>> >>>>> $ make -j4 SystemTester >>>>> >>>>> >>>>> Then, from the xolotl build directory, run (for example): >>>>> >>>>> $ mpirun -n 2 ./test/system/SystemTester -t System/NE_4 -- -v >>>>> >>>>> Note that this test case will use the parameter file >>>>> '/benchmarks/params_system_NE_4.txt' which has the command-line >>>>> arguments for petsc in its "petscArgs=..." line. If you look at >>>>> '/test/system/SystemTester.cpp' all the system test cases >>>>> follow the same naming convention with their corresponding parameter files >>>>> under '/benchmarks'. 
>>>>> >>>>> The failure happens with the NE_4 case (which is 2D) and the PSI_3 >>>>> case (which is 1D). >>>>> >>>>> Let me know if this is still unclear. >>>>> >>>>> Thanks, >>>>> >>>>> >>>>> *Philip Fackler * >>>>> Research Software Engineer, Application Engineering Group >>>>> Advanced Computing Systems Research Section >>>>> Computer Science and Mathematics Division >>>>> *Oak Ridge National Laboratory* >>>>> ------------------------------ >>>>> *From:* Junchao Zhang >>>>> *Sent:* Tuesday, November 15, 2022 00:16 >>>>> *To:* Fackler, Philip >>>>> *Cc:* petsc-users at mcs.anl.gov ; Blondel, >>>>> Sophie >>>>> *Subject:* [EXTERNAL] Re: [petsc-users] Using multiple MPI ranks with >>>>> COO interface crashes in some cases >>>>> >>>>> Hi, Philip, >>>>> Can you tell me instructions to build Xolotl to reproduce the error? >>>>> --Junchao Zhang >>>>> >>>>> >>>>> On Mon, Nov 14, 2022 at 12:24 PM Fackler, Philip via petsc-users < >>>>> petsc-users at mcs.anl.gov> wrote: >>>>> >>>>> In Xolotl's "feature-petsc-kokkos" branch, I have moved our code to >>>>> use the COO interface for preallocating and setting values in the Jacobian >>>>> matrix. I have found that with some of our test cases, using more than one >>>>> MPI rank results in a crash. Way down in the preconditioner code in petsc a >>>>> Mat gets computed that has "null" for the "productsymbolic" member of its >>>>> "ops". It's pretty far removed from where we compute the Jacobian entries, >>>>> so I haven't been able (so far) to track it back to an error in my code. >>>>> I'd appreciate some help with this from someone who is more familiar with >>>>> the petsc guts so we can figure out what I'm doing wrong. (I'm assuming >>>>> it's a bug in Xolotl.) >>>>> >>>>> Note that this is using the kokkos backend for Mat and Vec in petsc, >>>>> but with a serial-only build of kokkos and kokkos-kernels. So, it's a >>>>> CPU-only multiple MPI rank run. 
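For context, PETSc's COO assembly interface mentioned above works in outline like this (a minimal sketch of the generic API available since PETSc 3.18, not Xolotl's actual code; `ncoo`, `coo_i`, `coo_j`, and `v` are placeholder names):

```c
/* Sketch of COO matrix assembly (PETSc >= 3.18). The (i,j) index pairs are
   registered once up front; values can then be resupplied in the same order
   each time the Jacobian is recomputed. Duplicate (i,j) pairs are allowed. */
PetscCount   ncoo;          /* number of (i,j,v) triples this rank contributes */
PetscInt    *coo_i, *coo_j; /* global row/column indices                       */
PetscScalar *v;             /* values, in the same order as (coo_i, coo_j)     */
ierr = MatSetPreallocationCOO(A, ncoo, coo_i, coo_j);CHKERRQ(ierr);
ierr = MatSetValuesCOO(A, v, INSERT_VALUES);CHKERRQ(ierr); /* or ADD_VALUES */
```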
>>>>> >>>>> Here's a paste of the error output showing the relevant parts of the >>>>> call stack: >>>>> >>>>> [ERROR] [0]PETSC ERROR: >>>>> [ERROR] --------------------- Error Message >>>>> -------------------------------------------------------------- >>>>> [ERROR] [1]PETSC ERROR: >>>>> [ERROR] --------------------- Error Message >>>>> -------------------------------------------------------------- >>>>> [ERROR] [1]PETSC ERROR: >>>>> [ERROR] [0]PETSC ERROR: >>>>> [ERROR] No support for this operation for this object type >>>>> [ERROR] [1]PETSC ERROR: >>>>> [ERROR] No support for this operation for this object type >>>>> [ERROR] [0]PETSC ERROR: >>>>> [ERROR] No method productsymbolic for Mat of type (null) >>>>> [ERROR] No method productsymbolic for Mat of type (null) >>>>> [ERROR] [0]PETSC ERROR: >>>>> [ERROR] [1]PETSC ERROR: >>>>> [ERROR] See hxxps://petsc.org/release/faq/ for trouble shooting. >>>>> [ERROR] See hxxps://petsc.org/release/faq/ for trouble shooting. >>>>> [ERROR] [0]PETSC ERROR: >>>>> [ERROR] [1]PETSC ERROR: >>>>> [ERROR] Petsc Development GIT revision: v3.18.1-115-gdca010e0e9a GIT >>>>> Date: 2022-10-28 14:39:41 +0000 >>>>> [ERROR] Petsc Development GIT revision: v3.18.1-115-gdca010e0e9a GIT >>>>> Date: 2022-10-28 14:39:41 +0000 >>>>> [ERROR] [1]PETSC ERROR: >>>>> [ERROR] [0]PETSC ERROR: >>>>> [ERROR] Unknown Name on a named PC0115427 by 4pf Mon Nov 14 13:22:01 >>>>> 2022 >>>>> [ERROR] Unknown Name on a named PC0115427 by 4pf Mon Nov 14 13:22:01 >>>>> 2022 >>>>> [ERROR] [1]PETSC ERROR: >>>>> [ERROR] [0]PETSC ERROR: >>>>> [ERROR] Configure options PETSC_DIR=/home/4pf/repos/petsc >>>>> PETSC_ARCH=arch-kokkos-serial-debug --with-debugging=1 --with-cc=mpicc >>>>> --with-cxx=mpicxx --with-fc=0 --with-cudac=0 >>>>> --prefix=/home/4pf/build/petsc/serial-debug/install --with-64-bit-indices >>>>> --with-shared-libraries >>>>> --with-kokkos-dir=/home/4pf/build/kokkos/serial/install >>>>> 
--with-kokkos-kernels-dir=/home/4pf/build/kokkos-kernels/serial/install >>>>> [ERROR] Configure options PETSC_DIR=/home/4pf/repos/petsc >>>>> PETSC_ARCH=arch-kokkos-serial-debug --with-debugging=1 --with-cc=mpicc >>>>> --with-cxx=mpicxx --with-fc=0 --with-cudac=0 >>>>> --prefix=/home/4pf/build/petsc/serial-debug/install --with-64-bit-indices >>>>> --with-shared-libraries >>>>> --with-kokkos-dir=/home/4pf/build/kokkos/serial/install >>>>> --with-kokkos-kernels-dir=/home/4pf/build/kokkos-kernels/serial/install >>>>> [ERROR] [1]PETSC ERROR: >>>>> [ERROR] [0]PETSC ERROR: >>>>> [ERROR] #1 MatProductSymbolic_MPIAIJKokkos_AB() at >>>>> /home/4pf/repos/petsc/src/mat/impls/aij/mpi/kokkos/mpiaijkok.kokkos.cxx:918 >>>>> [ERROR] #1 MatProductSymbolic_MPIAIJKokkos_AB() at >>>>> /home/4pf/repos/petsc/src/mat/impls/aij/mpi/kokkos/mpiaijkok.kokkos.cxx:918 >>>>> [ERROR] [1]PETSC ERROR: >>>>> [ERROR] [0]PETSC ERROR: >>>>> [ERROR] #2 MatProductSymbolic_MPIAIJKokkos() at >>>>> /home/4pf/repos/petsc/src/mat/impls/aij/mpi/kokkos/mpiaijkok.kokkos.cxx:1138 >>>>> [ERROR] #2 MatProductSymbolic_MPIAIJKokkos() at >>>>> /home/4pf/repos/petsc/src/mat/impls/aij/mpi/kokkos/mpiaijkok.kokkos.cxx:1138 >>>>> [ERROR] [1]PETSC ERROR: >>>>> [ERROR] [0]PETSC ERROR: >>>>> [ERROR] #3 MatProductSymbolic() at >>>>> /home/4pf/repos/petsc/src/mat/interface/matproduct.c:793 >>>>> [ERROR] #3 MatProductSymbolic() at >>>>> /home/4pf/repos/petsc/src/mat/interface/matproduct.c:793 >>>>> [ERROR] [1]PETSC ERROR: >>>>> [ERROR] [0]PETSC ERROR: >>>>> [ERROR] #4 MatProduct_Private() at >>>>> /home/4pf/repos/petsc/src/mat/interface/matrix.c:9820 >>>>> [ERROR] #4 MatProduct_Private() at >>>>> /home/4pf/repos/petsc/src/mat/interface/matrix.c:9820 >>>>> [ERROR] [0]PETSC ERROR: >>>>> [ERROR] [1]PETSC ERROR: >>>>> [ERROR] #5 MatMatMult() at >>>>> /home/4pf/repos/petsc/src/mat/interface/matrix.c:9897 >>>>> [ERROR] #5 MatMatMult() at >>>>> /home/4pf/repos/petsc/src/mat/interface/matrix.c:9897 >>>>> [ERROR] [0]PETSC ERROR: >>>>> 
[ERROR] [1]PETSC ERROR: >>>>> [ERROR] #6 PCGAMGOptProlongator_AGG() at >>>>> /home/4pf/repos/petsc/src/ksp/pc/impls/gamg/agg.c:769 >>>>> [ERROR] #6 PCGAMGOptProlongator_AGG() at >>>>> /home/4pf/repos/petsc/src/ksp/pc/impls/gamg/agg.c:769 >>>>> [ERROR] [0]PETSC ERROR: >>>>> [ERROR] [1]PETSC ERROR: >>>>> [ERROR] #7 PCSetUp_GAMG() at >>>>> /home/4pf/repos/petsc/src/ksp/pc/impls/gamg/gamg.c:639 >>>>> [ERROR] #7 PCSetUp_GAMG() at >>>>> /home/4pf/repos/petsc/src/ksp/pc/impls/gamg/gamg.c:639 >>>>> [ERROR] [1]PETSC ERROR: >>>>> [ERROR] [0]PETSC ERROR: >>>>> [ERROR] #8 PCSetUp() at >>>>> /home/4pf/repos/petsc/src/ksp/pc/interface/precon.c:994 >>>>> [ERROR] #8 PCSetUp() at >>>>> /home/4pf/repos/petsc/src/ksp/pc/interface/precon.c:994 >>>>> [ERROR] [1]PETSC ERROR: >>>>> [ERROR] [0]PETSC ERROR: >>>>> [ERROR] #9 KSPSetUp() at >>>>> /home/4pf/repos/petsc/src/ksp/ksp/interface/itfunc.c:406 >>>>> [ERROR] #9 KSPSetUp() at >>>>> /home/4pf/repos/petsc/src/ksp/ksp/interface/itfunc.c:406 >>>>> [ERROR] [1]PETSC ERROR: >>>>> [ERROR] [0]PETSC ERROR: >>>>> [ERROR] #10 KSPSolve_Private() at >>>>> /home/4pf/repos/petsc/src/ksp/ksp/interface/itfunc.c:825 >>>>> [ERROR] #10 KSPSolve_Private() at >>>>> /home/4pf/repos/petsc/src/ksp/ksp/interface/itfunc.c:825 >>>>> [ERROR] [0]PETSC ERROR: >>>>> [ERROR] [1]PETSC ERROR: >>>>> [ERROR] #11 KSPSolve() at >>>>> /home/4pf/repos/petsc/src/ksp/ksp/interface/itfunc.c:1071 >>>>> [ERROR] #11 KSPSolve() at >>>>> /home/4pf/repos/petsc/src/ksp/ksp/interface/itfunc.c:1071 >>>>> [ERROR] [1]PETSC ERROR: >>>>> [ERROR] [0]PETSC ERROR: >>>>> [ERROR] #12 PCApply_FieldSplit() at >>>>> /home/4pf/repos/petsc/src/ksp/pc/impls/fieldsplit/fieldsplit.c:1246 >>>>> [ERROR] #12 PCApply_FieldSplit() at >>>>> /home/4pf/repos/petsc/src/ksp/pc/impls/fieldsplit/fieldsplit.c:1246 >>>>> [ERROR] [1]PETSC ERROR: >>>>> [ERROR] [0]PETSC ERROR: >>>>> [ERROR] #13 PCApply() at >>>>> /home/4pf/repos/petsc/src/ksp/pc/interface/precon.c:441 >>>>> [ERROR] #13 PCApply() at >>>>> 
/home/4pf/repos/petsc/src/ksp/pc/interface/precon.c:441 >>>>> [ERROR] [1]PETSC ERROR: >>>>> [ERROR] [0]PETSC ERROR: >>>>> [ERROR] #14 KSP_PCApply() at >>>>> /home/4pf/repos/petsc/include/petsc/private/kspimpl.h:380 >>>>> [ERROR] #14 KSP_PCApply() at >>>>> /home/4pf/repos/petsc/include/petsc/private/kspimpl.h:380 >>>>> [ERROR] [1]PETSC ERROR: >>>>> [ERROR] [0]PETSC ERROR: >>>>> [ERROR] #15 KSPFGMRESCycle() at >>>>> /home/4pf/repos/petsc/src/ksp/ksp/impls/gmres/fgmres/fgmres.c:152 >>>>> [ERROR] #15 KSPFGMRESCycle() at >>>>> /home/4pf/repos/petsc/src/ksp/ksp/impls/gmres/fgmres/fgmres.c:152 >>>>> [ERROR] [1]PETSC ERROR: >>>>> [ERROR] [0]PETSC ERROR: >>>>> [ERROR] #16 KSPSolve_FGMRES() at >>>>> /home/4pf/repos/petsc/src/ksp/ksp/impls/gmres/fgmres/fgmres.c:273 >>>>> [ERROR] #16 KSPSolve_FGMRES() at >>>>> /home/4pf/repos/petsc/src/ksp/ksp/impls/gmres/fgmres/fgmres.c:273 >>>>> [ERROR] [1]PETSC ERROR: >>>>> [ERROR] [0]PETSC ERROR: >>>>> [ERROR] #17 KSPSolve_Private() at >>>>> /home/4pf/repos/petsc/src/ksp/ksp/interface/itfunc.c:899 >>>>> [ERROR] #17 KSPSolve_Private() at >>>>> /home/4pf/repos/petsc/src/ksp/ksp/interface/itfunc.c:899 >>>>> [ERROR] [0]PETSC ERROR: >>>>> [ERROR] [1]PETSC ERROR: >>>>> [ERROR] #18 KSPSolve() at >>>>> /home/4pf/repos/petsc/src/ksp/ksp/interface/itfunc.c:1071 >>>>> [ERROR] #18 KSPSolve() at >>>>> /home/4pf/repos/petsc/src/ksp/ksp/interface/itfunc.c:1071 >>>>> [ERROR] [0]PETSC ERROR: >>>>> [ERROR] [1]PETSC ERROR: >>>>> [ERROR] #19 SNESSolve_NEWTONLS() at >>>>> /home/4pf/repos/petsc/src/snes/impls/ls/ls.c:210 >>>>> [ERROR] #19 SNESSolve_NEWTONLS() at >>>>> /home/4pf/repos/petsc/src/snes/impls/ls/ls.c:210 >>>>> [ERROR] [1]PETSC ERROR: >>>>> [ERROR] [0]PETSC ERROR: >>>>> [ERROR] #20 SNESSolve() at >>>>> /home/4pf/repos/petsc/src/snes/interface/snes.c:4689 >>>>> [ERROR] #20 SNESSolve() at >>>>> /home/4pf/repos/petsc/src/snes/interface/snes.c:4689 >>>>> [ERROR] [1]PETSC ERROR: >>>>> [ERROR] [0]PETSC ERROR: >>>>> [ERROR] #21 TSStep_ARKIMEX() at >>>>> 
/home/4pf/repos/petsc/src/ts/impls/arkimex/arkimex.c:791 >>>>> [ERROR] #21 TSStep_ARKIMEX() at >>>>> /home/4pf/repos/petsc/src/ts/impls/arkimex/arkimex.c:791 >>>>> [ERROR] [1]PETSC ERROR: >>>>> [ERROR] [0]PETSC ERROR: >>>>> [ERROR] #22 TSStep() at >>>>> /home/4pf/repos/petsc/src/ts/interface/ts.c:3445 >>>>> [ERROR] #22 TSStep() at >>>>> /home/4pf/repos/petsc/src/ts/interface/ts.c:3445 >>>>> [ERROR] [1]PETSC ERROR: >>>>> [ERROR] [0]PETSC ERROR: >>>>> [ERROR] #23 TSSolve() at >>>>> /home/4pf/repos/petsc/src/ts/interface/ts.c:3836 >>>>> [ERROR] #23 TSSolve() at >>>>> /home/4pf/repos/petsc/src/ts/interface/ts.c:3836 >>>>> [ERROR] PetscSolver::solve: TSSolve failed. >>>>> [ERROR] PetscSolver::solve: TSSolve failed. >>>>> Aborting. >>>>> Aborting. >>>>> >>>>> >>>>> >>>>> Thanks for the help, >>>>> >>>>> >>>>> *Philip Fackler * >>>>> Research Software Engineer, Application Engineering Group >>>>> Advanced Computing Systems Research Section >>>>> Computer Science and Mathematics Division >>>>> *Oak Ridge National Laboratory* >>>>> >>>>> -------------- next part -------------- An HTML attachment was scrubbed... URL: From karthikeyan.chockalingam at stfc.ac.uk Wed Nov 16 18:04:25 2022 From: karthikeyan.chockalingam at stfc.ac.uk (Karthikeyan Chockalingam - STFC UKRI) Date: Thu, 17 Nov 2022 00:04:25 +0000 Subject: [petsc-users] Different solution while running in parallel Message-ID: Hello, I tried to solve a (FE discretized) Poisson equation using PCLU. For some reason I am getting different solutions while running the problem on one and two cores. I have attached the output file (out.txt) from both the runs. I am printing A, b and x from both the runs; while A and b are the same, the solution is different. I am not sure what I am doing wrong. Below is my matrix, vector, and solve setup.
Mat A; Vec b, x; ierr = MatCreate(PETSC_COMM_WORLD, &A); CHKERRQ(ierr); ierr = MatSetType(A, MATMPIAIJ); CHKERRQ(ierr); ierr = MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, N, N); CHKERRQ(ierr); ierr = MatMPIAIJSetPreallocation(A,d_nz, NULL, o_nz, NULL); CHKERRQ(ierr); ierr = MatSetOption(A,MAT_SYMMETRIC,PETSC_TRUE); CHKERRQ(ierr); ierr = MatCreateVecs(A, &b, &x); CHKERRQ(ierr); KSP ksp; PC pc; KSPCreate(PETSC_COMM_WORLD, &ksp); KSPSetOperators(ksp, A, A); ierr = KSPSetType(ksp,KSPPREONLY);CHKERRQ(ierr); ierr = KSPGetPC(ksp,&pc);CHKERRQ(ierr); ierr = PCSetType(pc,PCLU);CHKERRQ(ierr); ierr = PCFactorSetMatSolverType(pc,MATSOLVERMUMPS);CHKERRQ(ierr); KSPSolve(ksp, b, x); Thank you for your help. Karhik. This email and any attachments are intended solely for the use of the named recipients. If you are not the intended recipient you must not use, disclose, copy or distribute this email or any of its attachments and should notify the sender immediately and delete this email from your system. UK Research and Innovation (UKRI) has taken every reasonable precaution to minimise risk of this email or any attachments containing viruses or malware but the recipient should carry out its own virus and malware checks before opening the attachments. UKRI does not accept any liability for any losses or damages which the recipient may sustain due to presence of any viruses. -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- An embedded and charset-unspecified text was scrubbed... Name: out.txt URL: From hzhang at mcs.anl.gov Wed Nov 16 20:07:20 2022 From: hzhang at mcs.anl.gov (Zhang, Hong) Date: Thu, 17 Nov 2022 02:07:20 +0000 Subject: [petsc-users] Different solution while running in parallel In-Reply-To: References: Message-ID: Karhik, Can you find out the condition number of your matrix? 
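One way to get that estimate from within the code is KSP's built-in singular value computation (a minimal sketch, untested against this particular setup; it needs an iterative KSP such as GMRES or CG, so it will not work with KSPPREONLY):

```c
/* Sketch: estimate the condition number from the extreme singular values of
   the preconditioned operator. Switch to an iterative KSP for this
   diagnostic run; KSPPREONLY builds no Krylov information. */
PetscReal emax, emin;
ierr = KSPSetType(ksp, KSPGMRES);CHKERRQ(ierr);
ierr = KSPSetComputeSingularValues(ksp, PETSC_TRUE);CHKERRQ(ierr); /* before KSPSetUp/KSPSolve */
ierr = KSPSolve(ksp, b, x);CHKERRQ(ierr);
ierr = KSPComputeExtremeSingularValues(ksp, &emax, &emin);CHKERRQ(ierr);
ierr = PetscPrintf(PETSC_COMM_WORLD, "condition number estimate: %g\n", (double)(emax/emin));CHKERRQ(ierr);
```

Note these are singular values of the preconditioned operator, so use e.g. -pc_type none for the unpreconditioned estimate; alternatively, the run-time option -ksp_monitor_singular_value prints the same estimate at each iteration without code changes.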
Hong ________________________________ From: petsc-users on behalf of Karthikeyan Chockalingam - STFC UKRI via petsc-users Sent: Wednesday, November 16, 2022 6:04 PM To: petsc-users at mcs.anl.gov Subject: [petsc-users] Different solution while running in parallel Hello, I tried to solve a (FE discretized) Poisson equation using PCLU. For some reason I am getting different solutions while running the problem on one and two cores. I have attached the output file (out.txt) from both the runs. I am printing A, b and x from both the runs ? while A and b are the same but the solution seems is different. I am not sure what I doing wrong. Below is my matrix, vector, and solve setup. Mat A; Vec b, x; ierr = MatCreate(PETSC_COMM_WORLD, &A); CHKERRQ(ierr); ierr = MatSetType(A, MATMPIAIJ); CHKERRQ(ierr); ierr = MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, N, N); CHKERRQ(ierr); ierr = MatMPIAIJSetPreallocation(A,d_nz, NULL, o_nz, NULL); CHKERRQ(ierr); ierr = MatSetOption(A,MAT_SYMMETRIC,PETSC_TRUE); CHKERRQ(ierr); ierr = MatCreateVecs(A, &b, &x); CHKERRQ(ierr); KSP ksp; PC pc; KSPCreate(PETSC_COMM_WORLD, &ksp); KSPSetOperators(ksp, A, A); ierr = KSPSetType(ksp,KSPPREONLY);CHKERRQ(ierr); ierr = KSPGetPC(ksp,&pc);CHKERRQ(ierr); ierr = PCSetType(pc,PCLU);CHKERRQ(ierr); ierr = PCFactorSetMatSolverType(pc,MATSOLVERMUMPS);CHKERRQ(ierr); KSPSolve(ksp, b, x); Thank you for your help. Karhik. This email and any attachments are intended solely for the use of the named recipients. If you are not the intended recipient you must not use, disclose, copy or distribute this email or any of its attachments and should notify the sender immediately and delete this email from your system. UK Research and Innovation (UKRI) has taken every reasonable precaution to minimise risk of this email or any attachments containing viruses or malware but the recipient should carry out its own virus and malware checks before opening the attachments. 
UKRI does not accept any liability for any losses or damages which the recipient may sustain due to presence of any viruses. -------------- next part -------------- An HTML attachment was scrubbed... URL: From narnoldm at umich.edu Wed Nov 16 21:27:12 2022 From: narnoldm at umich.edu (Nicholas Arnold-Medabalimi) Date: Wed, 16 Nov 2022 22:27:12 -0500 Subject: [petsc-users] PetscSF Fortran interface Message-ID: Hi Petsc Users I'm in the process of adding some Petsc for mesh management into an existing Fortran Solver. It has been relatively straightforward so far but I am running into an issue with using PetscSF routines. Some like the PetscSFGetGraph work no problem but a few of my routines require the use of PetscSFGetLeafRanks and PetscSFGetRootRanks and those don't seem to be in the fortran interface and I just get a linking error. I also don't seem to see a PetscSF file in the finclude. Any clarification or assistance would be appreciated. Sincerely Nicholas -- Nicholas Arnold-Medabalimi Ph.D. Candidate Computational Aeroscience Lab University of Michigan -------------- next part -------------- An HTML attachment was scrubbed... URL: From junchao.zhang at gmail.com Wed Nov 16 21:38:36 2022 From: junchao.zhang at gmail.com (Junchao Zhang) Date: Wed, 16 Nov 2022 21:38:36 -0600 Subject: [petsc-users] PetscSF Fortran interface In-Reply-To: References: Message-ID: Hi, Nicholas, I will have a look and get back to you. Thanks. --Junchao Zhang On Wed, Nov 16, 2022 at 9:27 PM Nicholas Arnold-Medabalimi < narnoldm at umich.edu> wrote: > Hi Petsc Users > > I'm in the process of adding some Petsc for mesh management into an > existing Fortran Solver. It has been relatively straightforward so far but > I am running into an issue with using PetscSF routines. 
Some like the > PetscSFGetGraph work no problem but a few of my routines require the use of > PetscSFGetLeafRanks and PetscSFGetRootRanks and those don't seem to be in > the fortran interface and I just get a linking error. I also don't seem to > see a PetscSF file in the finclude. Any clarification or assistance would > be appreciated. > > > Sincerely > Nicholas > > -- > Nicholas Arnold-Medabalimi > > Ph.D. Candidate > Computational Aeroscience Lab > University of Michigan > -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Thu Nov 17 06:19:19 2022 From: knepley at gmail.com (Matthew Knepley) Date: Thu, 17 Nov 2022 07:19:19 -0500 Subject: [petsc-users] Different solution while running in parallel In-Reply-To: References: Message-ID: On Wed, Nov 16, 2022 at 9:07 PM Zhang, Hong via petsc-users < petsc-users at mcs.anl.gov> wrote: > Karhik, > Can you find out the condition number of your matrix? > Also, run using -ksp_view -ksp_monitor_true_residual -ksp_converged_reason and send the two outputs. Thanks, Matt > Hong > > ------------------------------ > *From:* petsc-users on behalf of > Karthikeyan Chockalingam - STFC UKRI via petsc-users < > petsc-users at mcs.anl.gov> > *Sent:* Wednesday, November 16, 2022 6:04 PM > *To:* petsc-users at mcs.anl.gov > *Subject:* [petsc-users] Different solution while running in parallel > > > Hello, > > > > I tried to solve a (FE discretized) Poisson equation using PCLU. For > some reason I am getting different solutions while running the problem on > one and two cores. I have attached the output file (out.txt) from both the > runs. I am printing A, b and x from both the runs ? while A and b are the > same but the solution seems is different. > > > > I am not sure what I doing wrong. > > > > Below is my matrix, vector, and solve setup. 
> > > > > > Mat A; > > Vec b, x; > > > > ierr = MatCreate(PETSC_COMM_WORLD, &A); CHKERRQ(ierr); > > ierr = MatSetType(A, MATMPIAIJ); CHKERRQ(ierr); > > ierr = MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, N, N); CHKERRQ > (ierr); > > ierr = MatMPIAIJSetPreallocation(A,d_nz, *NULL*, o_nz, *NULL*); > CHKERRQ(ierr); > > ierr = MatSetOption(A,MAT_SYMMETRIC,PETSC_TRUE); CHKERRQ(ierr); > > ierr = MatCreateVecs(A, &b, &x); CHKERRQ(ierr); > > > > KSP ksp; > > PC pc; > > KSPCreate(PETSC_COMM_WORLD, &ksp); > > KSPSetOperators(ksp, A, A); > > ierr = KSPSetType(ksp,KSPPREONLY);CHKERRQ(ierr); > > ierr = KSPGetPC(ksp,&pc);CHKERRQ(ierr); > > ierr = PCSetType(pc,PCLU);CHKERRQ(ierr); > > ierr = PCFactorSetMatSolverType(pc,MATSOLVERMUMPS);CHKERRQ(ierr); > > KSPSolve(ksp, b, x); > > > > Thank you for your help. > > > > Karhik. > > > > This email and any attachments are intended solely for the use of the > named recipients. If you are not the intended recipient you must not use, > disclose, copy or distribute this email or any of its attachments and > should notify the sender immediately and delete this email from your > system. UK Research and Innovation (UKRI) has taken every reasonable > precaution to minimise risk of this email or any attachments containing > viruses or malware but the recipient should carry out its own virus and > malware checks before opening the attachments. UKRI does not accept any > liability for any losses or damages which the recipient may sustain due to > presence of any viruses. > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... 
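Concretely, the comparison asked for above can be gathered with something like the following (a sketch; `./app` and the rank counts stand in for the actual executable and job setup):

```shell
# Capture solver configuration, true-residual history, and convergence reason
# on 1 and 2 ranks, then diff the two logs.
mpirun -n 1 ./app -ksp_view -ksp_monitor_true_residual -ksp_converged_reason > out_np1.txt 2>&1
mpirun -n 2 ./app -ksp_view -ksp_monitor_true_residual -ksp_converged_reason > out_np2.txt 2>&1
diff out_np1.txt out_np2.txt
```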
URL: From yangzongze at gmail.com Thu Nov 17 08:19:06 2022 From: yangzongze at gmail.com (Zongze Yang) Date: Thu, 17 Nov 2022 22:19:06 +0800 Subject: [petsc-users] Build error with slepc: Unable to locate PETSc BAMG dynamic library Message-ID: Hello, I tried to build petsc with slepc. `make` give the following error information. How can I figure out the problem? The configure.log and make.log are attached. ``` *** Building SLEPc *** Checking environment... done Checking PETSc installation... done Processing slepc4py... [0]PETSC ERROR: --------------------- Error Message -------------------------------------------------------------- [0]PETSC ERROR: Unable to open file [0]PETSC ERROR: Unable to locate PETSc BAMG dynamic library You cannot move the dynamic libraries! [0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting. [0]PETSC ERROR: Petsc Development GIT revision: v3.18.1-291-g89fba64eb45 GIT Date: 2022-11-15 14:31:39 +0000 [0]PETSC ERROR: Unknown Name on a arch-main-debug named ws6 by z2yang Thu Nov 17 21:59:09 2022 [0]PETSC ERROR: Configure options PETSC_ARCH=arch-main-debug --download-bamg --download-bison --download-chaco --download-ctetgen --download-egads --download-eigen --download-exodusii --download-fftw --download-hpddm --download-ks --download-libceed --download-metis --download-ml --download-mmg --download-mumps --download-netcdf --download-opencascade --download-p4est --download-parmetis --download-parmmg --download-pnetcdf --download-pragmatic --download-ptscotch --download-scalapack --download-slepc --download-slepc-configure-arguments=--with-slepc4py=1 --download-suitesparse --download-superlu_dist --download-tetgen --download-triangle --download-cmake --download-hdf5 --download-mpich --download-mpi4py --download-slepc --download-zlib --download-libpng --download-muparser --with-petsc4py=1 --with-shared-libraries --with-x=1 
--with-x-include="[/home/z2yang/opt/spack/opt/spack/linux-ubuntu22.04-cascadelake/gcc-11.2.0/libx11-1.7.0-5c4ah77x6u7zfm6msg6hbkt23vmwjgkz/include,/home/z2yang/opt/spack/opt/spack/linux-ubuntu22.04-cascadelake/gcc-11.2.0/xproto-7.0.31-z33ate5bew7b7xrpj3pv6nb3towcfimo/include,/home/z2yang/opt/spack/opt/spack/linux-ubuntu22.04-cascadelake/gcc-11.2.0/kbproto-1.0.7-ea2l5e2kp43i2b423eotqxseywjvqis6/include,/home/z2yang/opt/spack/opt/spack/linux-ubuntu22.04-cascadelake/gcc-11.2.0/libxcb-1.14-e2ea2x3zga5xipq5wvcgsw25ilq5yo63/include,/home/z2yang/opt/spack/opt/spack/linux-ubuntu22.04-cascadelake/gcc-11.2.0/libxau-1.0.8-gmwxeffxcbkmxvwawtndhutiwficmxwv/include,/home/z2yang/opt/spack/opt/spack/linux-ubuntu22.04-cascadelake/gcc-11.2.0/libxdmcp-1.1.2-bsggzn5pf6pu5guwbooi3riu5uhaqgee/include]" --with-x-lib="-L/home/z2yang/opt/spack/opt/spack/linux-ubuntu22.04-cascadelake/gcc-11.2.0/libx11-1.7.0-5c4ah77x6u7zfm6msg6hbkt23vmwjgkz/lib -lX11" --force [0]PETSC ERROR: #1 PetscInitialize_DynamicLibraries() at /home/z2yang/repos/petsc/src/sys/dll/reg.c:135 [0]PETSC ERROR: #2 PetscInitialize_Common() at /home/z2yang/repos/petsc/src/sys/objects/pinit.c:1025 [0]PETSC ERROR: #3 PetscInitialize() at /home/z2yang/repos/petsc/src/sys/objects/pinit.c:1267 Traceback (most recent call last): File "/home/z2yang/repos/petsc/arch-main-debug/externalpackages/git.slepc/./configure", line 11, in exec(open(os.path.join(os.path.dirname(__file__), 'config', 'configure.py')).read()) File "", line 215, in File "/home/z2yang/repos/petsc/arch-main-debug/externalpackages/git.slepc/config/packages/slepc4py.py", line 53, in Process from petsc4py import PETSc File "/home/z2yang/repos/petsc/arch-main-debug/lib/petsc4py/PETSc.py", line 4, in PETSc._initialize() File "PETSc/PETSc.pyx", line 509, in petsc4py.PETSc._initialize File "PETSc/PETSc.pyx", line 402, in petsc4py.PETSc.initialize petsc4py.PETSc.Error: error code 65 **************************ERROR************************************* Error building SLEPc. 
******************************************************************** gmake[2]: *** [/home/z2yang/repos/petsc/arch-main-debug/lib/petsc/conf/petscrules:29: slepcbuild] Error 1 **************************ERROR************************************* Error during compile, check arch-main-debug/lib/petsc/conf/make.log Send it and arch-main-debug/lib/petsc/conf/configure.log to petsc-maint at mcs.anl.gov ******************************************************************** Finishing make run at Thu, 17 Nov 2022 21:59:09 +0800 ``` -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: make.log Type: application/octet-stream Size: 65536 bytes Desc: not available URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: configure.log Type: application/octet-stream Size: 65536 bytes Desc: not available URL: From karthikeyan.chockalingam at stfc.ac.uk Thu Nov 17 08:37:11 2022 From: karthikeyan.chockalingam at stfc.ac.uk (Karthikeyan Chockalingam - STFC UKRI) Date: Thu, 17 Nov 2022 14:37:11 +0000 Subject: [petsc-users] Different solution while running in parallel In-Reply-To: References: Message-ID: Hi Matt and Hong, Thank you for your response. I made the following changes to get the desired output: PetscReal norm; /* norm of solution error */ PetscInt its; KSPConvergedReason reason; PetscViewerAndFormat *vf; PetscViewerAndFormatCreate(PETSC_VIEWER_STDOUT_WORLD, PETSC_VIEWER_DEFAULT, &vf); ierr = KSPView(ksp, PETSC_VIEWER_STDOUT_SELF);CHKERRQ(ierr); KSPSolve(ksp, b, x); ierr = KSPMonitorTrueResidual(ksp,its,norm,vf);CHKERRQ(ierr); ierr = KSPMonitorSingularValue(ksp, its, norm, vf);CHKERRQ(ierr); I have attached the outputs from both runs. As before, I am also printing A, b, and x. I wonder if it is a memory issue related to the MPI library employed. I am currently using OpenMPI; should I instead use MPICH? Kind regards, Karthik. 
From: Matthew Knepley Date: Thursday, 17 November 2022 at 12:19 To: Zhang, Hong Cc: petsc-users at mcs.anl.gov , Chockalingam, Karthikeyan (STFC,DL,HC) Subject: Re: [petsc-users] Different solution while running in parallel On Wed, Nov 16, 2022 at 9:07 PM Zhang, Hong via petsc-users > wrote: Karhik, Can you find out the condition number of your matrix? Also, run using -ksp_view -ksp_monitor_true_residual -ksp_converged_reason and send the two outputs. Thanks, Matt Hong ________________________________ From: petsc-users > on behalf of Karthikeyan Chockalingam - STFC UKRI via petsc-users > Sent: Wednesday, November 16, 2022 6:04 PM To: petsc-users at mcs.anl.gov > Subject: [petsc-users] Different solution while running in parallel Hello, I tried to solve a (FE discretized) Poisson equation using PCLU. For some reason I am getting different solutions while running the problem on one and two cores. I have attached the output file (out.txt) from both the runs. I am printing A, b and x from both the runs ? while A and b are the same but the solution seems is different. I am not sure what I doing wrong. Below is my matrix, vector, and solve setup. Mat A; Vec b, x; ierr = MatCreate(PETSC_COMM_WORLD, &A); CHKERRQ(ierr); ierr = MatSetType(A, MATMPIAIJ); CHKERRQ(ierr); ierr = MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, N, N); CHKERRQ(ierr); ierr = MatMPIAIJSetPreallocation(A,d_nz, NULL, o_nz, NULL); CHKERRQ(ierr); ierr = MatSetOption(A,MAT_SYMMETRIC,PETSC_TRUE); CHKERRQ(ierr); ierr = MatCreateVecs(A, &b, &x); CHKERRQ(ierr); KSP ksp; PC pc; KSPCreate(PETSC_COMM_WORLD, &ksp); KSPSetOperators(ksp, A, A); ierr = KSPSetType(ksp,KSPPREONLY);CHKERRQ(ierr); ierr = KSPGetPC(ksp,&pc);CHKERRQ(ierr); ierr = PCSetType(pc,PCLU);CHKERRQ(ierr); ierr = PCFactorSetMatSolverType(pc,MATSOLVERMUMPS);CHKERRQ(ierr); KSPSolve(ksp, b, x); Thank you for your help. Karhik. This email and any attachments are intended solely for the use of the named recipients. 
If you are not the intended recipient you must not use, disclose, copy or distribute this email or any of its attachments and should notify the sender immediately and delete this email from your system. UK Research and Innovation (UKRI) has taken every reasonable precaution to minimise risk of this email or any attachments containing viruses or malware but the recipient should carry out its own virus and malware checks before opening the attachments. UKRI does not accept any liability for any losses or damages which the recipient may sustain due to presence of any viruses. -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- An embedded and charset-unspecified text was scrubbed... Name: 1_MPI_processes.txt URL: -------------- next part -------------- An embedded and charset-unspecified text was scrubbed... Name: 2_MPI_processes.txt URL: From karthikeyan.chockalingam at stfc.ac.uk Thu Nov 17 08:40:04 2022 From: karthikeyan.chockalingam at stfc.ac.uk (Karthikeyan Chockalingam - STFC UKRI) Date: Thu, 17 Nov 2022 14:40:04 +0000 Subject: [petsc-users] Different solution while running in parallel In-Reply-To: References: Message-ID: I also added the following bit to get the converged reason. 
KSPConvergedReason reason; KSPGetConvergedReason(ksp, &reason); if (reason == KSP_DIVERGED_INDEFINITE_PC) { PetscPrintf(PETSC_COMM_WORLD, "\nDivergence because of indefinite preconditioner;\n"); PetscPrintf(PETSC_COMM_WORLD, "Run the executable again but with '-pc_factor_shift_type POSITIVE_DEFINITE' option.\n"); } else if (reason < 0) { PetscPrintf(PETSC_COMM_WORLD, "\nOther kind of divergence: this should not happen.\n"); } else { KSPGetIterationNumber(ksp, &its); PetscPrintf(PETSC_COMM_WORLD,"\nConvergence in %d iterations.\n",(int)its); } Best, Karthik. From: Chockalingam, Karthikeyan (STFC,DL,HC) Date: Thursday, 17 November 2022 at 14:37 To: Matthew Knepley , Zhang, Hong Cc: petsc-users at mcs.anl.gov Subject: Re: [petsc-users] Different solution while running in parallel Hi Matt and Hong, Thank you for your response. I made the following changes, to get the desired output PetscReal norm; /* norm of solution error */ PetscInt its; KSPConvergedReason reason; PetscViewerAndFormat *vf; PetscViewerAndFormatCreate(PETSC_VIEWER_STDOUT_WORLD, PETSC_VIEWER_DEFAULT, &vf); ierr = KSPView(ksp, PETSC_VIEWER_STDOUT_SELF);CHKERRQ(ierr); KSPSolve(ksp, b, x); ierr = KSPMonitorTrueResidual(ksp,its,norm,vf);CHKERRQ(ierr); ierr = KSPMonitorSingularValue(ksp, its, norm, vf);CHKERRQ(ierr); I have attached the outputs from both the runs. As before, I am also printing A, b, and x. I wonder if it is a memory issue related to mpi library employed. I am currently using openmpi ? should I instead use mpich? Kind regards, Karthik. From: Matthew Knepley Date: Thursday, 17 November 2022 at 12:19 To: Zhang, Hong Cc: petsc-users at mcs.anl.gov , Chockalingam, Karthikeyan (STFC,DL,HC) Subject: Re: [petsc-users] Different solution while running in parallel On Wed, Nov 16, 2022 at 9:07 PM Zhang, Hong via petsc-users > wrote: Karhik, Can you find out the condition number of your matrix? Also, run using -ksp_view -ksp_monitor_true_residual -ksp_converged_reason and send the two outputs. 
Thanks, Matt Hong ________________________________ From: petsc-users > on behalf of Karthikeyan Chockalingam - STFC UKRI via petsc-users > Sent: Wednesday, November 16, 2022 6:04 PM To: petsc-users at mcs.anl.gov > Subject: [petsc-users] Different solution while running in parallel Hello, I tried to solve a (FE discretized) Poisson equation using PCLU. For some reason I am getting different solutions while running the problem on one and two cores. I have attached the output file (out.txt) from both the runs. I am printing A, b and x from both the runs ? while A and b are the same but the solution seems is different. I am not sure what I doing wrong. Below is my matrix, vector, and solve setup. Mat A; Vec b, x; ierr = MatCreate(PETSC_COMM_WORLD, &A); CHKERRQ(ierr); ierr = MatSetType(A, MATMPIAIJ); CHKERRQ(ierr); ierr = MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, N, N); CHKERRQ(ierr); ierr = MatMPIAIJSetPreallocation(A,d_nz, NULL, o_nz, NULL); CHKERRQ(ierr); ierr = MatSetOption(A,MAT_SYMMETRIC,PETSC_TRUE); CHKERRQ(ierr); ierr = MatCreateVecs(A, &b, &x); CHKERRQ(ierr); KSP ksp; PC pc; KSPCreate(PETSC_COMM_WORLD, &ksp); KSPSetOperators(ksp, A, A); ierr = KSPSetType(ksp,KSPPREONLY);CHKERRQ(ierr); ierr = KSPGetPC(ksp,&pc);CHKERRQ(ierr); ierr = PCSetType(pc,PCLU);CHKERRQ(ierr); ierr = PCFactorSetMatSolverType(pc,MATSOLVERMUMPS);CHKERRQ(ierr); KSPSolve(ksp, b, x); Thank you for your help. Karhik. This email and any attachments are intended solely for the use of the named recipients. If you are not the intended recipient you must not use, disclose, copy or distribute this email or any of its attachments and should notify the sender immediately and delete this email from your system. UK Research and Innovation (UKRI) has taken every reasonable precaution to minimise risk of this email or any attachments containing viruses or malware but the recipient should carry out its own virus and malware checks before opening the attachments. 
UKRI does not accept any liability for any losses or damages which the recipient may sustain due to presence of any viruses. -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Thu Nov 17 09:08:51 2022 From: knepley at gmail.com (Matthew Knepley) Date: Thu, 17 Nov 2022 10:08:51 -0500 Subject: [petsc-users] Build error with slepc: Unable to locate PETSc BAMG dynamic library In-Reply-To: References: Message-ID: Your make.log is partial. It looks like you tried to build things and it failed, and then you tried again, but things were in a broken state. I would remove the whole source directory and start over. Thanks, Matt On Thu, Nov 17, 2022 at 9:21 AM Zongze Yang wrote: > Hello, I tried to build petsc with slepc. `make` give the following error > information. How can I figure out the problem? The configure.log and > make.log are attached. > > ``` > *** Building SLEPc *** > Checking environment... done > Checking PETSc installation... done > Processing slepc4py... [0]PETSC ERROR: --------------------- Error Message > -------------------------------------------------------------- > [0]PETSC ERROR: Unable to open file > [0]PETSC ERROR: Unable to locate PETSc BAMG dynamic library > You cannot move the dynamic libraries! > [0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting. 
> [0]PETSC ERROR: Petsc Development GIT revision: v3.18.1-291-g89fba64eb45 > GIT Date: 2022-11-15 14:31:39 +0000 > [0]PETSC ERROR: Unknown Name on a arch-main-debug named ws6 by z2yang Thu > Nov 17 21:59:09 2022 > [0]PETSC ERROR: Configure options PETSC_ARCH=arch-main-debug > --download-bamg --download-bison --download-chaco --download-ctetgen > --download-egads --download-eigen --download-exodusii --download-fftw > --download-hpddm --download-ks --download-libceed --download-metis > --download-ml --download-mmg --download-mumps --download-netcdf > --download-opencascade --download-p4est --download-parmetis > --download-parmmg --download-pnetcdf --download-pragmatic > --download-ptscotch --download-scalapack --download-slepc > --download-slepc-configure-arguments=--with-slepc4py=1 > --download-suitesparse --download-superlu_dist --download-tetgen > --download-triangle --download-cmake --download-hdf5 --download-mpich > --download-mpi4py --download-slepc --download-zlib --download-libpng > --download-muparser --with-petsc4py=1 --with-shared-libraries --with-x=1 > --with-x-include="[/home/z2yang/opt/spack/opt/spack/linux-ubuntu22.04-cascadelake/gcc-11.2.0/libx11-1.7.0-5c4ah77x6u7zfm6msg6hbkt23vmwjgkz/include,/home/z2yang/opt/spack/opt/spack/linux-ubuntu22.04-cascadelake/gcc-11.2.0/xproto-7.0.31-z33ate5bew7b7xrpj3pv6nb3towcfimo/include,/home/z2yang/opt/spack/opt/spack/linux-ubuntu22.04-cascadelake/gcc-11.2.0/kbproto-1.0.7-ea2l5e2kp43i2b423eotqxseywjvqis6/include,/home/z2yang/opt/spack/opt/spack/linux-ubuntu22.04-cascadelake/gcc-11.2.0/libxcb-1.14-e2ea2x3zga5xipq5wvcgsw25ilq5yo63/include,/home/z2yang/opt/spack/opt/spack/linux-ubuntu22.04-cascadelake/gcc-11.2.0/libxau-1.0.8-gmwxeffxcbkmxvwawtndhutiwficmxwv/include,/home/z2yang/opt/spack/opt/spack/linux-ubuntu22.04-cascadelake/gcc-11.2.0/libxdmcp-1.1.2-bsggzn5pf6pu5guwbooi3riu5uhaqgee/include]" > 
--with-x-lib="-L/home/z2yang/opt/spack/opt/spack/linux-ubuntu22.04-cascadelake/gcc-11.2.0/libx11-1.7.0-5c4ah77x6u7zfm6msg6hbkt23vmwjgkz/lib > -lX11" --force > [0]PETSC ERROR: #1 PetscInitialize_DynamicLibraries() at > /home/z2yang/repos/petsc/src/sys/dll/reg.c:135 > [0]PETSC ERROR: #2 PetscInitialize_Common() at > /home/z2yang/repos/petsc/src/sys/objects/pinit.c:1025 > [0]PETSC ERROR: #3 PetscInitialize() at > /home/z2yang/repos/petsc/src/sys/objects/pinit.c:1267 > Traceback (most recent call last): > File > "/home/z2yang/repos/petsc/arch-main-debug/externalpackages/git.slepc/./configure", > line 11, in > exec(open(os.path.join(os.path.dirname(__file__), 'config', > 'configure.py')).read()) > File "", line 215, in > File > "/home/z2yang/repos/petsc/arch-main-debug/externalpackages/git.slepc/config/packages/slepc4py.py", > line 53, in Process > from petsc4py import PETSc > File "/home/z2yang/repos/petsc/arch-main-debug/lib/petsc4py/PETSc.py", > line 4, in > PETSc._initialize() > File "PETSc/PETSc.pyx", line 509, in petsc4py.PETSc._initialize > File "PETSc/PETSc.pyx", line 402, in petsc4py.PETSc.initialize > petsc4py.PETSc.Error: error code 65 > **************************ERROR************************************* > Error building SLEPc. > ******************************************************************** > gmake[2]: *** > [/home/z2yang/repos/petsc/arch-main-debug/lib/petsc/conf/petscrules:29: > slepcbuild] Error 1 > **************************ERROR************************************* > Error during compile, check arch-main-debug/lib/petsc/conf/make.log > Send it and arch-main-debug/lib/petsc/conf/configure.log to > petsc-maint at mcs.anl.gov > ******************************************************************** > Finishing make run at Thu, 17 Nov 2022 21:59:09 +0800 > ``` > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. 
-- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From pierre at joliv.et Thu Nov 17 09:10:30 2022 From: pierre at joliv.et (Pierre Jolivet) Date: Thu, 17 Nov 2022 16:10:30 +0100 Subject: [petsc-users] Build error with slepc: Unable to locate PETSc BAMG dynamic library In-Reply-To: References: Message-ID: <968C504E-19A0-4939-93B5-3610357E1953@joliv.et> That's a fun one. BAMG is built after PETSc and SLEPc. But slepc4py tries to import PETSc before BAMG has been built, so it's choking. Could you please try to change --download-slepc-configure-arguments=--with-slepc4py=1 to --download-slepc-configure-arguments=--with-slepc4py=1\ --have-petsc4py=1 Thanks, Pierre > On 17 Nov 2022, at 3:19 PM, Zongze Yang wrote: > > Hello, I tried to build petsc with slepc. `make` give the following error information. How can I figure out the problem? The configure.log and make.log are attached. > > ``` > *** Building SLEPc *** > Checking environment... done > Checking PETSc installation... done > Processing slepc4py... [0]PETSC ERROR: --------------------- Error Message -------------------------------------------------------------- > [0]PETSC ERROR: Unable to open file > [0]PETSC ERROR: Unable to locate PETSc BAMG dynamic library > You cannot move the dynamic libraries! > [0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting. 
> [0]PETSC ERROR: Petsc Development GIT revision: v3.18.1-291-g89fba64eb45 GIT Date: 2022-11-15 14:31:39 +0000 > [0]PETSC ERROR: Unknown Name on a arch-main-debug named ws6 by z2yang Thu Nov 17 21:59:09 2022 > [0]PETSC ERROR: Configure options PETSC_ARCH=arch-main-debug --download-bamg --download-bison --download-chaco --download-ctetgen --download-egads --download-eigen --download-exodusii --download-fftw --download-hpddm --download-ks --download-libceed --download-metis --download-ml --download-mmg --download-mumps --download-netcdf --download-opencascade --download-p4est --download-parmetis --download-parmmg --download-pnetcdf --download-pragmatic --download-ptscotch --download-scalapack --download-slepc --download-slepc-configure-arguments=--with-slepc4py=1 --download-suitesparse --download-superlu_dist --download-tetgen --download-triangle --download-cmake --download-hdf5 --download-mpich --download-mpi4py --download-slepc --download-zlib --download-libpng --download-muparser --with-petsc4py=1 --with-shared-libraries --with-x=1 --with-x-include="[/home/z2yang/opt/spack/opt/spack/linux-ubuntu22.04-cascadelake/gcc-11.2.0/libx11-1.7.0-5c4ah77x6u7zfm6msg6hbkt23vmwjgkz/include,/home/z2yang/opt/spack/opt/spack/linux-ubuntu22.04-cascadelake/gcc-11.2.0/xproto-7.0.31-z33ate5bew7b7xrpj3pv6nb3towcfimo/include,/home/z2yang/opt/spack/opt/spack/linux-ubuntu22.04-cascadelake/gcc-11.2.0/kbproto-1.0.7-ea2l5e2kp43i2b423eotqxseywjvqis6/include,/home/z2yang/opt/spack/opt/spack/linux-ubuntu22.04-cascadelake/gcc-11.2.0/libxcb-1.14-e2ea2x3zga5xipq5wvcgsw25ilq5yo63/include,/home/z2yang/opt/spack/opt/spack/linux-ubuntu22.04-cascadelake/gcc-11.2.0/libxau-1.0.8-gmwxeffxcbkmxvwawtndhutiwficmxwv/include,/home/z2yang/opt/spack/opt/spack/linux-ubuntu22.04-cascadelake/gcc-11.2.0/libxdmcp-1.1.2-bsggzn5pf6pu5guwbooi3riu5uhaqgee/include]" --with-x-lib="-L/home/z2yang/opt/spack/opt/spack/linux-ubuntu22.04-cascadelake/gcc-11.2.0/libx11-1.7.0-5c4ah77x6u7zfm6msg6hbkt23vmwjgkz/lib -lX11" --force > 
[0]PETSC ERROR: #1 PetscInitialize_DynamicLibraries() at /home/z2yang/repos/petsc/src/sys/dll/reg.c:135 > [0]PETSC ERROR: #2 PetscInitialize_Common() at /home/z2yang/repos/petsc/src/sys/objects/pinit.c:1025 > [0]PETSC ERROR: #3 PetscInitialize() at /home/z2yang/repos/petsc/src/sys/objects/pinit.c:1267 > Traceback (most recent call last): > File "/home/z2yang/repos/petsc/arch-main-debug/externalpackages/git.slepc/./configure", line 11, in > exec(open(os.path.join(os.path.dirname(__file__), 'config', 'configure.py')).read()) > File "", line 215, in > File "/home/z2yang/repos/petsc/arch-main-debug/externalpackages/git.slepc/config/packages/slepc4py.py", line 53, in Process > from petsc4py import PETSc > File "/home/z2yang/repos/petsc/arch-main-debug/lib/petsc4py/PETSc.py", line 4, in > PETSc._initialize() > File "PETSc/PETSc.pyx", line 509, in petsc4py.PETSc._initialize > File "PETSc/PETSc.pyx", line 402, in petsc4py.PETSc.initialize > petsc4py.PETSc.Error: error code 65 > **************************ERROR************************************* > Error building SLEPc. > ******************************************************************** > gmake[2]: *** [/home/z2yang/repos/petsc/arch-main-debug/lib/petsc/conf/petscrules:29: slepcbuild] Error 1 > **************************ERROR************************************* > Error during compile, check arch-main-debug/lib/petsc/conf/make.log > Send it and arch-main-debug/lib/petsc/conf/configure.log to petsc-maint at mcs.anl.gov > ******************************************************************** > Finishing make run at Thu, 17 Nov 2022 21:59:09 +0800 > ``` > -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From balay at mcs.anl.gov Thu Nov 17 09:11:25 2022 From: balay at mcs.anl.gov (Satish Balay) Date: Thu, 17 Nov 2022 09:11:25 -0600 (CST) Subject: [petsc-users] Build error with slepc: Unable to locate PETSc BAMG dynamic library In-Reply-To: References: Message-ID: > --download-bamg --download-slepc --download-slepc-configure-arguments=--with-slepc4py=1 I guess this won't really work, as the order of build should be:
- petsc
- slepc
- bamg
- slepc4py
And it's not easy to do this via configure without hacks. Currently the above build has the order (hence fails):
- petsc
- slepc
- slepc4py
- bamg
I guess the alternative is: build slepc4py separately after petsc/slepc/bamg are built. Satish On Thu, 17 Nov 2022, Zongze Yang wrote: > Hello, I tried to build petsc with slepc. `make` give the following error > information. How can I figure out the problem? The configure.log and > make.log are attached. > > ``` > *** Building SLEPc *** > Checking environment... done > Checking PETSc installation... done > Processing slepc4py... [0]PETSC ERROR: --------------------- Error Message > -------------------------------------------------------------- > [0]PETSC ERROR: Unable to open file > [0]PETSC ERROR: Unable to locate PETSc BAMG dynamic library > You cannot move the dynamic libraries! > [0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting. 
> [0]PETSC ERROR: Petsc Development GIT revision: v3.18.1-291-g89fba64eb45 > GIT Date: 2022-11-15 14:31:39 +0000 > [0]PETSC ERROR: Unknown Name on a arch-main-debug named ws6 by z2yang Thu > Nov 17 21:59:09 2022 > [0]PETSC ERROR: Configure options PETSC_ARCH=arch-main-debug > --download-bamg --download-bison --download-chaco --download-ctetgen > --download-egads --download-eigen --download-exodusii --download-fftw > --download-hpddm --download-ks --download-libceed --download-metis > --download-ml --download-mmg --download-mumps --download-netcdf > --download-opencascade --download-p4est --download-parmetis > --download-parmmg --download-pnetcdf --download-pragmatic > --download-ptscotch --download-scalapack --download-slepc > --download-slepc-configure-arguments=--with-slepc4py=1 > --download-suitesparse --download-superlu_dist --download-tetgen > --download-triangle --download-cmake --download-hdf5 --download-mpich > --download-mpi4py --download-slepc --download-zlib --download-libpng > --download-muparser --with-petsc4py=1 --with-shared-libraries --with-x=1 > --with-x-include="[/home/z2yang/opt/spack/opt/spack/linux-ubuntu22.04-cascadelake/gcc-11.2.0/libx11-1.7.0-5c4ah77x6u7zfm6msg6hbkt23vmwjgkz/include,/home/z2yang/opt/spack/opt/spack/linux-ubuntu22.04-cascadelake/gcc-11.2.0/xproto-7.0.31-z33ate5bew7b7xrpj3pv6nb3towcfimo/include,/home/z2yang/opt/spack/opt/spack/linux-ubuntu22.04-cascadelake/gcc-11.2.0/kbproto-1.0.7-ea2l5e2kp43i2b423eotqxseywjvqis6/include,/home/z2yang/opt/spack/opt/spack/linux-ubuntu22.04-cascadelake/gcc-11.2.0/libxcb-1.14-e2ea2x3zga5xipq5wvcgsw25ilq5yo63/include,/home/z2yang/opt/spack/opt/spack/linux-ubuntu22.04-cascadelake/gcc-11.2.0/libxau-1.0.8-gmwxeffxcbkmxvwawtndhutiwficmxwv/include,/home/z2yang/opt/spack/opt/spack/linux-ubuntu22.04-cascadelake/gcc-11.2.0/libxdmcp-1.1.2-bsggzn5pf6pu5guwbooi3riu5uhaqgee/include]" > 
--with-x-lib="-L/home/z2yang/opt/spack/opt/spack/linux-ubuntu22.04-cascadelake/gcc-11.2.0/libx11-1.7.0-5c4ah77x6u7zfm6msg6hbkt23vmwjgkz/lib > -lX11" --force > [0]PETSC ERROR: #1 PetscInitialize_DynamicLibraries() at > /home/z2yang/repos/petsc/src/sys/dll/reg.c:135 > [0]PETSC ERROR: #2 PetscInitialize_Common() at > /home/z2yang/repos/petsc/src/sys/objects/pinit.c:1025 > [0]PETSC ERROR: #3 PetscInitialize() at > /home/z2yang/repos/petsc/src/sys/objects/pinit.c:1267 > Traceback (most recent call last): > File > "/home/z2yang/repos/petsc/arch-main-debug/externalpackages/git.slepc/./configure", > line 11, in > exec(open(os.path.join(os.path.dirname(__file__), 'config', > 'configure.py')).read()) > File "", line 215, in > File > "/home/z2yang/repos/petsc/arch-main-debug/externalpackages/git.slepc/config/packages/slepc4py.py", > line 53, in Process > from petsc4py import PETSc > File "/home/z2yang/repos/petsc/arch-main-debug/lib/petsc4py/PETSc.py", > line 4, in > PETSc._initialize() > File "PETSc/PETSc.pyx", line 509, in petsc4py.PETSc._initialize > File "PETSc/PETSc.pyx", line 402, in petsc4py.PETSc.initialize > petsc4py.PETSc.Error: error code 65 > **************************ERROR************************************* > Error building SLEPc. 
> ******************************************************************** > gmake[2]: *** > [/home/z2yang/repos/petsc/arch-main-debug/lib/petsc/conf/petscrules:29: > slepcbuild] Error 1 > **************************ERROR************************************* > Error during compile, check arch-main-debug/lib/petsc/conf/make.log > Send it and arch-main-debug/lib/petsc/conf/configure.log to > petsc-maint at mcs.anl.gov > ******************************************************************** > Finishing make run at Thu, 17 Nov 2022 21:59:09 +0800 > ``` > From knepley at gmail.com Thu Nov 17 09:16:01 2022 From: knepley at gmail.com (Matthew Knepley) Date: Thu, 17 Nov 2022 10:08:51 -0500 Subject: [petsc-users] Different solution while running in parallel In-Reply-To: References: Message-ID: Using options instead of code will make your life much easier. Two things are wrong here: 1) Your solver is doing no iterates because the initial residual is very small, 5.493080158227e-15. The LU does not matter. In order to check the condition number of your system, run with -pc_type svd -pc_svd_monitor 2) Your parallel run also does no iterates: 0 KSP none resid norm 6.951601853367e-310 true resid norm 1.058300524426e+01 ||r(i)||/||b|| 8.819171036882e-01 but the true residual is not small. That means that your system is singular, but you have given a consistent RHS. Thanks, Matt On Thu, Nov 17, 2022 at 9:37 AM Karthikeyan Chockalingam - STFC UKRI < karthikeyan.chockalingam at stfc.ac.uk> wrote: > Hi Matt and Hong, > > > > Thank you for your response. 
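[Editor's note, not part of the original thread.] Matt's diagnosis above (a singular operator for which a direct solve can still produce an answer) can be made concrete with a small NumPy sketch; the 2x2 matrix below is purely illustrative and is not the poster's actual system:

```python
import numpy as np

# A rank-1 (singular) matrix: Ax = b has either no solution or infinitely many.
A = np.array([[1.0, 1.0],
              [1.0, 1.0]])
b = np.array([2.0, 2.0])  # consistent: b lies in the range of A

# Two distinct vectors that both satisfy Ax = b exactly; they differ by an
# element of the null space of A, which is spanned by (1, -1).
x1 = np.array([2.0, 0.0])
x2 = np.array([0.0, 2.0])

print(np.allclose(A @ x1, b), np.allclose(A @ x2, b))  # True True
print(np.linalg.svd(A, compute_uv=False))              # [2. 0.] -> zero singular value
```

Because any x1 + t*(1, -1) solves the system, a solver is free to return different members of this family on different process counts, which matches the symptom reported in this thread. The `-pc_type svd -pc_svd_monitor` options Matt suggests expose exactly this kind of (numerically) zero smallest singular value.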
> > I made the following changes, to get the desired output > > > > PetscReal norm; /* norm of solution error */ > > PetscInt its; > > KSPConvergedReason reason; > > PetscViewerAndFormat *vf; > > PetscViewerAndFormatCreate(PETSC_VIEWER_STDOUT_WORLD, > PETSC_VIEWER_DEFAULT, &vf); > > > > ierr = KSPView(ksp, PETSC_VIEWER_STDOUT_SELF);CHKERRQ(ierr); > > > > KSPSolve(ksp, b, x); > > > > ierr = KSPMonitorTrueResidual(ksp,its,norm,vf);CHKERRQ(ierr); > > ierr = KSPMonitorSingularValue(ksp, its, norm, vf);CHKERRQ(ierr); > > > > I have attached the outputs from both the runs. As before, I am also > printing A, b, and x. > > > > I wonder if it is a memory issue related to mpi library employed. I am > currently using openmpi ? should I instead use mpich? > > > > Kind regards, > > Karthik. > > > > *From: *Matthew Knepley > *Date: *Thursday, 17 November 2022 at 12:19 > *To: *Zhang, Hong > *Cc: *petsc-users at mcs.anl.gov , Chockalingam, > Karthikeyan (STFC,DL,HC) > *Subject: *Re: [petsc-users] Different solution while running in parallel > > On Wed, Nov 16, 2022 at 9:07 PM Zhang, Hong via petsc-users < > petsc-users at mcs.anl.gov> wrote: > > Karhik, > > Can you find out the condition number of your matrix? > > > > Also, run using > > > > -ksp_view -ksp_monitor_true_residual -ksp_converged_reason > > > > and send the two outputs. > > > > Thanks, > > > > Matt > > > > Hong > > > ------------------------------ > > *From:* petsc-users on behalf of > Karthikeyan Chockalingam - STFC UKRI via petsc-users < > petsc-users at mcs.anl.gov> > *Sent:* Wednesday, November 16, 2022 6:04 PM > *To:* petsc-users at mcs.anl.gov > *Subject:* [petsc-users] Different solution while running in parallel > > > > Hello, > > > > I tried to solve a (FE discretized) Poisson equation using PCLU. For > some reason I am getting different solutions while running the problem on > one and two cores. I have attached the output file (out.txt) from both the > runs. I am printing A, b and x from both the runs ? 
while A and b are the > same but the solution seems is different. > > > > I am not sure what I doing wrong. > > > > Below is my matrix, vector, and solve setup. > > > > > > Mat A; > > Vec b, x; > > > > ierr = MatCreate(PETSC_COMM_WORLD, &A); CHKERRQ(ierr); > > ierr = MatSetType(A, MATMPIAIJ); CHKERRQ(ierr); > > ierr = MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, N, N); CHKERRQ > (ierr); > > ierr = MatMPIAIJSetPreallocation(A,d_nz, *NULL*, o_nz, *NULL*); > CHKERRQ(ierr); > > ierr = MatSetOption(A,MAT_SYMMETRIC,PETSC_TRUE); CHKERRQ(ierr); > > ierr = MatCreateVecs(A, &b, &x); CHKERRQ(ierr); > > > > KSP ksp; > > PC pc; > > KSPCreate(PETSC_COMM_WORLD, &ksp); > > KSPSetOperators(ksp, A, A); > > ierr = KSPSetType(ksp,KSPPREONLY);CHKERRQ(ierr); > > ierr = KSPGetPC(ksp,&pc);CHKERRQ(ierr); > > ierr = PCSetType(pc,PCLU);CHKERRQ(ierr); > > ierr = PCFactorSetMatSolverType(pc,MATSOLVERMUMPS);CHKERRQ(ierr); > > KSPSolve(ksp, b, x); > > > > Thank you for your help. > > > > Karhik. > > > > This email and any attachments are intended solely for the use of the > named recipients. If you are not the intended recipient you must not use, > disclose, copy or distribute this email or any of its attachments and > should notify the sender immediately and delete this email from your > system. UK Research and Innovation (UKRI) has taken every reasonable > precaution to minimise risk of this email or any attachments containing > viruses or malware but the recipient should carry out its own virus and > malware checks before opening the attachments. UKRI does not accept any > liability for any losses or damages which the recipient may sustain due to > presence of any viruses. > > > > > -- > > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. 
> -- Norbert Wiener > > https://www.cse.buffalo.edu/~knepley/ > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From balay at mcs.anl.gov Thu Nov 17 09:23:07 2022 From: balay at mcs.anl.gov (Satish Balay) Date: Thu, 17 Nov 2022 09:23:07 -0600 (CST) Subject: [petsc-users] Build error with slepc: Unable to locate PETSc BAMG dynamic library In-Reply-To: <968C504E-19A0-4939-93B5-3610357E1953@joliv.et> References: <968C504E-19A0-4939-93B5-3610357E1953@joliv.et> Message-ID: <9f327c6a-8136-ac92-40fc-d04e7a43dbe8@mcs.anl.gov> Ah - so --have-petsc4py=1 disables this check in slepc? Ok - tried that - and it works for me. [balay at pj01 petsc]$ ./configure --download-bamg --download-slepc --download-petsc4py --download-slepc-configure-arguments="--with-slepc4py=1 --have-petsc4py=1" Satish On Thu, 17 Nov 2022, Pierre Jolivet wrote: > That's a fun one. > BAMG is built after PETSc and SLEPc. > But slepc4py tries to import PETSc before BAMG has been built, so it's choking. > Could you please try to change --download-slepc-configure-arguments=--with-slepc4py=1 to --download-slepc-configure-arguments=--with-slepc4py=1\ --have-petsc4py=1 > > Thanks, > Pierre > > > On 17 Nov 2022, at 3:19 PM, Zongze Yang wrote: > > > > Hello, I tried to build petsc with slepc. `make` give the following error information. How can I figure out the problem? The configure.log and make.log are attached. > > > > ``` > > *** Building SLEPc *** > > Checking environment... done > > Checking PETSc installation... done > > Processing slepc4py... 
[0]PETSC ERROR: --------------------- Error Message -------------------------------------------------------------- > > [0]PETSC ERROR: Unable to open file > > [0]PETSC ERROR: Unable to locate PETSc BAMG dynamic library > > You cannot move the dynamic libraries! > > [0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting. > > [0]PETSC ERROR: Petsc Development GIT revision: v3.18.1-291-g89fba64eb45 GIT Date: 2022-11-15 14:31:39 +0000 > > [0]PETSC ERROR: Unknown Name on a arch-main-debug named ws6 by z2yang Thu Nov 17 21:59:09 2022 > > [0]PETSC ERROR: Configure options PETSC_ARCH=arch-main-debug --download-bamg --download-bison --download-chaco --download-ctetgen --download-egads --download-eigen --download-exodusii --download-fftw --download-hpddm --download-ks --download-libceed --download-metis --download-ml --download-mmg --download-mumps --download-netcdf --download-opencascade --download-p4est --download-parmetis --download-parmmg --download-pnetcdf --download-pragmatic --download-ptscotch --download-scalapack --download-slepc --download-slepc-configure-arguments=--with-slepc4py=1 --download-suitesparse --download-superlu_dist --download-tetgen --download-triangle --download-cmake --download-hdf5 --download-mpich --download-mpi4py --download-slepc --download-zlib --download-libpng --download-muparser --with-petsc4py=1 --with-shared-libraries --with-x=1 --with-x-include="[/home/z2yang/opt/spack/opt/spack/linux-ubuntu22.04-cascadelake/gcc-11.2.0/libx11-1.7.0-5c4ah77x6u7zfm6msg6hbkt23vmwjgkz/include, /home/z2 
yang/opt/spack/opt/spack/linux-ubuntu22.04-cascadelake/gcc-11.2.0/xproto-7.0.31-z33ate5bew7b7xrpj3pv6nb3towcfimo/include,/home/z2yang/opt/spack/opt/spack/linux-ubuntu22.04-cascadelake/gcc-11.2.0/kbproto-1.0.7-ea2l5e2kp43i2b423eotqxseywjvqis6/include,/home/z2yang/opt/spack/opt/spack/linux-ubuntu22.04-cascadelake/gcc-11.2.0/libxcb-1.14-e2ea2x3zga5xipq5wvcgsw25ilq5yo63/include,/home/z2yang/opt/spack/opt/spack/linux-ubuntu22.04-cascadelake/gcc-11.2.0/libxau-1.0.8-gmwxeffxcbkmxvwawtndhutiwficmxwv/include,/home/z2yang/opt/spack/opt/spack/linux-ubuntu22.04-cascadelake/gcc-11.2.0/libxdmcp-1.1.2-bsggzn5pf6pu5guwbooi3riu5uhaqgee/include]" --with-x-lib="-L/home/z2yang/opt/spack/opt/spack/linux-ubuntu22.04-cascadelake/gcc-11.2.0/libx11-1.7.0-5c4ah77x6u7zfm6msg6hbkt23vmwjgkz/lib -lX11" --force > > [0]PETSC ERROR: #1 PetscInitialize_DynamicLibraries() at /home/z2yang/repos/petsc/src/sys/dll/reg.c:135 > > [0]PETSC ERROR: #2 PetscInitialize_Common() at /home/z2yang/repos/petsc/src/sys/objects/pinit.c:1025 > > [0]PETSC ERROR: #3 PetscInitialize() at /home/z2yang/repos/petsc/src/sys/objects/pinit.c:1267 > > Traceback (most recent call last): > > File "/home/z2yang/repos/petsc/arch-main-debug/externalpackages/git.slepc/./configure", line 11, in > > exec(open(os.path.join(os.path.dirname(__file__), 'config', 'configure.py')).read()) > > File "", line 215, in > > File "/home/z2yang/repos/petsc/arch-main-debug/externalpackages/git.slepc/config/packages/slepc4py.py", line 53, in Process > > from petsc4py import PETSc > > File "/home/z2yang/repos/petsc/arch-main-debug/lib/petsc4py/PETSc.py", line 4, in > > PETSc._initialize() > > File "PETSc/PETSc.pyx", line 509, in petsc4py.PETSc._initialize > > File "PETSc/PETSc.pyx", line 402, in petsc4py.PETSc.initialize > > petsc4py.PETSc.Error: error code 65 > > **************************ERROR************************************* > > Error building SLEPc. 
> > ******************************************************************** > > gmake[2]: *** [/home/z2yang/repos/petsc/arch-main-debug/lib/petsc/conf/petscrules:29: slepcbuild] Error 1 > > **************************ERROR************************************* > > Error during compile, check arch-main-debug/lib/petsc/conf/make.log > > Send it and arch-main-debug/lib/petsc/conf/configure.log to petsc-maint at mcs.anl.gov > > ******************************************************************** > > Finishing make run at Thu, 17 Nov 2022 21:59:09 +0800 > > ``` > > > > From pierre at joliv.et Thu Nov 17 09:23:39 2022 From: pierre at joliv.et (Pierre Jolivet) Date: Thu, 17 Nov 2022 16:23:39 +0100 Subject: [petsc-users] Build error with slepc: Unable to locate PETSc BAMG dynamic library In-Reply-To: References: Message-ID: > On 17 Nov 2022, at 4:11 PM, Satish Balay via petsc-users wrote: > >> --download-bamg --download-slepc --download-slepc-configure-arguments=--with-slepc4py=1 > > I guess this won't really work It does work. Just tried ./configure --download-slepc --download-bamg --with-petsc4py '--download-slepc-configure-arguments=--with-slepc4py=1 --have-petsc4py=1' --with-fc=0 No issue whatsoever. Matt, you should probably force that flag (--have-petsc4py=1) in bamg.py (and you should change '+carg+'./configure to '+carg+self.python.pyexe+' ./configure as in slepc.py) Thanks, Pierre > as the order of build should be: > - petsc > - slepc > - bamg > - slepc4py > > And it's not easy to do this via configure without hacks. Currently the above build has the order (hence fails): > - petsc > - slepc > - slepc4py > - bamg > > I guess the alternative is: build slepc4py separately after petsc/slepc/bamg are built. > > Satish > > On Thu, 17 Nov 2022, Zongze Yang wrote: > >> Hello, I tried to build petsc with slepc. `make` gives the following > error >> information. How can I figure out the problem? The configure.log and >> make.log are attached. 
>> >> ``` >> *** Building SLEPc *** >> Checking environment... done >> Checking PETSc installation... done >> Processing slepc4py... [0]PETSC ERROR: --------------------- Error Message >> -------------------------------------------------------------- >> [0]PETSC ERROR: Unable to open file >> [0]PETSC ERROR: Unable to locate PETSc BAMG dynamic library >> You cannot move the dynamic libraries! >> [0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting. >> [0]PETSC ERROR: Petsc Development GIT revision: v3.18.1-291-g89fba64eb45 >> GIT Date: 2022-11-15 14:31:39 +0000 >> [0]PETSC ERROR: Unknown Name on a arch-main-debug named ws6 by z2yang Thu >> Nov 17 21:59:09 2022 >> [0]PETSC ERROR: Configure options PETSC_ARCH=arch-main-debug >> --download-bamg --download-bison --download-chaco --download-ctetgen >> --download-egads --download-eigen --download-exodusii --download-fftw >> --download-hpddm --download-ks --download-libceed --download-metis >> --download-ml --download-mmg --download-mumps --download-netcdf >> --download-opencascade --download-p4est --download-parmetis >> --download-parmmg --download-pnetcdf --download-pragmatic >> --download-ptscotch --download-scalapack --download-slepc >> --download-slepc-configure-arguments=--with-slepc4py=1 >> --download-suitesparse --download-superlu_dist --download-tetgen >> --download-triangle --download-cmake --download-hdf5 --download-mpich >> --download-mpi4py --download-slepc --download-zlib --download-libpng >> --download-muparser --with-petsc4py=1 --with-shared-libraries --with-x=1 >> 
--with-x-include="[/home/z2yang/opt/spack/opt/spack/linux-ubuntu22.04-cascadelake/gcc-11.2.0/libx11-1.7.0-5c4ah77x6u7zfm6msg6hbkt23vmwjgkz/include,/home/z2yang/opt/spack/opt/spack/linux-ubuntu22.04-cascadelake/gcc-11.2.0/xproto-7.0.31-z33ate5bew7b7xrpj3pv6nb3towcfimo/include,/home/z2yang/opt/spack/opt/spack/linux-ubuntu22.04-cascadelake/gcc-11.2.0/kbproto-1.0.7-ea2l5e2kp43i2b423eotqxseywjvqis6/include,/home/z2yang/opt/spack/opt/spack/linux-ubuntu22.04-cascadelake/gcc-11.2.0/libxcb-1.14-e2ea2x3zga5xipq5wvcgsw25ilq5yo63/include,/home/z2yang/opt/spack/opt/spack/linux-ubuntu22.04-cascadelake/gcc-11.2.0/libxau-1.0.8-gmwxeffxcbkmxvwawtndhutiwficmxwv/include,/home/z2yang/opt/spack/opt/spack/linux-ubuntu22.04-cascadelake/gcc-11.2.0/libxdmcp-1.1.2-bsggzn5pf6pu5guwbooi3riu5uhaqgee/include]" >> --with-x-lib="-L/home/z2yang/opt/spack/opt/spack/linux-ubuntu22.04-cascadelake/gcc-11.2.0/libx11-1.7.0-5c4ah77x6u7zfm6msg6hbkt23vmwjgkz/lib >> -lX11" --force >> [0]PETSC ERROR: #1 PetscInitialize_DynamicLibraries() at >> /home/z2yang/repos/petsc/src/sys/dll/reg.c:135 >> [0]PETSC ERROR: #2 PetscInitialize_Common() at >> /home/z2yang/repos/petsc/src/sys/objects/pinit.c:1025 >> [0]PETSC ERROR: #3 PetscInitialize() at >> /home/z2yang/repos/petsc/src/sys/objects/pinit.c:1267 >> Traceback (most recent call last): >> File >> "/home/z2yang/repos/petsc/arch-main-debug/externalpackages/git.slepc/./configure", >> line 11, in >> exec(open(os.path.join(os.path.dirname(__file__), 'config', >> 'configure.py')).read()) >> File "", line 215, in >> File >> "/home/z2yang/repos/petsc/arch-main-debug/externalpackages/git.slepc/config/packages/slepc4py.py", >> line 53, in Process >> from petsc4py import PETSc >> File "/home/z2yang/repos/petsc/arch-main-debug/lib/petsc4py/PETSc.py", >> line 4, in >> PETSc._initialize() >> File "PETSc/PETSc.pyx", line 509, in petsc4py.PETSc._initialize >> File "PETSc/PETSc.pyx", line 402, in petsc4py.PETSc.initialize >> petsc4py.PETSc.Error: error code 65 >> 
**************************ERROR************************************* >> Error building SLEPc. >> ******************************************************************** >> gmake[2]: *** >> [/home/z2yang/repos/petsc/arch-main-debug/lib/petsc/conf/petscrules:29: >> slepcbuild] Error 1 >> **************************ERROR************************************* >> Error during compile, check arch-main-debug/lib/petsc/conf/make.log >> Send it and arch-main-debug/lib/petsc/conf/configure.log to >> petsc-maint at mcs.anl.gov >> ******************************************************************** >> Finishing make run at Thu, 17 Nov 2022 21:59:09 +0800 >> ``` >> > From yangzongze at gmail.com Thu Nov 17 09:54:16 2022 From: yangzongze at gmail.com (Zongze Yang) Date: Thu, 17 Nov 2022 23:54:16 +0800 Subject: [petsc-users] Build error with slepc: Unable to locate PETSc BAMG dynamic library In-Reply-To: References: Message-ID: Thanks for all the suggestions. Will try to build with ` --download-slepc-configure-arguments='--with-slepc4py=1 --have-petsc4py=1'`. Thanks, Zongze Pierre Jolivet wrote on Thu, 17 Nov 2022 at 23:23: > > > > On 17 Nov 2022, at 4:11 PM, Satish Balay via petsc-users < > petsc-users at mcs.anl.gov> wrote: > > > >> --download-bamg --download-slepc > --download-slepc-configure-arguments=--with-slepc4py=1 > > > > I guess this won't really work > > It does work. > Just tried ./configure --download-slepc --download-bamg --with-petsc4py > '--download-slepc-configure-arguments=--with-slepc4py=1 --have-petsc4py=1' > --with-fc=0 > No issue whatsoever. > Matt, you should probably force that flag (--have-petsc4py=1) in bamg.py > (and you should change '+carg+'./configure to '+carg+self.python.pyexe+' > ./configure as in slepc.py) > > Thanks, > Pierre > > > as the order of build should be: > > - petsc > > - slepc > > - bamg > > - slepc4py > > > > And it's not easy to do this via configure without hacks. 
Currently the > above build has the order (hence fails): > > - petsc > > - slepc > > - slepc4py > > - bamg > > > > I guess the alternative is: build slepc4py separately after > petsc/slepc/bamg are built. > > > > Satish > > > > On Thu, 17 Nov 2022, Zongze Yang wrote: > > > >> Hello, I tried to build petsc with slepc. `make` give the following > error > >> information. How can I figure out the problem? The configure.log and > >> make.log are attached. > >> > >> ``` > >> *** Building SLEPc *** > >> Checking environment... done > >> Checking PETSc installation... done > >> Processing slepc4py... [0]PETSC ERROR: --------------------- Error > Message > >> -------------------------------------------------------------- > >> [0]PETSC ERROR: Unable to open file > >> [0]PETSC ERROR: Unable to locate PETSc BAMG dynamic library > >> You cannot move the dynamic libraries! > >> [0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble > shooting. > >> [0]PETSC ERROR: Petsc Development GIT revision: v3.18.1-291-g89fba64eb45 > >> GIT Date: 2022-11-15 14:31:39 +0000 > >> [0]PETSC ERROR: Unknown Name on a arch-main-debug named ws6 by z2yang > Thu > >> Nov 17 21:59:09 2022 > >> [0]PETSC ERROR: Configure options PETSC_ARCH=arch-main-debug > >> --download-bamg --download-bison --download-chaco --download-ctetgen > >> --download-egads --download-eigen --download-exodusii --download-fftw > >> --download-hpddm --download-ks --download-libceed --download-metis > >> --download-ml --download-mmg --download-mumps --download-netcdf > >> --download-opencascade --download-p4est --download-parmetis > >> --download-parmmg --download-pnetcdf --download-pragmatic > >> --download-ptscotch --download-scalapack --download-slepc > >> --download-slepc-configure-arguments=--with-slepc4py=1 > >> --download-suitesparse --download-superlu_dist --download-tetgen > >> --download-triangle --download-cmake --download-hdf5 --download-mpich > >> --download-mpi4py --download-slepc --download-zlib 
--download-libpng > >> --download-muparser --with-petsc4py=1 --with-shared-libraries --with-x=1 > >> > --with-x-include="[/home/z2yang/opt/spack/opt/spack/linux-ubuntu22.04-cascadelake/gcc-11.2.0/libx11-1.7.0-5c4ah77x6u7zfm6msg6hbkt23vmwjgkz/include,/home/z2yang/opt/spack/opt/spack/linux-ubuntu22.04-cascadelake/gcc-11.2.0/xproto-7.0.31-z33ate5bew7b7xrpj3pv6nb3towcfimo/include,/home/z2yang/opt/spack/opt/spack/linux-ubuntu22.04-cascadelake/gcc-11.2.0/kbproto-1.0.7-ea2l5e2kp43i2b423eotqxseywjvqis6/include,/home/z2yang/opt/spack/opt/spack/linux-ubuntu22.04-cascadelake/gcc-11.2.0/libxcb-1.14-e2ea2x3zga5xipq5wvcgsw25ilq5yo63/include,/home/z2yang/opt/spack/opt/spack/linux-ubuntu22.04-cascadelake/gcc-11.2.0/libxau-1.0.8-gmwxeffxcbkmxvwawtndhutiwficmxwv/include,/home/z2yang/opt/spack/opt/spack/linux-ubuntu22.04-cascadelake/gcc-11.2.0/libxdmcp-1.1.2-bsggzn5pf6pu5guwbooi3riu5uhaqgee/include]" > >> > --with-x-lib="-L/home/z2yang/opt/spack/opt/spack/linux-ubuntu22.04-cascadelake/gcc-11.2.0/libx11-1.7.0-5c4ah77x6u7zfm6msg6hbkt23vmwjgkz/lib > >> -lX11" --force > >> [0]PETSC ERROR: #1 PetscInitialize_DynamicLibraries() at > >> /home/z2yang/repos/petsc/src/sys/dll/reg.c:135 > >> [0]PETSC ERROR: #2 PetscInitialize_Common() at > >> /home/z2yang/repos/petsc/src/sys/objects/pinit.c:1025 > >> [0]PETSC ERROR: #3 PetscInitialize() at > >> /home/z2yang/repos/petsc/src/sys/objects/pinit.c:1267 > >> Traceback (most recent call last): > >> File > >> > "/home/z2yang/repos/petsc/arch-main-debug/externalpackages/git.slepc/./configure", > >> line 11, in > >> exec(open(os.path.join(os.path.dirname(__file__), 'config', > >> 'configure.py')).read()) > >> File "", line 215, in > >> File > >> > "/home/z2yang/repos/petsc/arch-main-debug/externalpackages/git.slepc/config/packages/slepc4py.py", > >> line 53, in Process > >> from petsc4py import PETSc > >> File "/home/z2yang/repos/petsc/arch-main-debug/lib/petsc4py/PETSc.py", > >> line 4, in > >> PETSc._initialize() > >> File "PETSc/PETSc.pyx", line 509, 
in petsc4py.PETSc._initialize > >> File "PETSc/PETSc.pyx", line 402, in petsc4py.PETSc.initialize > >> petsc4py.PETSc.Error: error code 65 > >> **************************ERROR************************************* > >> Error building SLEPc. > >> ******************************************************************** > >> gmake[2]: *** > >> [/home/z2yang/repos/petsc/arch-main-debug/lib/petsc/conf/petscrules:29: > >> slepcbuild] Error 1 > >> **************************ERROR************************************* > >> Error during compile, check arch-main-debug/lib/petsc/conf/make.log > >> Send it and arch-main-debug/lib/petsc/conf/configure.log to > >> petsc-maint at mcs.anl.gov > >> ******************************************************************** > >> Finishing make run at Thu, 17 Nov 2022 21:59:09 +0800 > >> ``` > >> > > > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From karthikeyan.chockalingam at stfc.ac.uk Thu Nov 17 10:13:25 2022 From: karthikeyan.chockalingam at stfc.ac.uk (Karthikeyan Chockalingam - STFC UKRI) Date: Thu, 17 Nov 2022 16:13:25 +0000 Subject: [petsc-users] Different solution while running in parallel In-Reply-To: References: Message-ID: Hi Matt, I tested two sizes manually for the Poisson problem with homogenous Dirichlet boundary conditions (on all nodes on the boundary) and they both produced the right result when run serially using PCLU 1. 2 elements x 2 elements (total nodes 9 but 1 dof) A = 10.6667 b = 4 x = 0.375 2. 3 elements x 3 elements (total nodes 16 but 4 dof) A = 10.6667 -1.33333 -1.33333 -1.33333 -1.33333 10.6667 -1.33333 -1.33333 -1.33333 -1.33333 10.6667 -1.33333 -1.33333 -1.33333 -1.33333 10.6667 b = {4 4 4 4}^T x = (0.6 0.6 0.6 0.6) Since it is solvable, I am not sure how the system can be singular? I have attached the runs for case (2) run on one and two cores. The parallel run produces a zero vector for x. 
I used MatZeroRowsColumns to set the Dirichlet boundary conditions by zeroing the entries in the matrix corresponding to the boundary nodes. Best, Karthik. From: Matthew Knepley Date: Thursday, 17 November 2022 at 15:16 To: Chockalingam, Karthikeyan (STFC,DL,HC) Cc: Zhang, Hong , petsc-users at mcs.anl.gov Subject: Re: [petsc-users] Different solution while running in parallel Using options instead of code will make your life much easier. Two things are wrong here: 1) Your solver is doing no iterates because the initial residual is very small, 5.493080158227e-15. The LU does not matter. In order to check the condition number of your system, run with -pc_type svd -pc_svd_monitor 2) Your parallel run also does no iterates 0 KSP none resid norm 6.951601853367e-310 true resid norm 1.058300524426e+01 ||r(i)||/||b|| 8.819171036882e-01 but the true residual is not small. That means that your system is singular, but you have given a consistent RHS. Thanks, Matt On Thu, Nov 17, 2022 at 9:37 AM Karthikeyan Chockalingam - STFC UKRI > wrote: Hi Matt and Hong, Thank you for your response. I made the following changes to get the desired output PetscReal norm; /* norm of solution error */ PetscInt its; KSPConvergedReason reason; PetscViewerAndFormat *vf; PetscViewerAndFormatCreate(PETSC_VIEWER_STDOUT_WORLD, PETSC_VIEWER_DEFAULT, &vf); ierr = KSPView(ksp, PETSC_VIEWER_STDOUT_SELF);CHKERRQ(ierr); KSPSolve(ksp, b, x); ierr = KSPMonitorTrueResidual(ksp,its,norm,vf);CHKERRQ(ierr); ierr = KSPMonitorSingularValue(ksp, its, norm, vf);CHKERRQ(ierr); I have attached the outputs from both the runs. As before, I am also printing A, b, and x. I wonder if it is a memory issue related to the MPI library employed. I am currently using openmpi - should I instead use mpich? Kind regards, Karthik. 
From: Matthew Knepley > Date: Thursday, 17 November 2022 at 12:19 To: Zhang, Hong > Cc: petsc-users at mcs.anl.gov >, Chockalingam, Karthikeyan (STFC,DL,HC) > Subject: Re: [petsc-users] Different solution while running in parallel On Wed, Nov 16, 2022 at 9:07 PM Zhang, Hong via petsc-users > wrote: Karthik, Can you find out the condition number of your matrix? Also, run using -ksp_view -ksp_monitor_true_residual -ksp_converged_reason and send the two outputs. Thanks, Matt Hong ________________________________ From: petsc-users > on behalf of Karthikeyan Chockalingam - STFC UKRI via petsc-users > Sent: Wednesday, November 16, 2022 6:04 PM To: petsc-users at mcs.anl.gov > Subject: [petsc-users] Different solution while running in parallel Hello, I tried to solve a (FE discretized) Poisson equation using PCLU. For some reason I am getting different solutions while running the problem on one and two cores. I have attached the output file (out.txt) from both the runs. I am printing A, b and x from both the runs - while A and b are the same but the solution seems different. I am not sure what I am doing wrong. Below is my matrix, vector, and solve setup. Mat A; Vec b, x; ierr = MatCreate(PETSC_COMM_WORLD, &A); CHKERRQ(ierr); ierr = MatSetType(A, MATMPIAIJ); CHKERRQ(ierr); ierr = MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, N, N); CHKERRQ(ierr); ierr = MatMPIAIJSetPreallocation(A,d_nz, NULL, o_nz, NULL); CHKERRQ(ierr); ierr = MatSetOption(A,MAT_SYMMETRIC,PETSC_TRUE); CHKERRQ(ierr); ierr = MatCreateVecs(A, &b, &x); CHKERRQ(ierr); KSP ksp; PC pc; KSPCreate(PETSC_COMM_WORLD, &ksp); KSPSetOperators(ksp, A, A); ierr = KSPSetType(ksp,KSPPREONLY);CHKERRQ(ierr); ierr = KSPGetPC(ksp,&pc);CHKERRQ(ierr); ierr = PCSetType(pc,PCLU);CHKERRQ(ierr); ierr = PCFactorSetMatSolverType(pc,MATSOLVERMUMPS);CHKERRQ(ierr); KSPSolve(ksp, b, x); Thank you for your help. Karthik. This email and any attachments are intended solely for the use of the named recipients. 
If you are not the intended recipient you must not use, disclose, copy or distribute this email or any of its attachments and should notify the sender immediately and delete this email from your system. UK Research and Innovation (UKRI) has taken every reasonable precaution to minimise risk of this email or any attachments containing viruses or malware but the recipient should carry out its own virus and malware checks before opening the attachments. UKRI does not accept any liability for any losses or damages which the recipient may sustain due to presence of any viruses. -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- An embedded and charset-unspecified text was scrubbed... Name: 2_MPI_Size4x4.txt URL: -------------- next part -------------- An embedded and charset-unspecified text was scrubbed... Name: 1_MPI_Size4x4.txt URL: From knepley at gmail.com Thu Nov 17 11:48:16 2022 From: knepley at gmail.com (Matthew Knepley) Date: Thu, 17 Nov 2022 12:48:16 -0500 Subject: [petsc-users] Different solution while running in parallel In-Reply-To: References: Message-ID: On Thu, Nov 17, 2022 at 11:13 AM Karthikeyan Chockalingam - STFC UKRI < karthikeyan.chockalingam at stfc.ac.uk> wrote: > Hi Matt, > > > > I tested two sizes manually for the Poisson problem with homogenous > Dirichlet boundary conditions (on all nodes on the boundary) and they both > produced the right result when run serially using PCLU > > > > 1. 
2 elements x 2 elements (total nodes 9 but 1 dof) > > A = 10.6667 b = 4 x = 0.375 > > 2. 3 elements x 3 elements (total nodes 16 but 4 dof) > > A = 10.6667 -1.33333 -1.33333 -1.33333 > > -1.33333 10.6667 -1.33333 -1.33333 > > -1.33333 -1.33333 10.6667 -1.33333 > > -1.33333 -1.33333 -1.33333 10.6667 > > > > b = {4 4 4 4}^T > > x = (0.6 0.6 0.6 0.6) > > > > Since it is solvable, I am not sure how the system can be singular? > > > > I have attached the runs for case (2) run on one and two cores. Parallel > run produces zero vector for x. > > > > I used MatZeroRowsColumns to set the Dirichlet boundary conditions by > zeroing the entries in the matrix corresponding to the boundary nodes. > > > Please please please run the original thing with the options I suggested: -pc_type svd -pc_svd_monitor This will print out all the singular values of the matrix and solve it using SVD. Thanks, Matt > Best, > > Karthik. > > > > > > > > *From: *Matthew Knepley > *Date: *Thursday, 17 November 2022 at 15:16 > *To: *Chockalingam, Karthikeyan (STFC,DL,HC) < > karthikeyan.chockalingam at stfc.ac.uk> > *Cc: *Zhang, Hong , petsc-users at mcs.anl.gov < > petsc-users at mcs.anl.gov> > *Subject: *Re: [petsc-users] Different solution while running in parallel > Using options instead of code will make your life much easier. > > > > Two things are wrong here: > > > > 1) Your solver is doing no iterates because the initial residual is very > small, 5.493080158227e-15. The LU does not matter. > > In order to check the condition number of your system, run with > -pc_type svd -pc_svd_monitor > > > > 2) Your parallel run also does no iterates > > > > 0 KSP none resid norm 6.951601853367e-310 true resid norm > 1.058300524426e+01 ||r(i)||/||b|| 8.819171036882e-01 > > > > but the true residual is not small. That means that your system is > singular, but you have given a consistent RHS. 
> > > > Thanks, > > Matt > > On Thu, Nov 17, 2022 at 9:37 AM Karthikeyan Chockalingam - STFC UKRI < > karthikeyan.chockalingam at stfc.ac.uk> wrote: > > Hi Matt and Hong, > > > > Thank you for your response. > > I made the following changes to get the desired output > > > > PetscReal norm; /* norm of solution error */ > > PetscInt its; > > KSPConvergedReason reason; > > PetscViewerAndFormat *vf; > > PetscViewerAndFormatCreate(PETSC_VIEWER_STDOUT_WORLD, > PETSC_VIEWER_DEFAULT, &vf); > > > > ierr = KSPView(ksp, PETSC_VIEWER_STDOUT_SELF);CHKERRQ(ierr); > > > > KSPSolve(ksp, b, x); > > > > ierr = KSPMonitorTrueResidual(ksp,its,norm,vf);CHKERRQ(ierr); > > ierr = KSPMonitorSingularValue(ksp, its, norm, vf);CHKERRQ(ierr); > > > > I have attached the outputs from both the runs. As before, I am also > printing A, b, and x. > > > > I wonder if it is a memory issue related to the MPI library employed. I am > currently using openmpi - should I instead use mpich? > > > > Kind regards, > > Karthik. > > > > *From: *Matthew Knepley > *Date: *Thursday, 17 November 2022 at 12:19 > *To: *Zhang, Hong > *Cc: *petsc-users at mcs.anl.gov , Chockalingam, > Karthikeyan (STFC,DL,HC) > *Subject: *Re: [petsc-users] Different solution while running in parallel > > On Wed, Nov 16, 2022 at 9:07 PM Zhang, Hong via petsc-users < > petsc-users at mcs.anl.gov> wrote: > > Karthik, > > Can you find out the condition number of your matrix? > > > > Also, run using > > > > -ksp_view -ksp_monitor_true_residual -ksp_converged_reason > > > > and send the two outputs. > > > > Thanks, > > > > Matt > > > > Hong > > > ------------------------------ > > *From:* petsc-users on behalf of > Karthikeyan Chockalingam - STFC UKRI via petsc-users < > petsc-users at mcs.anl.gov> > *Sent:* Wednesday, November 16, 2022 6:04 PM > *To:* petsc-users at mcs.anl.gov > *Subject:* [petsc-users] Different solution while running in parallel > > > > Hello, > > > > I tried to solve a (FE discretized) Poisson equation using PCLU. 
For > some reason I am getting different solutions while running the problem on > one and two cores. I have attached the output file (out.txt) from both the > runs. I am printing A, b and x from both the runs - while A and b are the > same but the solution seems different. > > > > I am not sure what I am doing wrong. > > > > Below is my matrix, vector, and solve setup. > > > > > > Mat A; > > Vec b, x; > > > > ierr = MatCreate(PETSC_COMM_WORLD, &A); CHKERRQ(ierr); > > ierr = MatSetType(A, MATMPIAIJ); CHKERRQ(ierr); > > ierr = MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, N, N); CHKERRQ > (ierr); > > ierr = MatMPIAIJSetPreallocation(A,d_nz, *NULL*, o_nz, *NULL*); > CHKERRQ(ierr); > > ierr = MatSetOption(A,MAT_SYMMETRIC,PETSC_TRUE); CHKERRQ(ierr); > > ierr = MatCreateVecs(A, &b, &x); CHKERRQ(ierr); > > > > KSP ksp; > > PC pc; > > KSPCreate(PETSC_COMM_WORLD, &ksp); > > KSPSetOperators(ksp, A, A); > > ierr = KSPSetType(ksp,KSPPREONLY);CHKERRQ(ierr); > > ierr = KSPGetPC(ksp,&pc);CHKERRQ(ierr); > > ierr = PCSetType(pc,PCLU);CHKERRQ(ierr); > > ierr = PCFactorSetMatSolverType(pc,MATSOLVERMUMPS);CHKERRQ(ierr); > > KSPSolve(ksp, b, x); > > > > Thank you for your help. > > > > Karthik. > > > > This email and any attachments are intended solely for the use of the > named recipients. If you are not the intended recipient you must not use, > disclose, copy or distribute this email or any of its attachments and > should notify the sender immediately and delete this email from your > system. UK Research and Innovation (UKRI) has taken every reasonable > precaution to minimise risk of this email or any attachments containing > viruses or malware but the recipient should carry out its own virus and > malware checks before opening the attachments. UKRI does not accept any > liability for any losses or damages which the recipient may sustain due to > presence of any viruses. 
> > > > > -- > > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > > > > https://www.cse.buffalo.edu/~knepley/ > > > > > > -- > > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > > > > https://www.cse.buffalo.edu/~knepley/ > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From karthikeyan.chockalingam at stfc.ac.uk Thu Nov 17 13:34:32 2022 From: karthikeyan.chockalingam at stfc.ac.uk (Karthikeyan Chockalingam - STFC UKRI) Date: Thu, 17 Nov 2022 19:34:32 +0000 Subject: [petsc-users] Different solution while running in parallel In-Reply-To: References: Message-ID: I have tried and am not sure how to set "-pc_svd_monitor" since I have not yet set up the command line options. (I am currently using PETSc within another framework). KSP ksp; PC pc; KSPCreate(PETSC_COMM_WORLD, &ksp); KSPSetOperators(ksp, A, A); ierr = KSPSetType(ksp,KSPPREONLY);CHKERRQ(ierr); ierr = KSPGetPC(ksp,&pc);CHKERRQ(ierr); ierr = PCSetType(pc,PCSVD);CHKERRQ(ierr); ierr = PetscOptionsSetValue(NULL,"-pc_svd_monitor", NULL); CHKERRQ(ierr); <-- is this right? ierr = PCFactorSetMatSolverType(pc,MATSOLVERMUMPS);CHKERRQ(ierr); ierr = KSPView(ksp, PETSC_VIEWER_STDOUT_SELF);CHKERRQ(ierr); KSPSolve(ksp, b, x); ierr = KSPMonitorTrueResidual(ksp,its,norm,vf);CHKERRQ(ierr); ierr = KSPMonitorSingularValue(ksp, its, norm, vf);CHKERRQ(ierr); Yes, now using PCSVD both the serial and parallel versions produce the same result. (i) What does this imply? 
(ii) Would I be able to solve using CG preconditioned using Hypre as I scale the problem? (iii) I have not built PETSc with SLEPc - can I still use PCSVD? (iv) Can I set ksp type, pc type, ksp monitor etc using PetscOptionsSetValue instead of code? In that case how would the above code translate? That will be very helpful. Many thanks. Best, Karthik. From: Matthew Knepley Date: Thursday, 17 November 2022 at 17:48 To: Chockalingam, Karthikeyan (STFC,DL,HC) Cc: Zhang, Hong , petsc-users at mcs.anl.gov Subject: Re: [petsc-users] Different solution while running in parallel On Thu, Nov 17, 2022 at 11:13 AM Karthikeyan Chockalingam - STFC UKRI > wrote: Hi Matt, I tested two sizes manually for the Poisson problem with homogenous Dirichlet boundary conditions (on all nodes on the boundary) and they both produced the right result when run serially using PCLU 1. 2 elements x 2 elements (total nodes 9 but 1 dof) A = 10.6667 b = 4 x = 0.375 2. 3 elements x 3 elements (total nodes 16 but 4 dof) A = 10.6667 -1.33333 -1.33333 -1.33333 -1.33333 10.6667 -1.33333 -1.33333 -1.33333 -1.33333 10.6667 -1.33333 -1.33333 -1.33333 -1.33333 10.6667 b = {4 4 4 4}^T x = (0.6 0.6 0.6 0.6) Since it is solvable, I am not sure how the system can be singular? I have attached the runs for case (2) run on one and two cores. Parallel run produces zero vector for x. I used MatZeroRowsColumns to set the Dirichlet boundary conditions by zeroing the entries in the matrix corresponding to the boundary nodes. Please please please run the original thing with the options I suggested: -pc_type svd -pc_svd_monitor This will print out all the singular values of the matrix and solve it using SVD. Thanks, Matt Best, Karthik. From: Matthew Knepley > Date: Thursday, 17 November 2022 at 15:16 To: Chockalingam, Karthikeyan (STFC,DL,HC) > Cc: Zhang, Hong >, petsc-users at mcs.anl.gov > Subject: Re: [petsc-users] Different solution while running in parallel Using options instead of code will make your life much easier. 
Two things are wrong here: 1) Your solver is doing no iterates because the initial residual is very small, 5.493080158227e-15. The LU does not matter. In order to check the condition number of your system, run with -pc_type svd -pc_svd_monitor 2) Your parallel run also does no iterates 0 KSP none resid norm 6.951601853367e-310 true resid norm 1.058300524426e+01 ||r(i)||/||b|| 8.819171036882e-01 but the true residual is not small. That means that your system is singular, but you have given a consistent RHS. Thanks, Matt On Thu, Nov 17, 2022 at 9:37 AM Karthikeyan Chockalingam - STFC UKRI > wrote: Hi Matt and Hong, Thank you for your response. I made the following changes to get the desired output: PetscReal norm; /* norm of solution error */ PetscInt its; KSPConvergedReason reason; PetscViewerAndFormat *vf; PetscViewerAndFormatCreate(PETSC_VIEWER_STDOUT_WORLD, PETSC_VIEWER_DEFAULT, &vf); ierr = KSPView(ksp, PETSC_VIEWER_STDOUT_SELF);CHKERRQ(ierr); KSPSolve(ksp, b, x); ierr = KSPMonitorTrueResidual(ksp,its,norm,vf);CHKERRQ(ierr); ierr = KSPMonitorSingularValue(ksp, its, norm, vf);CHKERRQ(ierr); I have attached the outputs from both runs. As before, I am also printing A, b, and x. I wonder if it is a memory issue related to the MPI library employed. I am currently using openmpi -- should I instead use mpich? Kind regards, Karthik. From: Matthew Knepley > Date: Thursday, 17 November 2022 at 12:19 To: Zhang, Hong > Cc: petsc-users at mcs.anl.gov >, Chockalingam, Karthikeyan (STFC,DL,HC) > Subject: Re: [petsc-users] Different solution while running in parallel On Wed, Nov 16, 2022 at 9:07 PM Zhang, Hong via petsc-users > wrote: Karthik, Can you find out the condition number of your matrix? Also, run using -ksp_view -ksp_monitor_true_residual -ksp_converged_reason and send the two outputs.
Thanks, Matt Hong ________________________________ From: petsc-users > on behalf of Karthikeyan Chockalingam - STFC UKRI via petsc-users > Sent: Wednesday, November 16, 2022 6:04 PM To: petsc-users at mcs.anl.gov > Subject: [petsc-users] Different solution while running in parallel Hello, I tried to solve a (FE discretized) Poisson equation using PCLU. For some reason I am getting different solutions while running the problem on one and two cores. I have attached the output file (out.txt) from both the runs. I am printing A, b and x from both the runs -- while A and b are the same, the solution is different. I am not sure what I am doing wrong. Below is my matrix, vector, and solve setup. Mat A; Vec b, x; ierr = MatCreate(PETSC_COMM_WORLD, &A); CHKERRQ(ierr); ierr = MatSetType(A, MATMPIAIJ); CHKERRQ(ierr); ierr = MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, N, N); CHKERRQ(ierr); ierr = MatMPIAIJSetPreallocation(A,d_nz, NULL, o_nz, NULL); CHKERRQ(ierr); ierr = MatSetOption(A,MAT_SYMMETRIC,PETSC_TRUE); CHKERRQ(ierr); ierr = MatCreateVecs(A, &b, &x); CHKERRQ(ierr); KSP ksp; PC pc; KSPCreate(PETSC_COMM_WORLD, &ksp); KSPSetOperators(ksp, A, A); ierr = KSPSetType(ksp,KSPPREONLY);CHKERRQ(ierr); ierr = KSPGetPC(ksp,&pc);CHKERRQ(ierr); ierr = PCSetType(pc,PCLU);CHKERRQ(ierr); ierr = PCFactorSetMatSolverType(pc,MATSOLVERMUMPS);CHKERRQ(ierr); KSPSolve(ksp, b, x); Thank you for your help. Karthik. This email and any attachments are intended solely for the use of the named recipients. If you are not the intended recipient you must not use, disclose, copy or distribute this email or any of its attachments and should notify the sender immediately and delete this email from your system. UK Research and Innovation (UKRI) has taken every reasonable precaution to minimise risk of this email or any attachments containing viruses or malware but the recipient should carry out its own virus and malware checks before opening the attachments.
UKRI does not accept any liability for any losses or damages which the recipient may sustain due to presence of any viruses. -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ From karthikeyan.chockalingam at stfc.ac.uk Fri Nov 18 07:31:57 2022 From: karthikeyan.chockalingam at stfc.ac.uk (Karthikeyan Chockalingam - STFC UKRI) Date: Fri, 18 Nov 2022 13:31:57 +0000 Subject: [petsc-users] PetscOptionsSetValue syntax Message-ID: Hello, I would like to move from code to using options. KSP ksp; PC pc; KSPCreate(PETSC_COMM_WORLD, &ksp); KSPSetOperators(ksp, A, A); I tried using PetscOptionsSetValue as below ierr = PetscOptionsSetValue(NULL,"-ksp_type", "cg"); CHKERRQ(ierr); ierr = PetscOptionsSetValue(NULL,"-pc_type", "jacobi"); CHKERRQ(ierr); ierr = PetscOptionsSetValue(NULL, "-ksp_monitor", NULL);CHKERRQ(ierr); ierr = PetscOptionsSetValue(NULL, "-ksp_view", NULL);CHKERRQ(ierr); ierr = PetscOptionsSetValue(NULL, "-ksp_monitor_true_residual", NULL);CHKERRQ(ierr); ierr = PetscOptionsSetValue(NULL, "-ksp_converged_reason", NULL);CHKERRQ(ierr); KSPSolve(ksp, b, x); But I believe none of the string flags were picked up by PETSc. What am I missing? Thank you for your help. Kind regards, Karthik. This email and any attachments are intended solely for the use of the named recipients.
If you are not the intended recipient you must not use, disclose, copy or distribute this email or any of its attachments and should notify the sender immediately and delete this email from your system. UK Research and Innovation (UKRI) has taken every reasonable precaution to minimise risk of this email or any attachments containing viruses or malware but the recipient should carry out its own virus and malware checks before opening the attachments. UKRI does not accept any liability for any losses or damages which the recipient may sustain due to presence of any viruses. -------------- next part -------------- An HTML attachment was scrubbed... URL: From pierre at joliv.et Fri Nov 18 07:36:02 2022 From: pierre at joliv.et (Pierre Jolivet) Date: Fri, 18 Nov 2022 14:36:02 +0100 Subject: [petsc-users] PetscOptionsSetValue syntax In-Reply-To: References: Message-ID: <3C648B49-131F-4058-AA66-3B934D121394@joliv.et> > On 18 Nov 2022, at 2:32 PM, Karthikeyan Chockalingam - STFC UKRI via petsc-users wrote: > > ? > Hello, > > I would like to move from code to using options. > > KSP ksp; > PC pc; > KSPCreate(PETSC_COMM_WORLD, &ksp); > KSPSetOperators(ksp, A, A); > > I tried using PetscOptionsSetValue as below > > ierr = PetscOptionsSetValue(NULL,"-ksp_type", "cg"); CHKERRQ(ierr); > ierr = PetscOptionsSetValue(NULL,"-pc_type", "jacobi"); CHKERRQ(ierr); > ierr = PetscOptionsSetValue(NULL, "-ksp_monitor", NULL);CHKERRQ(ierr); > ierr = PetscOptionsSetValue(NULL, "-ksp_view", NULL);CHKERRQ(ierr); > ierr = PetscOptionsSetValue(NULL, "-ksp_monitor_true_residual", NULL);CHKERRQ(ierr); > ierr = PetscOptionsSetValue(NULL, "-ksp_converged_reason", NULL);CHKERRQ(ierr); > > KSPSolve(ksp, b, x); > > But I believe none of the string flags where picked up by PETSc. What am I missing? KSPSetFromOptions() Thanks, Pierre > Thank you for your help. > > Kind regards, > Karthik. > This email and any attachments are intended solely for the use of the named recipients. 
If you are not the intended recipient you must not use, disclose, copy or distribute this email or any of its attachments and should notify the sender immediately and delete this email from your system. UK Research and Innovation (UKRI) has taken every reasonable precaution to minimise risk of this email or any attachments containing viruses or malware but the recipient should carry out its own virus and malware checks before opening the attachments. UKRI does not accept any liability for any losses or damages which the recipient may sustain due to presence of any viruses. From karthikeyan.chockalingam at stfc.ac.uk Fri Nov 18 07:47:06 2022 From: karthikeyan.chockalingam at stfc.ac.uk (Karthikeyan Chockalingam - STFC UKRI) Date: Fri, 18 Nov 2022 13:47:06 +0000 Subject: [petsc-users] PetscOptionsSetValue syntax In-Reply-To: <3C648B49-131F-4058-AA66-3B934D121394@joliv.et> References: <3C648B49-131F-4058-AA66-3B934D121394@joliv.et> Message-ID: Thank you. I added ierr = KSPSetFromOptions(ksp); CHKERRQ(ierr); before PetscOptionsSetValue but still I don't see the ksp iterations or any other info flags I set. I am not passing anything via the command line and believe PetscOptionsSetValue would set the flags. Best, Karthik. From: Pierre Jolivet Date: Friday, 18 November 2022 at 13:36 To: Chockalingam, Karthikeyan (STFC,DL,HC) Cc: petsc-users at mcs.anl.gov Subject: Re: [petsc-users] PetscOptionsSetValue syntax On 18 Nov 2022, at 2:32 PM, Karthikeyan Chockalingam - STFC UKRI via petsc-users wrote: Hello, I would like to move from code to using options.
KSP ksp; PC pc; KSPCreate(PETSC_COMM_WORLD, &ksp); KSPSetOperators(ksp, A, A); I tried using PetscOptionsSetValue as below ierr = PetscOptionsSetValue(NULL,"-ksp_type", "cg"); CHKERRQ(ierr); ierr = PetscOptionsSetValue(NULL,"-pc_type", "jacobi"); CHKERRQ(ierr); ierr = PetscOptionsSetValue(NULL, "-ksp_monitor", NULL);CHKERRQ(ierr); ierr = PetscOptionsSetValue(NULL, "-ksp_view", NULL);CHKERRQ(ierr); ierr = PetscOptionsSetValue(NULL, "-ksp_monitor_true_residual", NULL);CHKERRQ(ierr); ierr = PetscOptionsSetValue(NULL, "-ksp_converged_reason", NULL);CHKERRQ(ierr); KSPSolve(ksp, b, x); But I believe none of the string flags where picked up by PETSc. What am I missing? KSPSetFromOptions() Thanks, Pierre Thank you for your help. Kind regards, Karthik. This email and any attachments are intended solely for the use of the named recipients. If you are not the intended recipient you must not use, disclose, copy or distribute this email or any of its attachments and should notify the sender immediately and delete this email from your system. UK Research and Innovation (UKRI) has taken every reasonable precaution to minimise risk of this email or any attachments containing viruses or malware but the recipient should carry out its own virus and malware checks before opening the attachments. UKRI does not accept any liability for any losses or damages which the recipient may sustain due to presence of any viruses. -------------- next part -------------- An HTML attachment was scrubbed... URL: From pierre at joliv.et Fri Nov 18 07:55:53 2022 From: pierre at joliv.et (Pierre Jolivet) Date: Fri, 18 Nov 2022 14:55:53 +0100 Subject: [petsc-users] PetscOptionsSetValue syntax In-Reply-To: References: Message-ID: > On 18 Nov 2022, at 2:47 PM, Karthikeyan Chockalingam - STFC UKRI wrote: > > ? > Thank you. I added > > ierr = KSPSetFromOptions(ksp); CHKERRQ(ierr); > > before PetscOptionsSetValue but still I don?t see the ksp iterations or any other info flags I set. 
> I am not passing anything via the command line and believe PetscOptionsSetValue would set the flags. If you call KSPSetFromOptions() before you set any options, there is nothing to set. Usually, you want to delay calling that function up until you are done setting options. Thanks, Pierre > Best, > Karthik. > > > From: Pierre Jolivet > Date: Friday, 18 November 2022 at 13:36 > To: Chockalingam, Karthikeyan (STFC,DL,HC) > Cc: petsc-users at mcs.anl.gov > Subject: Re: [petsc-users] PetscOptionsSetValue syntax > > > > > On 18 Nov 2022, at 2:32 PM, Karthikeyan Chockalingam - STFC UKRI via petsc-users wrote: > > ? > Hello, > > I would like to move from code to using options. > > KSP ksp; > PC pc; > KSPCreate(PETSC_COMM_WORLD, &ksp); > KSPSetOperators(ksp, A, A); > > I tried using PetscOptionsSetValue as below > > ierr = PetscOptionsSetValue(NULL,"-ksp_type", "cg"); CHKERRQ(ierr); > ierr = PetscOptionsSetValue(NULL,"-pc_type", "jacobi"); CHKERRQ(ierr); > ierr = PetscOptionsSetValue(NULL, "-ksp_monitor", NULL);CHKERRQ(ierr); > ierr = PetscOptionsSetValue(NULL, "-ksp_view", NULL);CHKERRQ(ierr); > ierr = PetscOptionsSetValue(NULL, "-ksp_monitor_true_residual", NULL);CHKERRQ(ierr); > ierr = PetscOptionsSetValue(NULL, "-ksp_converged_reason", NULL);CHKERRQ(ierr); > > KSPSolve(ksp, b, x); > > But I believe none of the string flags where picked up by PETSc. What am I missing? > > KSPSetFromOptions() > > Thanks, > Pierre > > > Thank you for your help. > > Kind regards, > Karthik. > This email and any attachments are intended solely for the use of the named recipients. If you are not the intended recipient you must not use, disclose, copy or distribute this email or any of its attachments and should notify the sender immediately and delete this email from your system. 
UK Research and Innovation (UKRI) has taken every reasonable precaution to minimise risk of this email or any attachments containing viruses or malware but the recipient should carry out its own virus and malware checks before opening the attachments. UKRI does not accept any liability for any losses or damages which the recipient may sustain due to presence of any viruses. -------------- next part -------------- An HTML attachment was scrubbed... URL: From karthikeyan.chockalingam at stfc.ac.uk Fri Nov 18 08:03:56 2022 From: karthikeyan.chockalingam at stfc.ac.uk (Karthikeyan Chockalingam - STFC UKRI) Date: Fri, 18 Nov 2022 14:03:56 +0000 Subject: [petsc-users] PetscOptionsSetValue syntax In-Reply-To: References: Message-ID: Thank you. It worked! Best, Karthik. From: Pierre Jolivet Date: Friday, 18 November 2022 at 13:56 To: Chockalingam, Karthikeyan (STFC,DL,HC) Cc: petsc-users at mcs.anl.gov Subject: Re: [petsc-users] PetscOptionsSetValue syntax On 18 Nov 2022, at 2:47 PM, Karthikeyan Chockalingam - STFC UKRI wrote: ? Thank you. I added ierr = KSPSetFromOptions(ksp); CHKERRQ(ierr); before PetscOptionsSetValue but still I don?t see the ksp iterations or any other info flags I set. I am not passing anything via the command line and believe PetscOptionsSetValue would set the flags. If you call KSPSetFromOptions() before you set any options, there is nothing to set. Usually, you want to delay calling that function up until you are done setting options. Thanks, Pierre Best, Karthik. From: Pierre Jolivet Date: Friday, 18 November 2022 at 13:36 To: Chockalingam, Karthikeyan (STFC,DL,HC) Cc: petsc-users at mcs.anl.gov Subject: Re: [petsc-users] PetscOptionsSetValue syntax On 18 Nov 2022, at 2:32 PM, Karthikeyan Chockalingam - STFC UKRI via petsc-users wrote: ? Hello, I would like to move from code to using options. 
KSP ksp; PC pc; KSPCreate(PETSC_COMM_WORLD, &ksp); KSPSetOperators(ksp, A, A); I tried using PetscOptionsSetValue as below ierr = PetscOptionsSetValue(NULL,"-ksp_type", "cg"); CHKERRQ(ierr); ierr = PetscOptionsSetValue(NULL,"-pc_type", "jacobi"); CHKERRQ(ierr); ierr = PetscOptionsSetValue(NULL, "-ksp_monitor", NULL);CHKERRQ(ierr); ierr = PetscOptionsSetValue(NULL, "-ksp_view", NULL);CHKERRQ(ierr); ierr = PetscOptionsSetValue(NULL, "-ksp_monitor_true_residual", NULL);CHKERRQ(ierr); ierr = PetscOptionsSetValue(NULL, "-ksp_converged_reason", NULL);CHKERRQ(ierr); KSPSolve(ksp, b, x); But I believe none of the string flags where picked up by PETSc. What am I missing? KSPSetFromOptions() Thanks, Pierre Thank you for your help. Kind regards, Karthik. This email and any attachments are intended solely for the use of the named recipients. If you are not the intended recipient you must not use, disclose, copy or distribute this email or any of its attachments and should notify the sender immediately and delete this email from your system. UK Research and Innovation (UKRI) has taken every reasonable precaution to minimise risk of this email or any attachments containing viruses or malware but the recipient should carry out its own virus and malware checks before opening the attachments. UKRI does not accept any liability for any losses or damages which the recipient may sustain due to presence of any viruses. -------------- next part -------------- An HTML attachment was scrubbed... URL: From edoardo.alinovi at gmail.com Sat Nov 19 02:52:57 2022 From: edoardo.alinovi at gmail.com (Edoardo alinovi) Date: Sat, 19 Nov 2022 09:52:57 +0100 Subject: [petsc-users] On PCFIELDSPLIT and its implementation In-Reply-To: References: <59032746-0A26-40CE-BCE1-FF74932B27EA@petsc.dev> <875yfg9isb.fsf@jedbrown.org> Message-ID: Morning Guys, The good news is that finally fieldsplit is working ok! 
I have just a simple question about the interaction of MatMPIAIJSetPreallocation with MatSetValuesBlocked. In this simple example I have a 27x27 matrix (9 blocks, each composed of a 3x3 matrix). The only entry I have on each line of the matrix is 1.0 on the diagonal. I set the block size of the matrix to 3 with call MatSetBlockSize(A, 3, ierr) as discussed, and I pass dnnz(1:27) = 1, since every line of the matrix has just 1 diagonal entry. Now, using MatSetValuesBlocked with this preallocation leads to a wrong matrix: only the first line of each block is assigned correctly. I get A(1,1) = 1, A(2,2) = 0, A(3,3) = 0, A(4,4) = 1, A(5,5) = 0, A(6,6) = 0 and so on. Instead, if I set dnnz = 3, I correctly get A(1,1) = 1, A(2,2) = 1, A(3,3) = 1, A(4,4) = 1, A(5,5) = 1, A(6,6) = 1 and so on. Why should I say that dnnz = 3 instead of 1? Is MatSetValuesBlocked kind of requesting this? Thank you! From edoardo.alinovi at gmail.com Sat Nov 19 09:38:09 2022 From: edoardo.alinovi at gmail.com (Edoardo alinovi) Date: Sat, 19 Nov 2022 16:38:09 +0100 Subject: Re: [petsc-users] On PCFIELDSPLIT and its implementation In-Reply-To: References: <59032746-0A26-40CE-BCE1-FF74932B27EA@petsc.dev> <875yfg9isb.fsf@jedbrown.org> Message-ID: Please ignore me -- I was just making a mistake with the number of zeros. With Jed's suggestion to use MatXAIJSetPreallocation I can do a very bespoke code and everything looks good. I'll test the field splitting a bit to see if I can find some performance! Cheers From narnoldm at umich.edu Sat Nov 19 16:16:37 2022 From: narnoldm at umich.edu (Nicholas Arnold-Medabalimi) Date: Sat, 19 Nov 2022 17:16:37 -0500 Subject: [petsc-users] PetscSF Fortran interface In-Reply-To: References: Message-ID: Hi Junchao Thanks. I was wondering if there is any update on this.
I may write a small interface for those two routines myself in the interim but I'd appreciate any insight you have. Sincerely Nicholas On Wed, Nov 16, 2022 at 10:39 PM Junchao Zhang wrote: > Hi, Nicholas, > I will have a look and get back to you. > Thanks. > --Junchao Zhang > > > On Wed, Nov 16, 2022 at 9:27 PM Nicholas Arnold-Medabalimi < > narnoldm at umich.edu> wrote: > >> Hi Petsc Users >> >> I'm in the process of adding some Petsc for mesh management into an >> existing Fortran Solver. It has been relatively straightforward so far but >> I am running into an issue with using PetscSF routines. Some like the >> PetscSFGetGraph work no problem but a few of my routines require the use of >> PetscSFGetLeafRanks and PetscSFGetRootRanks and those don't seem to be in >> the fortran interface and I just get a linking error. I also don't seem to >> see a PetscSF file in the finclude. Any clarification or assistance would >> be appreciated. >> >> >> Sincerely >> Nicholas >> >> -- >> Nicholas Arnold-Medabalimi >> >> Ph.D. Candidate >> Computational Aeroscience Lab >> University of Michigan >> > -- Nicholas Arnold-Medabalimi Ph.D. Candidate Computational Aeroscience Lab University of Michigan -------------- next part -------------- An HTML attachment was scrubbed... URL: From junchao.zhang at gmail.com Sat Nov 19 19:21:31 2022 From: junchao.zhang at gmail.com (Junchao Zhang) Date: Sat, 19 Nov 2022 19:21:31 -0600 Subject: [petsc-users] PetscSF Fortran interface In-Reply-To: References: Message-ID: Hi, Nicholas, See this MR, https://gitlab.com/petsc/petsc/-/merge_requests/5860 It is in testing, but you can try branch jczhang/add-petscsf-fortran to see if it works for you. Thanks. --Junchao Zhang On Sat, Nov 19, 2022 at 4:16 PM Nicholas Arnold-Medabalimi < narnoldm at umich.edu> wrote: > Hi Junchao > > Thanks. I was wondering if there is any update on this. I may write a > small interface for those two routines myself in the interim but I'd > appreciate any insight you have. 
> > Sincerely > Nicholas > > On Wed, Nov 16, 2022 at 10:39 PM Junchao Zhang > wrote: > >> Hi, Nicholas, >> I will have a look and get back to you. >> Thanks. >> --Junchao Zhang >> >> >> On Wed, Nov 16, 2022 at 9:27 PM Nicholas Arnold-Medabalimi < >> narnoldm at umich.edu> wrote: >> >>> Hi Petsc Users >>> >>> I'm in the process of adding some Petsc for mesh management into an >>> existing Fortran Solver. It has been relatively straightforward so far but >>> I am running into an issue with using PetscSF routines. Some like the >>> PetscSFGetGraph work no problem but a few of my routines require the use of >>> PetscSFGetLeafRanks and PetscSFGetRootRanks and those don't seem to be in >>> the fortran interface and I just get a linking error. I also don't seem to >>> see a PetscSF file in the finclude. Any clarification or assistance would >>> be appreciated. >>> >>> >>> Sincerely >>> Nicholas >>> >>> -- >>> Nicholas Arnold-Medabalimi >>> >>> Ph.D. Candidate >>> Computational Aeroscience Lab >>> University of Michigan >>> >> > > -- > Nicholas Arnold-Medabalimi > > Ph.D. Candidate > Computational Aeroscience Lab > University of Michigan > -------------- next part -------------- An HTML attachment was scrubbed... URL: From narnoldm at umich.edu Sat Nov 19 20:04:46 2022 From: narnoldm at umich.edu (Nicholas Arnold-Medabalimi) Date: Sat, 19 Nov 2022 21:04:46 -0500 Subject: [petsc-users] PetscSF Fortran interface In-Reply-To: References: Message-ID: Hi Thanks, this is awesome. Thanks for the very prompt fix. Just one question: will the array outputs on the fortran side copies (and need to be deallocated) or direct access to the dmplex? Sincerely Nicholas On Sat, Nov 19, 2022 at 8:21 PM Junchao Zhang wrote: > Hi, Nicholas, > See this MR, https://gitlab.com/petsc/petsc/-/merge_requests/5860 > It is in testing, but you can try branch jczhang/add-petscsf-fortran to > see if it works for you. > > Thanks. 
> --Junchao Zhang > > On Sat, Nov 19, 2022 at 4:16 PM Nicholas Arnold-Medabalimi < > narnoldm at umich.edu> wrote: > >> Hi Junchao >> >> Thanks. I was wondering if there is any update on this. I may write a >> small interface for those two routines myself in the interim but I'd >> appreciate any insight you have. >> >> Sincerely >> Nicholas >> >> On Wed, Nov 16, 2022 at 10:39 PM Junchao Zhang >> wrote: >> >>> Hi, Nicholas, >>> I will have a look and get back to you. >>> Thanks. >>> --Junchao Zhang >>> >>> >>> On Wed, Nov 16, 2022 at 9:27 PM Nicholas Arnold-Medabalimi < >>> narnoldm at umich.edu> wrote: >>> >>>> Hi Petsc Users >>>> >>>> I'm in the process of adding some Petsc for mesh management into an >>>> existing Fortran Solver. It has been relatively straightforward so far but >>>> I am running into an issue with using PetscSF routines. Some like the >>>> PetscSFGetGraph work no problem but a few of my routines require the use of >>>> PetscSFGetLeafRanks and PetscSFGetRootRanks and those don't seem to be in >>>> the fortran interface and I just get a linking error. I also don't seem to >>>> see a PetscSF file in the finclude. Any clarification or assistance would >>>> be appreciated. >>>> >>>> >>>> Sincerely >>>> Nicholas >>>> >>>> -- >>>> Nicholas Arnold-Medabalimi >>>> >>>> Ph.D. Candidate >>>> Computational Aeroscience Lab >>>> University of Michigan >>>> >>> >> >> -- >> Nicholas Arnold-Medabalimi >> >> Ph.D. Candidate >> Computational Aeroscience Lab >> University of Michigan >> > -- Nicholas Arnold-Medabalimi Ph.D. Candidate Computational Aeroscience Lab University of Michigan -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From junchao.zhang at gmail.com Sat Nov 19 20:44:32 2022 From: junchao.zhang at gmail.com (Junchao Zhang) Date: Sat, 19 Nov 2022 20:44:32 -0600 Subject: [petsc-users] PetscSF Fortran interface In-Reply-To: References: Message-ID: On Sat, Nov 19, 2022 at 8:05 PM Nicholas Arnold-Medabalimi < narnoldm at umich.edu> wrote: > Hi > > Thanks, this is awesome. Thanks for the very prompt fix. Just one > question: will the array outputs on the fortran side copies (and need to be > deallocated) or direct access to the dmplex? > Direct access to internal data; no need to deallocate > > Sincerely > Nicholas > > On Sat, Nov 19, 2022 at 8:21 PM Junchao Zhang > wrote: > >> Hi, Nicholas, >> See this MR, https://gitlab.com/petsc/petsc/-/merge_requests/5860 >> It is in testing, but you can try branch jczhang/add-petscsf-fortran to >> see if it works for you. >> >> Thanks. >> --Junchao Zhang >> >> On Sat, Nov 19, 2022 at 4:16 PM Nicholas Arnold-Medabalimi < >> narnoldm at umich.edu> wrote: >> >>> Hi Junchao >>> >>> Thanks. I was wondering if there is any update on this. I may write a >>> small interface for those two routines myself in the interim but I'd >>> appreciate any insight you have. >>> >>> Sincerely >>> Nicholas >>> >>> On Wed, Nov 16, 2022 at 10:39 PM Junchao Zhang >>> wrote: >>> >>>> Hi, Nicholas, >>>> I will have a look and get back to you. >>>> Thanks. >>>> --Junchao Zhang >>>> >>>> >>>> On Wed, Nov 16, 2022 at 9:27 PM Nicholas Arnold-Medabalimi < >>>> narnoldm at umich.edu> wrote: >>>> >>>>> Hi Petsc Users >>>>> >>>>> I'm in the process of adding some Petsc for mesh management into an >>>>> existing Fortran Solver. It has been relatively straightforward so far but >>>>> I am running into an issue with using PetscSF routines. Some like the >>>>> PetscSFGetGraph work no problem but a few of my routines require the use of >>>>> PetscSFGetLeafRanks and PetscSFGetRootRanks and those don't seem to be in >>>>> the fortran interface and I just get a linking error. 
I also don't seem to >>>>> see a PetscSF file in the finclude. Any clarification or assistance would >>>>> be appreciated. >>>>> >>>>> >>>>> Sincerely >>>>> Nicholas >>>>> >>>>> -- >>>>> Nicholas Arnold-Medabalimi >>>>> >>>>> Ph.D. Candidate >>>>> Computational Aeroscience Lab >>>>> University of Michigan >>>>> >>>> >>> >>> -- >>> Nicholas Arnold-Medabalimi >>> >>> Ph.D. Candidate >>> Computational Aeroscience Lab >>> University of Michigan >>> >> > > -- > Nicholas Arnold-Medabalimi > > Ph.D. Candidate > Computational Aeroscience Lab > University of Michigan > -------------- next part -------------- An HTML attachment was scrubbed... URL: From edoardo.alinovi at gmail.com Sun Nov 20 04:21:33 2022 From: edoardo.alinovi at gmail.com (Edoardo alinovi) Date: Sun, 20 Nov 2022 11:21:33 +0100 Subject: [petsc-users] Question on Hypre customization within a split Message-ID: Hello Barry/Matt/Jed, I am going on with testing and now the field split works great. Thanks a lot for the support! I have a question today about Hypre customization for a target split. let's say that my split 1 use hypre: * -UPfieldsplit_1_pc_type hypre *(this works ok) How can I then customize Hypre parameters for that split? I have tried something like: *-**UPfieldsplit_1_pc_hypre_boomeramg_coarsen_type something* But it does not look to work. What am I missing? Thank you! -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Sun Nov 20 07:00:14 2022 From: knepley at gmail.com (Matthew Knepley) Date: Sun, 20 Nov 2022 08:00:14 -0500 Subject: [petsc-users] Question on Hypre customization within a split In-Reply-To: References: Message-ID: On Sun, Nov 20, 2022 at 5:21 AM Edoardo alinovi wrote: > Hello Barry/Matt/Jed, > > I am going on with testing and now the field split works great. Thanks a > lot for the support! > > I have a question today about Hypre customization for a target split. 
> let's say that my split 1 use hypre: * -UPfieldsplit_1_pc_type hypre *(this > works ok) > > How can I then customize Hypre parameters for that split? I have tried > something like: > > *-**UPfieldsplit_1_pc_hypre_boomeramg_coarsen_type something* > > But it does not look to work. What am I missing? > This should work. The right way to debug this stuff is to first use -ksp_view, or maybe -UPksp_view if that is your prefix. In that output, all objects have their prefix stated, so it should be easy to see what to use, and also to see the values. Thanks, Matt > Thank you! > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From junchao.zhang at gmail.com Sun Nov 20 12:25:49 2022 From: junchao.zhang at gmail.com (Junchao Zhang) Date: Sun, 20 Nov 2022 12:25:49 -0600 Subject: [petsc-users] [EXTERNAL] Re: Using multiple MPI ranks with COO interface crashes in some cases In-Reply-To: References: Message-ID: On Tue, Nov 15, 2022 at 10:55 AM Fackler, Philip wrote: > I built petsc with: > > $ ./configure PETSC_DIR=$PWD PETSC_ARCH=arch-kokkos-serial-debug > --with-cc=mpicc --with-cxx=mpicxx --with-fc=0 --with-debugging=0 > --prefix=$HOME/build/petsc/debug/install --with-64-bit-indices > --with-shared-libraries --COPTFLAGS=-O3 --CXXOPTFLAGS=-O3 --download-kokkos > --download-kokkos-kernels > > $ make PETSC_DIR=$PWD PETSC_ARCH=arch-kokkos-serial-debug all > > $ make PETSC_DIR=$PWD PETSC_ARCH=arch-kokkos-serial-debug install > > > Then I build xolotl in a separate build directory (after checking out the > "feature-petsc-kokkos" branch) with: > > $ cmake -DCMAKE_BUILD_TYPE=Debug > -DKokkos_DIR=$HOME/build/petsc/debug/install > -DPETSC_DIR=$HOME/build/petsc/debug/install > > $ make -j4 SystemTester > Hi, Philip, I tried multiple times and 
still failed at building xolotl. I installed boost-1.74 and HDF5, and used gcc-11.3. make -j4 SystemTester ... [ 9%] Building CXX object xolotl/core/CMakeFiles/xolotlCore.dir/src/diffusion/DiffusionHandler.cpp.o /home/jczhang/xolotl/xolotl/core/src/diffusion/DiffusionHandler.cpp(55): error: no instance of overloaded function "std::vector<_Tp, _Alloc>::push_back [with _Tp=xolotl::core::RowColPair, _Alloc=std::allocator]" matches the argument list argument types are: ({...}) object type is: std::vector> 1 error detected in the compilation of "/home/jczhang/xolotl/xolotl/core/src/diffusion/DiffusionHandler.cpp". > > > Then, from the xolotl build directory, run (for example): > > $ mpirun -n 2 ./test/system/SystemTester -t System/NE_4 -- -v > > Note that this test case will use the parameter file > '/benchmarks/params_system_NE_4.txt' which has the command-line > arguments for petsc in its "petscArgs=..." line. If you look at > '/test/system/SystemTester.cpp' all the system test cases > follow the same naming convention with their corresponding parameter files > under '/benchmarks'. > > The failure happens with the NE_4 case (which is 2D) and the PSI_3 case > (which is 1D). > > Let me know if this is still unclear. > > Thanks, > > > *Philip Fackler * > Research Software Engineer, Application Engineering Group > Advanced Computing Systems Research Section > Computer Science and Mathematics Division > *Oak Ridge National Laboratory* > ------------------------------ > *From:* Junchao Zhang > *Sent:* Tuesday, November 15, 2022 00:16 > *To:* Fackler, Philip > *Cc:* petsc-users at mcs.anl.gov ; Blondel, Sophie < > sblondel at utk.edu> > *Subject:* [EXTERNAL] Re: [petsc-users] Using multiple MPI ranks with COO > interface crashes in some cases > > Hi, Philip, > Can you tell me instructions to build Xolotl to reproduce the error? 
> --Junchao Zhang > > > On Mon, Nov 14, 2022 at 12:24 PM Fackler, Philip via petsc-users < > petsc-users at mcs.anl.gov> wrote: > > In Xolotl's "feature-petsc-kokkos" branch, I have moved our code to use > the COO interface for preallocating and setting values in the Jacobian > matrix. I have found that with some of our test cases, using more than one > MPI rank results in a crash. Way down in the preconditioner code in petsc a > Mat gets computed that has "null" for the "productsymbolic" member of its > "ops". It's pretty far removed from where we compute the Jacobian entries, > so I haven't been able (so far) to track it back to an error in my code. > I'd appreciate some help with this from someone who is more familiar with > the petsc guts so we can figure out what I'm doing wrong. (I'm assuming > it's a bug in Xolotl.) > > Note that this is using the kokkos backend for Mat and Vec in petsc, but > with a serial-only build of kokkos and kokkos-kernels. So, it's a CPU-only > multiple MPI rank run. > > Here's a paste of the error output showing the relevant parts of the call > stack: > > [ERROR] [0]PETSC ERROR: > [ERROR] --------------------- Error Message > -------------------------------------------------------------- > [ERROR] [1]PETSC ERROR: > [ERROR] --------------------- Error Message > -------------------------------------------------------------- > [ERROR] [1]PETSC ERROR: > [ERROR] [0]PETSC ERROR: > [ERROR] No support for this operation for this object type > [ERROR] [1]PETSC ERROR: > [ERROR] No support for this operation for this object type > [ERROR] [0]PETSC ERROR: > [ERROR] No method productsymbolic for Mat of type (null) > [ERROR] No method productsymbolic for Mat of type (null) > [ERROR] [0]PETSC ERROR: > [ERROR] [1]PETSC ERROR: > [ERROR] See hxxps://petsc.org/release/faq/ for trouble shooting. > [ERROR] See hxxps://petsc.org/release/faq/ for trouble shooting. 
> [ERROR] [0]PETSC ERROR: > [ERROR] [1]PETSC ERROR: > [ERROR] Petsc Development GIT revision: v3.18.1-115-gdca010e0e9a GIT > Date: 2022-10-28 14:39:41 +0000 > [ERROR] Petsc Development GIT revision: v3.18.1-115-gdca010e0e9a GIT > Date: 2022-10-28 14:39:41 +0000 > [ERROR] [1]PETSC ERROR: > [ERROR] [0]PETSC ERROR: > [ERROR] Unknown Name on a named PC0115427 by 4pf Mon Nov 14 13:22:01 2022 > [ERROR] Unknown Name on a named PC0115427 by 4pf Mon Nov 14 13:22:01 2022 > [ERROR] [1]PETSC ERROR: > [ERROR] [0]PETSC ERROR: > [ERROR] Configure options PETSC_DIR=/home/4pf/repos/petsc > PETSC_ARCH=arch-kokkos-serial-debug --with-debugging=1 --with-cc=mpicc > --with-cxx=mpicxx --with-fc=0 --with-cudac=0 > --prefix=/home/4pf/build/petsc/serial-debug/install --with-64-bit-indices > --with-shared-libraries > --with-kokkos-dir=/home/4pf/build/kokkos/serial/install > --with-kokkos-kernels-dir=/home/4pf/build/kokkos-kernels/serial/install > [ERROR] Configure options PETSC_DIR=/home/4pf/repos/petsc > PETSC_ARCH=arch-kokkos-serial-debug --with-debugging=1 --with-cc=mpicc > --with-cxx=mpicxx --with-fc=0 --with-cudac=0 > --prefix=/home/4pf/build/petsc/serial-debug/install --with-64-bit-indices > --with-shared-libraries > --with-kokkos-dir=/home/4pf/build/kokkos/serial/install > --with-kokkos-kernels-dir=/home/4pf/build/kokkos-kernels/serial/install > [ERROR] [1]PETSC ERROR: > [ERROR] [0]PETSC ERROR: > [ERROR] #1 MatProductSymbolic_MPIAIJKokkos_AB() at > /home/4pf/repos/petsc/src/mat/impls/aij/mpi/kokkos/mpiaijkok.kokkos.cxx:918 > [ERROR] #1 MatProductSymbolic_MPIAIJKokkos_AB() at > /home/4pf/repos/petsc/src/mat/impls/aij/mpi/kokkos/mpiaijkok.kokkos.cxx:918 > [ERROR] [1]PETSC ERROR: > [ERROR] [0]PETSC ERROR: > [ERROR] #2 MatProductSymbolic_MPIAIJKokkos() at > /home/4pf/repos/petsc/src/mat/impls/aij/mpi/kokkos/mpiaijkok.kokkos.cxx:1138 > [ERROR] #2 MatProductSymbolic_MPIAIJKokkos() at > /home/4pf/repos/petsc/src/mat/impls/aij/mpi/kokkos/mpiaijkok.kokkos.cxx:1138 > [ERROR] [1]PETSC ERROR: > 
[ERROR] [0]PETSC ERROR: > [ERROR] #3 MatProductSymbolic() at > /home/4pf/repos/petsc/src/mat/interface/matproduct.c:793 > [ERROR] #3 MatProductSymbolic() at > /home/4pf/repos/petsc/src/mat/interface/matproduct.c:793 > [ERROR] [1]PETSC ERROR: > [ERROR] [0]PETSC ERROR: > [ERROR] #4 MatProduct_Private() at > /home/4pf/repos/petsc/src/mat/interface/matrix.c:9820 > [ERROR] #4 MatProduct_Private() at > /home/4pf/repos/petsc/src/mat/interface/matrix.c:9820 > [ERROR] [0]PETSC ERROR: > [ERROR] [1]PETSC ERROR: > [ERROR] #5 MatMatMult() at > /home/4pf/repos/petsc/src/mat/interface/matrix.c:9897 > [ERROR] #5 MatMatMult() at > /home/4pf/repos/petsc/src/mat/interface/matrix.c:9897 > [ERROR] [0]PETSC ERROR: > [ERROR] [1]PETSC ERROR: > [ERROR] #6 PCGAMGOptProlongator_AGG() at > /home/4pf/repos/petsc/src/ksp/pc/impls/gamg/agg.c:769 > [ERROR] #6 PCGAMGOptProlongator_AGG() at > /home/4pf/repos/petsc/src/ksp/pc/impls/gamg/agg.c:769 > [ERROR] [0]PETSC ERROR: > [ERROR] [1]PETSC ERROR: > [ERROR] #7 PCSetUp_GAMG() at > /home/4pf/repos/petsc/src/ksp/pc/impls/gamg/gamg.c:639 > [ERROR] #7 PCSetUp_GAMG() at > /home/4pf/repos/petsc/src/ksp/pc/impls/gamg/gamg.c:639 > [ERROR] [1]PETSC ERROR: > [ERROR] [0]PETSC ERROR: > [ERROR] #8 PCSetUp() at > /home/4pf/repos/petsc/src/ksp/pc/interface/precon.c:994 > [ERROR] #8 PCSetUp() at > /home/4pf/repos/petsc/src/ksp/pc/interface/precon.c:994 > [ERROR] [1]PETSC ERROR: > [ERROR] [0]PETSC ERROR: > [ERROR] #9 KSPSetUp() at > /home/4pf/repos/petsc/src/ksp/ksp/interface/itfunc.c:406 > [ERROR] #9 KSPSetUp() at > /home/4pf/repos/petsc/src/ksp/ksp/interface/itfunc.c:406 > [ERROR] [1]PETSC ERROR: > [ERROR] [0]PETSC ERROR: > [ERROR] #10 KSPSolve_Private() at > /home/4pf/repos/petsc/src/ksp/ksp/interface/itfunc.c:825 > [ERROR] #10 KSPSolve_Private() at > /home/4pf/repos/petsc/src/ksp/ksp/interface/itfunc.c:825 > [ERROR] [0]PETSC ERROR: > [ERROR] [1]PETSC ERROR: > [ERROR] #11 KSPSolve() at > /home/4pf/repos/petsc/src/ksp/ksp/interface/itfunc.c:1071 > [ERROR] #11 
KSPSolve() at > /home/4pf/repos/petsc/src/ksp/ksp/interface/itfunc.c:1071 > [ERROR] [1]PETSC ERROR: > [ERROR] [0]PETSC ERROR: > [ERROR] #12 PCApply_FieldSplit() at > /home/4pf/repos/petsc/src/ksp/pc/impls/fieldsplit/fieldsplit.c:1246 > [ERROR] #12 PCApply_FieldSplit() at > /home/4pf/repos/petsc/src/ksp/pc/impls/fieldsplit/fieldsplit.c:1246 > [ERROR] [1]PETSC ERROR: > [ERROR] [0]PETSC ERROR: > [ERROR] #13 PCApply() at > /home/4pf/repos/petsc/src/ksp/pc/interface/precon.c:441 > [ERROR] #13 PCApply() at > /home/4pf/repos/petsc/src/ksp/pc/interface/precon.c:441 > [ERROR] [1]PETSC ERROR: > [ERROR] [0]PETSC ERROR: > [ERROR] #14 KSP_PCApply() at > /home/4pf/repos/petsc/include/petsc/private/kspimpl.h:380 > [ERROR] #14 KSP_PCApply() at > /home/4pf/repos/petsc/include/petsc/private/kspimpl.h:380 > [ERROR] [1]PETSC ERROR: > [ERROR] [0]PETSC ERROR: > [ERROR] #15 KSPFGMRESCycle() at > /home/4pf/repos/petsc/src/ksp/ksp/impls/gmres/fgmres/fgmres.c:152 > [ERROR] #15 KSPFGMRESCycle() at > /home/4pf/repos/petsc/src/ksp/ksp/impls/gmres/fgmres/fgmres.c:152 > [ERROR] [1]PETSC ERROR: > [ERROR] [0]PETSC ERROR: > [ERROR] #16 KSPSolve_FGMRES() at > /home/4pf/repos/petsc/src/ksp/ksp/impls/gmres/fgmres/fgmres.c:273 > [ERROR] #16 KSPSolve_FGMRES() at > /home/4pf/repos/petsc/src/ksp/ksp/impls/gmres/fgmres/fgmres.c:273 > [ERROR] [1]PETSC ERROR: > [ERROR] [0]PETSC ERROR: > [ERROR] #17 KSPSolve_Private() at > /home/4pf/repos/petsc/src/ksp/ksp/interface/itfunc.c:899 > [ERROR] #17 KSPSolve_Private() at > /home/4pf/repos/petsc/src/ksp/ksp/interface/itfunc.c:899 > [ERROR] [0]PETSC ERROR: > [ERROR] [1]PETSC ERROR: > [ERROR] #18 KSPSolve() at > /home/4pf/repos/petsc/src/ksp/ksp/interface/itfunc.c:1071 > [ERROR] #18 KSPSolve() at > /home/4pf/repos/petsc/src/ksp/ksp/interface/itfunc.c:1071 > [ERROR] [0]PETSC ERROR: > [ERROR] [1]PETSC ERROR: > [ERROR] #19 SNESSolve_NEWTONLS() at > /home/4pf/repos/petsc/src/snes/impls/ls/ls.c:210 > [ERROR] #19 SNESSolve_NEWTONLS() at > 
/home/4pf/repos/petsc/src/snes/impls/ls/ls.c:210 > [ERROR] [1]PETSC ERROR: > [ERROR] [0]PETSC ERROR: > [ERROR] #20 SNESSolve() at > /home/4pf/repos/petsc/src/snes/interface/snes.c:4689 > [ERROR] #20 SNESSolve() at > /home/4pf/repos/petsc/src/snes/interface/snes.c:4689 > [ERROR] [1]PETSC ERROR: > [ERROR] [0]PETSC ERROR: > [ERROR] #21 TSStep_ARKIMEX() at > /home/4pf/repos/petsc/src/ts/impls/arkimex/arkimex.c:791 > [ERROR] #21 TSStep_ARKIMEX() at > /home/4pf/repos/petsc/src/ts/impls/arkimex/arkimex.c:791 > [ERROR] [1]PETSC ERROR: > [ERROR] [0]PETSC ERROR: > [ERROR] #22 TSStep() at /home/4pf/repos/petsc/src/ts/interface/ts.c:3445 > [ERROR] #22 TSStep() at /home/4pf/repos/petsc/src/ts/interface/ts.c:3445 > [ERROR] [1]PETSC ERROR: > [ERROR] [0]PETSC ERROR: > [ERROR] #23 TSSolve() at /home/4pf/repos/petsc/src/ts/interface/ts.c:3836 > [ERROR] #23 TSSolve() at /home/4pf/repos/petsc/src/ts/interface/ts.c:3836 > [ERROR] PetscSolver::solve: TSSolve failed. > [ERROR] PetscSolver::solve: TSSolve failed. > Aborting. > Aborting. > > > > Thanks for the help, > > > *Philip Fackler * > Research Software Engineer, Application Engineering Group > Advanced Computing Systems Research Section > Computer Science and Mathematics Division > *Oak Ridge National Laboratory* > > -------------- next part -------------- An HTML attachment was scrubbed... 
URL: 

From junchao.zhang at gmail.com  Sun Nov 20 12:31:28 2022
From: junchao.zhang at gmail.com (Junchao Zhang)
Date: Sun, 20 Nov 2022 12:31:28 -0600
Subject: [petsc-users] [EXTERNAL] Re: Using multiple MPI ranks with COO
 interface crashes in some cases
In-Reply-To: 
References: 
Message-ID: 

Hi, Mark,
  On Perlmutter, I have

export MPICH_GPU_SUPPORT_ENABLED=1
module load cudatoolkit
module load PrgEnv-gnu
module load craype-accel-nvidia80

$ module list
Currently Loaded Modules:
  1) craype-x86-milan    4) perftools-base/22.06.0                  7) xalt/2.10.2              10) Nsight-Systems/2022.2.1  13) cray-dsmml/0.2.2   16) PrgEnv-gnu/8.3.3
  2) libfabric/1.15.0.0  5) xpmem/2.4.4-2.3_13.8__gff0e1d9.shasta   8) gpu/1.0                  11) cudatoolkit/11.7         14) cray-mpich/8.1.17  17) craype-accel-nvidia80
  3) craype-network-ofi  6) gcc/11.2.0                              9) Nsight-Compute/2022.1.1  12) craype/2.7.16            15) cray-libsci/21.08.1.2  18) cmake/3.22.0

And my petsc configure is as follows, and I can build petsc with it. As far as I know, the only problem on Perlmutter is that kokkos-kernels (KK) fails to find the TPL; turning the TPL off is a workaround.

  '--with-debugging',
  '--with-cc=cc',
  '--with-cxx=CC',
  '--with-fc=ftn',
  '--download-sowing-cc=cc', # cc might be nvc
  '--CFLAGS=-g -O0',
  '--FFLAGS=-g -O0',
  '--CXXFLAGS=-g -O0',
  '--with-cuda',
  '--with-cudac=nvcc',
  '--download-kokkos',
  '--download-kokkos-kernels',
  '--download-kokkos-commit=origin/develop',
  '--download-kokkos-kernels-commit=origin/develop',
  '--with-kokkos-kernels-tpl=0',

--Junchao Zhang

On Wed, Nov 16, 2022 at 7:05 AM Mark Adams wrote:

> I cannot build right now on Crusher or Perlmutter but I saw this on both.
>
> Here is an example output using src/snes/tests/ex13.c using the appended
> .petscrc
> This uses 64 processors and the 8 processor case worked. This has been
> semi-nondeterministic for me.
> > (and I have attached my current Perlmutter problem) > > Hope this helps, > Mark > > -dm_plex_simplex 0 > -dm_plex_dim 3 > -dm_plex_box_lower 0,0,0 > -dm_plex_box_upper 1,1,1 > -petscpartitioner_simple_process_grid 2,2,2 > -potential_petscspace_degree 2 > -snes_max_it 1 > -ksp_max_it 200 > -ksp_type cg > -ksp_rtol 1.e-12 > -ksp_norm_type unpreconditioned > -snes_rtol 1.e-8 > #-pc_type gamg > #-pc_gamg_type agg > #-pc_gamg_agg_nsmooths 1 > -pc_gamg_coarse_eq_limit 100 > -pc_gamg_process_eq_limit 400 > -pc_gamg_reuse_interpolation true > #-snes_monitor > #-ksp_monitor_short > -ksp_converged_reason > #-ksp_view > #-snes_converged_reason > #-mg_levels_ksp_max_it 2 > -mg_levels_ksp_type chebyshev > #-mg_levels_ksp_type richardson > #-mg_levels_ksp_richardson_scale 0.8 > -mg_levels_pc_type jacobi > -pc_gamg_esteig_ksp_type cg > -pc_gamg_esteig_ksp_max_it 10 > -mg_levels_ksp_chebyshev_esteig 0,0.05,0,1.05 > -dm_distribute > -petscpartitioner_type simple > -pc_gamg_repartition false > -pc_gamg_coarse_grid_layout_type compact > -pc_gamg_threshold 0.01 > #-pc_gamg_threshold_scale .5 > -pc_gamg_aggressive_coarsening 1 > #-check_pointer_intensity 0 > -snes_type ksponly > #-mg_coarse_sub_pc_factor_mat_solver_type cusparse > #-info :pc > #-use_gpu_aware_mpi 1 > -options_left > #-malloc_debug > -benchmark_it 10 > #-pc_gamg_use_parallel_coarse_grid_solver > #-mg_coarse_pc_type jacobi > #-mg_coarse_ksp_type cg > #-mg_coarse_ksp_rtol 1.e-2 > #-mat_cusparse_transgen > -snes_lag_jacobian -2 > > > On Tue, Nov 15, 2022 at 3:42 PM Junchao Zhang > wrote: > >> Mark, >> Do you have a reproducer using petsc examples? >> >> On Tue, Nov 15, 2022, 12:49 PM Mark Adams wrote: >> >>> Junchao, this is the same problem that I have been having right? 
>>> >>> On Tue, Nov 15, 2022 at 11:56 AM Fackler, Philip via petsc-users < >>> petsc-users at mcs.anl.gov> wrote: >>> >>>> I built petsc with: >>>> >>>> $ ./configure PETSC_DIR=$PWD PETSC_ARCH=arch-kokkos-serial-debug >>>> --with-cc=mpicc --with-cxx=mpicxx --with-fc=0 --with-debugging=0 >>>> --prefix=$HOME/build/petsc/debug/install --with-64-bit-indices >>>> --with-shared-libraries --COPTFLAGS=-O3 --CXXOPTFLAGS=-O3 --download-kokkos >>>> --download-kokkos-kernels >>>> >>>> $ make PETSC_DIR=$PWD PETSC_ARCH=arch-kokkos-serial-debug all >>>> >>>> $ make PETSC_DIR=$PWD PETSC_ARCH=arch-kokkos-serial-debug install >>>> >>>> >>>> Then I build xolotl in a separate build directory (after checking out >>>> the "feature-petsc-kokkos" branch) with: >>>> >>>> $ cmake -DCMAKE_BUILD_TYPE=Debug >>>> -DKokkos_DIR=$HOME/build/petsc/debug/install >>>> -DPETSC_DIR=$HOME/build/petsc/debug/install >>>> >>>> $ make -j4 SystemTester >>>> >>>> >>>> Then, from the xolotl build directory, run (for example): >>>> >>>> $ mpirun -n 2 ./test/system/SystemTester -t System/NE_4 -- -v >>>> >>>> Note that this test case will use the parameter file >>>> '/benchmarks/params_system_NE_4.txt' which has the command-line >>>> arguments for petsc in its "petscArgs=..." line. If you look at >>>> '/test/system/SystemTester.cpp' all the system test cases >>>> follow the same naming convention with their corresponding parameter files >>>> under '/benchmarks'. >>>> >>>> The failure happens with the NE_4 case (which is 2D) and the PSI_3 case >>>> (which is 1D). >>>> >>>> Let me know if this is still unclear. 
>>>> >>>> Thanks, >>>> >>>> >>>> *Philip Fackler * >>>> Research Software Engineer, Application Engineering Group >>>> Advanced Computing Systems Research Section >>>> Computer Science and Mathematics Division >>>> *Oak Ridge National Laboratory* >>>> ------------------------------ >>>> *From:* Junchao Zhang >>>> *Sent:* Tuesday, November 15, 2022 00:16 >>>> *To:* Fackler, Philip >>>> *Cc:* petsc-users at mcs.anl.gov ; Blondel, >>>> Sophie >>>> *Subject:* [EXTERNAL] Re: [petsc-users] Using multiple MPI ranks with >>>> COO interface crashes in some cases >>>> >>>> Hi, Philip, >>>> Can you tell me instructions to build Xolotl to reproduce the error? >>>> --Junchao Zhang >>>> >>>> >>>> On Mon, Nov 14, 2022 at 12:24 PM Fackler, Philip via petsc-users < >>>> petsc-users at mcs.anl.gov> wrote: >>>> >>>> In Xolotl's "feature-petsc-kokkos" branch, I have moved our code to use >>>> the COO interface for preallocating and setting values in the Jacobian >>>> matrix. I have found that with some of our test cases, using more than one >>>> MPI rank results in a crash. Way down in the preconditioner code in petsc a >>>> Mat gets computed that has "null" for the "productsymbolic" member of its >>>> "ops". It's pretty far removed from where we compute the Jacobian entries, >>>> so I haven't been able (so far) to track it back to an error in my code. >>>> I'd appreciate some help with this from someone who is more familiar with >>>> the petsc guts so we can figure out what I'm doing wrong. (I'm assuming >>>> it's a bug in Xolotl.) >>>> >>>> Note that this is using the kokkos backend for Mat and Vec in petsc, >>>> but with a serial-only build of kokkos and kokkos-kernels. So, it's a >>>> CPU-only multiple MPI rank run. 
>>>> >>>> Here's a paste of the error output showing the relevant parts of the >>>> call stack: >>>> >>>> [ERROR] [0]PETSC ERROR: >>>> [ERROR] --------------------- Error Message >>>> -------------------------------------------------------------- >>>> [ERROR] [1]PETSC ERROR: >>>> [ERROR] --------------------- Error Message >>>> -------------------------------------------------------------- >>>> [ERROR] [1]PETSC ERROR: >>>> [ERROR] [0]PETSC ERROR: >>>> [ERROR] No support for this operation for this object type >>>> [ERROR] [1]PETSC ERROR: >>>> [ERROR] No support for this operation for this object type >>>> [ERROR] [0]PETSC ERROR: >>>> [ERROR] No method productsymbolic for Mat of type (null) >>>> [ERROR] No method productsymbolic for Mat of type (null) >>>> [ERROR] [0]PETSC ERROR: >>>> [ERROR] [1]PETSC ERROR: >>>> [ERROR] See hxxps://petsc.org/release/faq/ for trouble shooting. >>>> [ERROR] See hxxps://petsc.org/release/faq/ for trouble shooting. >>>> [ERROR] [0]PETSC ERROR: >>>> [ERROR] [1]PETSC ERROR: >>>> [ERROR] Petsc Development GIT revision: v3.18.1-115-gdca010e0e9a GIT >>>> Date: 2022-10-28 14:39:41 +0000 >>>> [ERROR] Petsc Development GIT revision: v3.18.1-115-gdca010e0e9a GIT >>>> Date: 2022-10-28 14:39:41 +0000 >>>> [ERROR] [1]PETSC ERROR: >>>> [ERROR] [0]PETSC ERROR: >>>> [ERROR] Unknown Name on a named PC0115427 by 4pf Mon Nov 14 13:22:01 >>>> 2022 >>>> [ERROR] Unknown Name on a named PC0115427 by 4pf Mon Nov 14 13:22:01 >>>> 2022 >>>> [ERROR] [1]PETSC ERROR: >>>> [ERROR] [0]PETSC ERROR: >>>> [ERROR] Configure options PETSC_DIR=/home/4pf/repos/petsc >>>> PETSC_ARCH=arch-kokkos-serial-debug --with-debugging=1 --with-cc=mpicc >>>> --with-cxx=mpicxx --with-fc=0 --with-cudac=0 >>>> --prefix=/home/4pf/build/petsc/serial-debug/install --with-64-bit-indices >>>> --with-shared-libraries >>>> --with-kokkos-dir=/home/4pf/build/kokkos/serial/install >>>> --with-kokkos-kernels-dir=/home/4pf/build/kokkos-kernels/serial/install >>>> [ERROR] Configure options 
PETSC_DIR=/home/4pf/repos/petsc >>>> PETSC_ARCH=arch-kokkos-serial-debug --with-debugging=1 --with-cc=mpicc >>>> --with-cxx=mpicxx --with-fc=0 --with-cudac=0 >>>> --prefix=/home/4pf/build/petsc/serial-debug/install --with-64-bit-indices >>>> --with-shared-libraries >>>> --with-kokkos-dir=/home/4pf/build/kokkos/serial/install >>>> --with-kokkos-kernels-dir=/home/4pf/build/kokkos-kernels/serial/install >>>> [ERROR] [1]PETSC ERROR: >>>> [ERROR] [0]PETSC ERROR: >>>> [ERROR] #1 MatProductSymbolic_MPIAIJKokkos_AB() at >>>> /home/4pf/repos/petsc/src/mat/impls/aij/mpi/kokkos/mpiaijkok.kokkos.cxx:918 >>>> [ERROR] #1 MatProductSymbolic_MPIAIJKokkos_AB() at >>>> /home/4pf/repos/petsc/src/mat/impls/aij/mpi/kokkos/mpiaijkok.kokkos.cxx:918 >>>> [ERROR] [1]PETSC ERROR: >>>> [ERROR] [0]PETSC ERROR: >>>> [ERROR] #2 MatProductSymbolic_MPIAIJKokkos() at >>>> /home/4pf/repos/petsc/src/mat/impls/aij/mpi/kokkos/mpiaijkok.kokkos.cxx:1138 >>>> [ERROR] #2 MatProductSymbolic_MPIAIJKokkos() at >>>> /home/4pf/repos/petsc/src/mat/impls/aij/mpi/kokkos/mpiaijkok.kokkos.cxx:1138 >>>> [ERROR] [1]PETSC ERROR: >>>> [ERROR] [0]PETSC ERROR: >>>> [ERROR] #3 MatProductSymbolic() at >>>> /home/4pf/repos/petsc/src/mat/interface/matproduct.c:793 >>>> [ERROR] #3 MatProductSymbolic() at >>>> /home/4pf/repos/petsc/src/mat/interface/matproduct.c:793 >>>> [ERROR] [1]PETSC ERROR: >>>> [ERROR] [0]PETSC ERROR: >>>> [ERROR] #4 MatProduct_Private() at >>>> /home/4pf/repos/petsc/src/mat/interface/matrix.c:9820 >>>> [ERROR] #4 MatProduct_Private() at >>>> /home/4pf/repos/petsc/src/mat/interface/matrix.c:9820 >>>> [ERROR] [0]PETSC ERROR: >>>> [ERROR] [1]PETSC ERROR: >>>> [ERROR] #5 MatMatMult() at >>>> /home/4pf/repos/petsc/src/mat/interface/matrix.c:9897 >>>> [ERROR] #5 MatMatMult() at >>>> /home/4pf/repos/petsc/src/mat/interface/matrix.c:9897 >>>> [ERROR] [0]PETSC ERROR: >>>> [ERROR] [1]PETSC ERROR: >>>> [ERROR] #6 PCGAMGOptProlongator_AGG() at >>>> /home/4pf/repos/petsc/src/ksp/pc/impls/gamg/agg.c:769 >>>> [ERROR] 
#6 PCGAMGOptProlongator_AGG() at >>>> /home/4pf/repos/petsc/src/ksp/pc/impls/gamg/agg.c:769 >>>> [ERROR] [0]PETSC ERROR: >>>> [ERROR] [1]PETSC ERROR: >>>> [ERROR] #7 PCSetUp_GAMG() at >>>> /home/4pf/repos/petsc/src/ksp/pc/impls/gamg/gamg.c:639 >>>> [ERROR] #7 PCSetUp_GAMG() at >>>> /home/4pf/repos/petsc/src/ksp/pc/impls/gamg/gamg.c:639 >>>> [ERROR] [1]PETSC ERROR: >>>> [ERROR] [0]PETSC ERROR: >>>> [ERROR] #8 PCSetUp() at >>>> /home/4pf/repos/petsc/src/ksp/pc/interface/precon.c:994 >>>> [ERROR] #8 PCSetUp() at >>>> /home/4pf/repos/petsc/src/ksp/pc/interface/precon.c:994 >>>> [ERROR] [1]PETSC ERROR: >>>> [ERROR] [0]PETSC ERROR: >>>> [ERROR] #9 KSPSetUp() at >>>> /home/4pf/repos/petsc/src/ksp/ksp/interface/itfunc.c:406 >>>> [ERROR] #9 KSPSetUp() at >>>> /home/4pf/repos/petsc/src/ksp/ksp/interface/itfunc.c:406 >>>> [ERROR] [1]PETSC ERROR: >>>> [ERROR] [0]PETSC ERROR: >>>> [ERROR] #10 KSPSolve_Private() at >>>> /home/4pf/repos/petsc/src/ksp/ksp/interface/itfunc.c:825 >>>> [ERROR] #10 KSPSolve_Private() at >>>> /home/4pf/repos/petsc/src/ksp/ksp/interface/itfunc.c:825 >>>> [ERROR] [0]PETSC ERROR: >>>> [ERROR] [1]PETSC ERROR: >>>> [ERROR] #11 KSPSolve() at >>>> /home/4pf/repos/petsc/src/ksp/ksp/interface/itfunc.c:1071 >>>> [ERROR] #11 KSPSolve() at >>>> /home/4pf/repos/petsc/src/ksp/ksp/interface/itfunc.c:1071 >>>> [ERROR] [1]PETSC ERROR: >>>> [ERROR] [0]PETSC ERROR: >>>> [ERROR] #12 PCApply_FieldSplit() at >>>> /home/4pf/repos/petsc/src/ksp/pc/impls/fieldsplit/fieldsplit.c:1246 >>>> [ERROR] #12 PCApply_FieldSplit() at >>>> /home/4pf/repos/petsc/src/ksp/pc/impls/fieldsplit/fieldsplit.c:1246 >>>> [ERROR] [1]PETSC ERROR: >>>> [ERROR] [0]PETSC ERROR: >>>> [ERROR] #13 PCApply() at >>>> /home/4pf/repos/petsc/src/ksp/pc/interface/precon.c:441 >>>> [ERROR] #13 PCApply() at >>>> /home/4pf/repos/petsc/src/ksp/pc/interface/precon.c:441 >>>> [ERROR] [1]PETSC ERROR: >>>> [ERROR] [0]PETSC ERROR: >>>> [ERROR] #14 KSP_PCApply() at >>>> 
/home/4pf/repos/petsc/include/petsc/private/kspimpl.h:380 >>>> [ERROR] #14 KSP_PCApply() at >>>> /home/4pf/repos/petsc/include/petsc/private/kspimpl.h:380 >>>> [ERROR] [1]PETSC ERROR: >>>> [ERROR] [0]PETSC ERROR: >>>> [ERROR] #15 KSPFGMRESCycle() at >>>> /home/4pf/repos/petsc/src/ksp/ksp/impls/gmres/fgmres/fgmres.c:152 >>>> [ERROR] #15 KSPFGMRESCycle() at >>>> /home/4pf/repos/petsc/src/ksp/ksp/impls/gmres/fgmres/fgmres.c:152 >>>> [ERROR] [1]PETSC ERROR: >>>> [ERROR] [0]PETSC ERROR: >>>> [ERROR] #16 KSPSolve_FGMRES() at >>>> /home/4pf/repos/petsc/src/ksp/ksp/impls/gmres/fgmres/fgmres.c:273 >>>> [ERROR] #16 KSPSolve_FGMRES() at >>>> /home/4pf/repos/petsc/src/ksp/ksp/impls/gmres/fgmres/fgmres.c:273 >>>> [ERROR] [1]PETSC ERROR: >>>> [ERROR] [0]PETSC ERROR: >>>> [ERROR] #17 KSPSolve_Private() at >>>> /home/4pf/repos/petsc/src/ksp/ksp/interface/itfunc.c:899 >>>> [ERROR] #17 KSPSolve_Private() at >>>> /home/4pf/repos/petsc/src/ksp/ksp/interface/itfunc.c:899 >>>> [ERROR] [0]PETSC ERROR: >>>> [ERROR] [1]PETSC ERROR: >>>> [ERROR] #18 KSPSolve() at >>>> /home/4pf/repos/petsc/src/ksp/ksp/interface/itfunc.c:1071 >>>> [ERROR] #18 KSPSolve() at >>>> /home/4pf/repos/petsc/src/ksp/ksp/interface/itfunc.c:1071 >>>> [ERROR] [0]PETSC ERROR: >>>> [ERROR] [1]PETSC ERROR: >>>> [ERROR] #19 SNESSolve_NEWTONLS() at >>>> /home/4pf/repos/petsc/src/snes/impls/ls/ls.c:210 >>>> [ERROR] #19 SNESSolve_NEWTONLS() at >>>> /home/4pf/repos/petsc/src/snes/impls/ls/ls.c:210 >>>> [ERROR] [1]PETSC ERROR: >>>> [ERROR] [0]PETSC ERROR: >>>> [ERROR] #20 SNESSolve() at >>>> /home/4pf/repos/petsc/src/snes/interface/snes.c:4689 >>>> [ERROR] #20 SNESSolve() at >>>> /home/4pf/repos/petsc/src/snes/interface/snes.c:4689 >>>> [ERROR] [1]PETSC ERROR: >>>> [ERROR] [0]PETSC ERROR: >>>> [ERROR] #21 TSStep_ARKIMEX() at >>>> /home/4pf/repos/petsc/src/ts/impls/arkimex/arkimex.c:791 >>>> [ERROR] #21 TSStep_ARKIMEX() at >>>> /home/4pf/repos/petsc/src/ts/impls/arkimex/arkimex.c:791 >>>> [ERROR] [1]PETSC ERROR: >>>> [ERROR] 
[0]PETSC ERROR: >>>> [ERROR] #22 TSStep() at /home/4pf/repos/petsc/src/ts/interface/ts.c:3445 >>>> [ERROR] #22 TSStep() at /home/4pf/repos/petsc/src/ts/interface/ts.c:3445 >>>> [ERROR] [1]PETSC ERROR: >>>> [ERROR] [0]PETSC ERROR: >>>> [ERROR] #23 TSSolve() at >>>> /home/4pf/repos/petsc/src/ts/interface/ts.c:3836 >>>> [ERROR] #23 TSSolve() at >>>> /home/4pf/repos/petsc/src/ts/interface/ts.c:3836 >>>> [ERROR] PetscSolver::solve: TSSolve failed. >>>> [ERROR] PetscSolver::solve: TSSolve failed. >>>> Aborting. >>>> Aborting. >>>> >>>> >>>> >>>> Thanks for the help, >>>> >>>> >>>> *Philip Fackler * >>>> Research Software Engineer, Application Engineering Group >>>> Advanced Computing Systems Research Section >>>> Computer Science and Mathematics Division >>>> *Oak Ridge National Laboratory* >>>> >>>> -------------- next part -------------- An HTML attachment was scrubbed... URL: From facklerpw at ornl.gov Mon Nov 21 09:31:16 2022 From: facklerpw at ornl.gov (Fackler, Philip) Date: Mon, 21 Nov 2022 15:31:16 +0000 Subject: [petsc-users] [EXTERNAL] Re: Using multiple MPI ranks with COO interface crashes in some cases In-Reply-To: References: Message-ID: Not sure why. I'm using the same compiler. 
But you can try constructing the object explicitly on that line: idPairs.push_back(core::RowColPair{i, i}); Philip Fackler Research Software Engineer, Application Engineering Group Advanced Computing Systems Research Section Computer Science and Mathematics Division Oak Ridge National Laboratory ________________________________ From: Junchao Zhang Sent: Sunday, November 20, 2022 13:25 To: Fackler, Philip Cc: petsc-users at mcs.anl.gov ; Blondel, Sophie Subject: Re: [EXTERNAL] Re: [petsc-users] Using multiple MPI ranks with COO interface crashes in some cases On Tue, Nov 15, 2022 at 10:55 AM Fackler, Philip > wrote: I built petsc with: $ ./configure PETSC_DIR=$PWD PETSC_ARCH=arch-kokkos-serial-debug --with-cc=mpicc --with-cxx=mpicxx --with-fc=0 --with-debugging=0 --prefix=$HOME/build/petsc/debug/install --with-64-bit-indices --with-shared-libraries --COPTFLAGS=-O3 --CXXOPTFLAGS=-O3 --download-kokkos --download-kokkos-kernels $ make PETSC_DIR=$PWD PETSC_ARCH=arch-kokkos-serial-debug all $ make PETSC_DIR=$PWD PETSC_ARCH=arch-kokkos-serial-debug install Then I build xolotl in a separate build directory (after checking out the "feature-petsc-kokkos" branch) with: $ cmake -DCMAKE_BUILD_TYPE=Debug -DKokkos_DIR=$HOME/build/petsc/debug/install -DPETSC_DIR=$HOME/build/petsc/debug/install $ make -j4 SystemTester Hi, Philip, I tried multiple times and still failed at building xolotl. I installed boost-1.74 and HDF5, and used gcc-11.3. make -j4 SystemTester ... [ 9%] Building CXX object xolotl/core/CMakeFiles/xolotlCore.dir/src/diffusion/DiffusionHandler.cpp.o /home/jczhang/xolotl/xolotl/core/src/diffusion/DiffusionHandler.cpp(55): error: no instance of overloaded function "std::vector<_Tp, _Alloc>::push_back [with _Tp=xolotl::core::RowColPair, _Alloc=std::allocator]" matches the argument list argument types are: ({...}) object type is: std::vector> 1 error detected in the compilation of "/home/jczhang/xolotl/xolotl/core/src/diffusion/DiffusionHandler.cpp". 
Then, from the xolotl build directory, run (for example): $ mpirun -n 2 ./test/system/SystemTester -t System/NE_4 -- -v Note that this test case will use the parameter file '/benchmarks/params_system_NE_4.txt' which has the command-line arguments for petsc in its "petscArgs=..." line. If you look at '/test/system/SystemTester.cpp' all the system test cases follow the same naming convention with their corresponding parameter files under '/benchmarks'. The failure happens with the NE_4 case (which is 2D) and the PSI_3 case (which is 1D). Let me know if this is still unclear. Thanks, Philip Fackler Research Software Engineer, Application Engineering Group Advanced Computing Systems Research Section Computer Science and Mathematics Division Oak Ridge National Laboratory ________________________________ From: Junchao Zhang > Sent: Tuesday, November 15, 2022 00:16 To: Fackler, Philip > Cc: petsc-users at mcs.anl.gov >; Blondel, Sophie > Subject: [EXTERNAL] Re: [petsc-users] Using multiple MPI ranks with COO interface crashes in some cases Hi, Philip, Can you tell me instructions to build Xolotl to reproduce the error? --Junchao Zhang On Mon, Nov 14, 2022 at 12:24 PM Fackler, Philip via petsc-users > wrote: In Xolotl's "feature-petsc-kokkos" branch, I have moved our code to use the COO interface for preallocating and setting values in the Jacobian matrix. I have found that with some of our test cases, using more than one MPI rank results in a crash. Way down in the preconditioner code in petsc a Mat gets computed that has "null" for the "productsymbolic" member of its "ops". It's pretty far removed from where we compute the Jacobian entries, so I haven't been able (so far) to track it back to an error in my code. I'd appreciate some help with this from someone who is more familiar with the petsc guts so we can figure out what I'm doing wrong. (I'm assuming it's a bug in Xolotl.) 
Note that this is using the kokkos backend for Mat and Vec in petsc, but with a serial-only build of kokkos and kokkos-kernels. So, it's a CPU-only multiple MPI rank run. Here's a paste of the error output showing the relevant parts of the call stack: [ERROR] [0]PETSC ERROR: [ERROR] --------------------- Error Message -------------------------------------------------------------- [ERROR] [1]PETSC ERROR: [ERROR] --------------------- Error Message -------------------------------------------------------------- [ERROR] [1]PETSC ERROR: [ERROR] [0]PETSC ERROR: [ERROR] No support for this operation for this object type [ERROR] [1]PETSC ERROR: [ERROR] No support for this operation for this object type [ERROR] [0]PETSC ERROR: [ERROR] No method productsymbolic for Mat of type (null) [ERROR] No method productsymbolic for Mat of type (null) [ERROR] [0]PETSC ERROR: [ERROR] [1]PETSC ERROR: [ERROR] See hxxps://petsc.org/release/faq/ for trouble shooting. [ERROR] See hxxps://petsc.org/release/faq/ for trouble shooting. 
[0]PETSC ERROR: [ERROR] #21 TSStep_ARKIMEX() at /home/4pf/repos/petsc/src/ts/impls/arkimex/arkimex.c:791 [ERROR] #21 TSStep_ARKIMEX() at /home/4pf/repos/petsc/src/ts/impls/arkimex/arkimex.c:791 [ERROR] [1]PETSC ERROR: [ERROR] [0]PETSC ERROR: [ERROR] #22 TSStep() at /home/4pf/repos/petsc/src/ts/interface/ts.c:3445 [ERROR] #22 TSStep() at /home/4pf/repos/petsc/src/ts/interface/ts.c:3445 [ERROR] [1]PETSC ERROR: [ERROR] [0]PETSC ERROR: [ERROR] #23 TSSolve() at /home/4pf/repos/petsc/src/ts/interface/ts.c:3836 [ERROR] #23 TSSolve() at /home/4pf/repos/petsc/src/ts/interface/ts.c:3836 [ERROR] PetscSolver::solve: TSSolve failed. [ERROR] PetscSolver::solve: TSSolve failed. Aborting. Aborting. Thanks for the help, Philip Fackler Research Software Engineer, Application Engineering Group Advanced Computing Systems Research Section Computer Science and Mathematics Division Oak Ridge National Laboratory -------------- next part -------------- An HTML attachment was scrubbed... URL: From narnoldm at umich.edu Mon Nov 21 13:16:33 2022 From: narnoldm at umich.edu (Nicholas Arnold-Medabalimi) Date: Mon, 21 Nov 2022 14:16:33 -0500 Subject: [petsc-users] Petsc Fortran Memory stack trace Message-ID: Hi Petsc users I'm working on an integration of Petsc into an existing fortran code. Most of my memory debugging is very primitive and is usually accomplished using the -check bounds option in the compiler. However with Petsc attached the stack trace becomes much more opaque compared to the original code. At least as far as I can tell the error becomes much harder to pin down (just pointing to libpetsc.so). Any assistance in getting more informative error messages or checks would be much appreciated. Sincerely Nicholas -- Nicholas Arnold-Medabalimi Ph.D. Candidate Computational Aeroscience Lab University of Michigan -------------- next part -------------- An HTML attachment was scrubbed... 
URL: 

From balay at mcs.anl.gov Mon Nov 21 13:27:13 2022
From: balay at mcs.anl.gov (Satish Balay)
Date: Mon, 21 Nov 2022 13:27:13 -0600 (CST)
Subject: [petsc-users] Petsc Fortran Memory stack trace
In-Reply-To: 
References: 
Message-ID: <59b30d96-d216-2220-1d0a-56eba57150b0@mcs.anl.gov>

valgrind is a useful tool to learn to use..

valgrind --tool=memcheck ./executable

Satish

On Mon, 21 Nov 2022, Nicholas Arnold-Medabalimi wrote:

> Hi Petsc users
>
> I'm working on an integration of Petsc into an existing fortran code. Most
> of my memory debugging is very primitive and is usually accomplished using
> the -check bounds option in the compiler. However with Petsc attached the
> stack trace becomes much more opaque compared to the original code. At
> least as far as I can tell the error becomes much harder to pin down (just
> pointing to libpetsc.so). Any assistance in getting more informative error
> messages or checks would be much appreciated.
>
> Sincerely
> Nicholas

From s_g at berkeley.edu Mon Nov 21 13:39:37 2022
From: s_g at berkeley.edu (Sanjay Govindjee)
Date: Mon, 21 Nov 2022 11:39:37 -0800
Subject: [petsc-users] Petsc Fortran Memory stack trace
In-Reply-To: <59b30d96-d216-2220-1d0a-56eba57150b0@mcs.anl.gov>
References: <59b30d96-d216-2220-1d0a-56eba57150b0@mcs.anl.gov>
Message-ID: 

Other options I have found useful:

-v --leak-check=full --show-reachable=yes

On 11/21/22 11:27 AM, Satish Balay via petsc-users wrote:
> valgrind is a useful tool to learn to use..
>
> valgrind --tool=memcheck ./executable
>
> Satish
>
> On Mon, 21 Nov 2022, Nicholas Arnold-Medabalimi wrote:
>
>> Hi Petsc users
>>
>> I'm working on an integration of Petsc into an existing fortran code. Most
>> of my memory debugging is very primitive and is usually accomplished using
>> the -check bounds option in the compiler. However with Petsc attached the
>> stack trace becomes much more opaque compared to the original code. At
>> least as far as I can tell the error becomes much harder to pin down (just
>> pointing to libpetsc.so). Any assistance in getting more informative error
>> messages or checks would be much appreciated.
>>
>> Sincerely
>> Nicholas

From narnoldm at umich.edu Mon Nov 21 13:43:20 2022
From: narnoldm at umich.edu (Nicholas Arnold-Medabalimi)
Date: Mon, 21 Nov 2022 14:43:20 -0500
Subject: [petsc-users] Petsc Fortran Memory stack trace
In-Reply-To: 
References: <59b30d96-d216-2220-1d0a-56eba57150b0@mcs.anl.gov>
Message-ID: 

I have been using valgrind with the mem checker; I should have mentioned that. My question was probably ill posed. I'm asking more about how linking petsc affects the stack trace provided by the compiler-side checks. Valgrind is great but sometimes a little ambiguous, whereas the compiler's bounds checks will usually be more specific, so I was curious if there is a way to change how petsc affects the stack trace.

Thanks

On Mon, Nov 21, 2022 at 2:39 PM Sanjay Govindjee wrote:

> Other options I have found useful:
>
> -v --leak-check=full --show-reachable=yes
>
> On 11/21/22 11:27 AM, Satish Balay via petsc-users wrote:
> > valgrind is a useful tool to learn to use..
> >
> > valgrind --tool=memcheck ./executable
> >
> > Satish
> >
> > On Mon, 21 Nov 2022, Nicholas Arnold-Medabalimi wrote:
> >
> >> Hi Petsc users
> >>
> >> I'm working on an integration of Petsc into an existing fortran code. Most
> >> of my memory debugging is very primitive and is usually accomplished using
> >> the -check bounds option in the compiler. However with Petsc attached the
> >> stack trace becomes much more opaque compared to the original code. At
> >> least as far as I can tell the error becomes much harder to pin down (just
> >> pointing to libpetsc.so). Any assistance in getting more informative error
> >> messages or checks would be much appreciated.
> >>
> >> Sincerely
> >> Nicholas

--
Nicholas Arnold-Medabalimi
Ph.D. Candidate
Computational Aeroscience Lab
University of Michigan
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From knepley at gmail.com Mon Nov 21 13:46:38 2022
From: knepley at gmail.com (Matthew Knepley)
Date: Mon, 21 Nov 2022 14:46:38 -0500
Subject: [petsc-users] Petsc Fortran Memory stack trace
In-Reply-To: 
References: <59b30d96-d216-2220-1d0a-56eba57150b0@mcs.anl.gov>
Message-ID: 

On Mon, Nov 21, 2022 at 2:44 PM Nicholas Arnold-Medabalimi <narnoldm at umich.edu> wrote:

> I have been using valgrind with the mem checker. I should have mentioned
> that. My question was probably ill posed. I'm more asking about is how
> linking petsc affects the stack trace provided by the compiler side checks.
> Valgrind is great but sometimes is a little ambiguous whereas the compile
> side check bounds will usually be more specific so I was curious if there
> is a way to change the petsc stack trace effect.

I am not sure I understand. Valgrind should indicate precisely the line number, unless you have not compiled/linked with debugging symbols.

  Thanks,

     Matt

> Thanks
>
> On Mon, Nov 21, 2022 at 2:39 PM Sanjay Govindjee wrote:
>
>> Other options I have found useful:
>>
>> -v --leak-check=full --show-reachable=yes
>>
>> On 11/21/22 11:27 AM, Satish Balay via petsc-users wrote:
>> > valgrind is a useful tool to learn to use..
>> >
>> > valgrind --tool=memcheck ./executable
>> >
>> > Satish
>> >
>> > On Mon, 21 Nov 2022, Nicholas Arnold-Medabalimi wrote:
>> >
>> >> Hi Petsc users
>> >>
>> >> I'm working on an integration of Petsc into an existing fortran code. Most
>> >> of my memory debugging is very primitive and is usually accomplished using
>> >> the -check bounds option in the compiler. However with Petsc attached the
>> >> stack trace becomes much more opaque compared to the original code. At
>> >> least as far as I can tell the error becomes much harder to pin down (just
>> >> pointing to libpetsc.so). Any assistance in getting more informative error
>> >> messages or checks would be much appreciated.
>> >>
>> >> Sincerely
>> >> Nicholas
>
> --
> Nicholas Arnold-Medabalimi
>
> Ph.D. Candidate
> Computational Aeroscience Lab
> University of Michigan

--
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From bsmith at petsc.dev Mon Nov 21 14:17:48 2022
From: bsmith at petsc.dev (Barry Smith)
Date: Mon, 21 Nov 2022 15:17:48 -0500
Subject: [petsc-users] Petsc Fortran Memory stack trace
In-Reply-To: 
References: 
Message-ID: 

My understanding of Fortran bounds checking is that before each array access in Fortran it checks to see if the index is valid for the array you are accessing; that is, it is from start to end if you had declared the array as

double precision, dimension (start:end) :: A

It should also work if the array is a Fortran allocatable array or if you obtain the array from PETSc with VecGetArrayF90() and all its friends and relations.

PETSc should not change the behavior above.

Now if there is memory corruption (or some other error) somewhere else (like in PETSc, or a more subtle problem in your code) rather than a simple array out of bounds, then yes, you can get more complicated error messages that would usually include the PETSc stack trace.

Instead of using valgrind you can also try running the PETSc program with -malloc_debug; this is sort of a poor person's version of valgrind but can sometimes provide more useful information than valgrind.

When debugging always make sure PETSc was NOT ./configure'd with --with-debugging=0.

You can send specific error messages that are cryptic to petsc-maint at mcs.anl.gov and we may be able to help decipher them.
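[Putting the suggestions in this thread together, a typical invocation looks something like the following. This is only a sketch: the executable name ./app and source file app.F90 are placeholders, -fcheck=bounds is the gfortran spelling of bounds checking (Intel ifort uses -check bounds instead), and the valgrind options are those suggested above.]

$ mpif90 -g -fcheck=bounds -o app app.F90
$ mpiexec -n 2 valgrind --tool=memcheck --leak-check=full ./app -malloc_debug

[The -g flag keeps debugging symbols so valgrind can report file and line numbers rather than just libpetsc.so, and -malloc_debug is a PETSc runtime option, so it goes after the executable name.]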
Barry > On Nov 21, 2022, at 2:16 PM, Nicholas Arnold-Medabalimi wrote: > > Hi Petsc users > > I'm working on an integration of Petsc into an existing fortran code. Most of my memory debugging is very primitive and is usually accomplished using the -check bounds option in the compiler. However with Petsc attached the stack trace becomes much more opaque compared to the original code. At least as far as I can tell the error becomes much harder to pin down (just pointing to libpetsc.so). Any assistance in getting more informative error messages or checks would be much appreciated. > > Sincerely > Nicholas > > -- > Nicholas Arnold-Medabalimi > > Ph.D. Candidate > Computational Aeroscience Lab > University of Michigan -------------- next part -------------- An HTML attachment was scrubbed... URL: From narnoldm at umich.edu Mon Nov 21 14:23:57 2022 From: narnoldm at umich.edu (Nicholas Arnold-Medabalimi) Date: Mon, 21 Nov 2022 15:23:57 -0500 Subject: [petsc-users] Petsc Fortran Memory stack trace In-Reply-To: References: Message-ID: Thanks for the information, that clarifies quite a bit. Unfortunately I probably have a number of memory issues that are colliding that I need to clean up. Thanks On Mon, Nov 21, 2022 at 3:18 PM Barry Smith wrote: > > My understanding of Fortran bounds checking is that before each array > access in Fortran it checks to see if the index is valid for the array you > are accessing; that is it is from start to end if you had declared the > array as > > double precision, dimension (start:end) :: A > > It should also work if the array is a Fortran allocatable array or if you > obtain the array from PETSc with VecGetArrayF90() and all its friends and > relations. > > PETSc should not change the behavior above. 
> > Now if there is memory corruption (or some other error) somewhere else > (like in PETSc or a more subtle problem in your code) then simply array out > of bounds then yes you can get more complicated error messages that would > usually include the PETSc stack trace. > > Instead of using valgrind you can also try running the PETSc program > with -malloc_debug, this is sort of a poor person's version of valgrind but > can sometimes provide more useful information than valgrind. > > When debugging always make sure PETSc was NOT ./configure with > --with-debugging=0 > > You can send specific error messages that are cryptic to > petsc-maint at mcs.anl.gov and we may be able to help decipher them. > > Barry > > > > > > On Nov 21, 2022, at 2:16 PM, Nicholas Arnold-Medabalimi < > narnoldm at umich.edu> wrote: > > Hi Petsc users > > I'm working on an integration of Petsc into an existing fortran code. Most > of my memory debugging is very primitive and is usually accomplished using > the -check bounds option in the compiler. However with Petsc attached the > stack trace becomes much more opaque compared to the original code. At > least as far as I can tell the error becomes much harder to pin down (just > pointing to libpetsc.so). Any assistance in getting more informative error > messages or checks would be much appreciated. > > Sincerely > Nicholas > > -- > Nicholas Arnold-Medabalimi > > Ph.D. Candidate > Computational Aeroscience Lab > University of Michigan > > > -- Nicholas Arnold-Medabalimi Ph.D. Candidate Computational Aeroscience Lab University of Michigan -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From junchao.zhang at gmail.com Mon Nov 21 14:36:13 2022 From: junchao.zhang at gmail.com (Junchao Zhang) Date: Mon, 21 Nov 2022 14:36:13 -0600 Subject: [petsc-users] [EXTERNAL] Re: Using multiple MPI ranks with COO interface crashes in some cases In-Reply-To: References: Message-ID: On Mon, Nov 21, 2022 at 9:31 AM Fackler, Philip wrote: > Not sure why. I'm using the same compiler. But you can try constructing > the object explicitly on that line: > > idPairs.push_back(core::RowColPair{i, i}); > WIth your change, I continued but met another error: /home/jczhang/xolotl/test/core/diffusion/Diffusion2DHandlerTester.cpp(79): error: class "xolotl::core::diffusion::Diffusion2DHandler" has no member "initializeOFill" it seems all these problems are related to the branch * feature-petsc-kokkos, *instead of the compiler etc. When I switched to origin/stable, I could build xolotl. > > > *Philip Fackler * > Research Software Engineer, Application Engineering Group > Advanced Computing Systems Research Section > Computer Science and Mathematics Division > *Oak Ridge National Laboratory* > ------------------------------ > *From:* Junchao Zhang > *Sent:* Sunday, November 20, 2022 13:25 > *To:* Fackler, Philip > *Cc:* petsc-users at mcs.anl.gov ; Blondel, Sophie < > sblondel at utk.edu> > *Subject:* Re: [EXTERNAL] Re: [petsc-users] Using multiple MPI ranks with > COO interface crashes in some cases > > > > On Tue, Nov 15, 2022 at 10:55 AM Fackler, Philip > wrote: > > I built petsc with: > > $ ./configure PETSC_DIR=$PWD PETSC_ARCH=arch-kokkos-serial-debug > --with-cc=mpicc --with-cxx=mpicxx --with-fc=0 --with-debugging=0 > --prefix=$HOME/build/petsc/debug/install --with-64-bit-indices > --with-shared-libraries --COPTFLAGS=-O3 --CXXOPTFLAGS=-O3 --download-kokkos > --download-kokkos-kernels > > $ make PETSC_DIR=$PWD PETSC_ARCH=arch-kokkos-serial-debug all > > $ make PETSC_DIR=$PWD PETSC_ARCH=arch-kokkos-serial-debug install > > > Then I build xolotl in a separate build 
directory (after checking out the > "feature-petsc-kokkos" branch) with: > > $ cmake -DCMAKE_BUILD_TYPE=Debug > -DKokkos_DIR=$HOME/build/petsc/debug/install > -DPETSC_DIR=$HOME/build/petsc/debug/install > > $ make -j4 SystemTester > > Hi, Philip, I tried multiple times and still failed at building xolotl. > I installed boost-1.74 and HDF5, and used gcc-11.3. > > make -j4 SystemTester > ... > [ 9%] Building CXX object > xolotl/core/CMakeFiles/xolotlCore.dir/src/diffusion/DiffusionHandler.cpp.o > /home/jczhang/xolotl/xolotl/core/src/diffusion/DiffusionHandler.cpp(55): > error: no instance of overloaded function "std::vector<_Tp, > _Alloc>::push_back [with _Tp=xolotl::core::RowColPair, > _Alloc=std::allocator]" matches the argument list > argument types are: ({...}) > object type is: std::vector std::allocator> > > 1 error detected in the compilation of > "/home/jczhang/xolotl/xolotl/core/src/diffusion/DiffusionHandler.cpp". > > > > > Then, from the xolotl build directory, run (for example): > > $ mpirun -n 2 ./test/system/SystemTester -t System/NE_4 -- -v > > Note that this test case will use the parameter file > '/benchmarks/params_system_NE_4.txt' which has the command-line > arguments for petsc in its "petscArgs=..." line. If you look at > '/test/system/SystemTester.cpp' all the system test cases > follow the same naming convention with their corresponding parameter files > under '/benchmarks'. > > The failure happens with the NE_4 case (which is 2D) and the PSI_3 case > (which is 1D). > > Let me know if this is still unclear. 
> > Thanks, > > > *Philip Fackler * > Research Software Engineer, Application Engineering Group > Advanced Computing Systems Research Section > Computer Science and Mathematics Division > *Oak Ridge National Laboratory* > ------------------------------ > *From:* Junchao Zhang > *Sent:* Tuesday, November 15, 2022 00:16 > *To:* Fackler, Philip > *Cc:* petsc-users at mcs.anl.gov ; Blondel, Sophie < > sblondel at utk.edu> > *Subject:* [EXTERNAL] Re: [petsc-users] Using multiple MPI ranks with COO > interface crashes in some cases > > Hi, Philip, > Can you tell me instructions to build Xolotl to reproduce the error? > --Junchao Zhang > > > On Mon, Nov 14, 2022 at 12:24 PM Fackler, Philip via petsc-users < > petsc-users at mcs.anl.gov> wrote: > > In Xolotl's "feature-petsc-kokkos" branch, I have moved our code to use > the COO interface for preallocating and setting values in the Jacobian > matrix. I have found that with some of our test cases, using more than one > MPI rank results in a crash. Way down in the preconditioner code in petsc a > Mat gets computed that has "null" for the "productsymbolic" member of its > "ops". It's pretty far removed from where we compute the Jacobian entries, > so I haven't been able (so far) to track it back to an error in my code. > I'd appreciate some help with this from someone who is more familiar with > the petsc guts so we can figure out what I'm doing wrong. (I'm assuming > it's a bug in Xolotl.) > > Note that this is using the kokkos backend for Mat and Vec in petsc, but > with a serial-only build of kokkos and kokkos-kernels. So, it's a CPU-only > multiple MPI rank run. 
> > Here's a paste of the error output showing the relevant parts of the call > stack: > > [ERROR] [0]PETSC ERROR: > [ERROR] --------------------- Error Message > -------------------------------------------------------------- > [ERROR] [1]PETSC ERROR: > [ERROR] --------------------- Error Message > -------------------------------------------------------------- > [ERROR] [1]PETSC ERROR: > [ERROR] [0]PETSC ERROR: > [ERROR] No support for this operation for this object type > [ERROR] [1]PETSC ERROR: > [ERROR] No support for this operation for this object type > [ERROR] [0]PETSC ERROR: > [ERROR] No method productsymbolic for Mat of type (null) > [ERROR] No method productsymbolic for Mat of type (null) > [ERROR] [0]PETSC ERROR: > [ERROR] [1]PETSC ERROR: > [ERROR] See hxxps://petsc.org/release/faq/ for trouble shooting. > [ERROR] See hxxps://petsc.org/release/faq/ for trouble shooting. > [ERROR] [0]PETSC ERROR: > [ERROR] [1]PETSC ERROR: > [ERROR] Petsc Development GIT revision: v3.18.1-115-gdca010e0e9a GIT > Date: 2022-10-28 14:39:41 +0000 > [ERROR] Petsc Development GIT revision: v3.18.1-115-gdca010e0e9a GIT > Date: 2022-10-28 14:39:41 +0000 > [ERROR] [1]PETSC ERROR: > [ERROR] [0]PETSC ERROR: > [ERROR] Unknown Name on a named PC0115427 by 4pf Mon Nov 14 13:22:01 2022 > [ERROR] Unknown Name on a named PC0115427 by 4pf Mon Nov 14 13:22:01 2022 > [ERROR] [1]PETSC ERROR: > [ERROR] [0]PETSC ERROR: > [ERROR] Configure options PETSC_DIR=/home/4pf/repos/petsc > PETSC_ARCH=arch-kokkos-serial-debug --with-debugging=1 --with-cc=mpicc > --with-cxx=mpicxx --with-fc=0 --with-cudac=0 > --prefix=/home/4pf/build/petsc/serial-debug/install --with-64-bit-indices > --with-shared-libraries > --with-kokkos-dir=/home/4pf/build/kokkos/serial/install > --with-kokkos-kernels-dir=/home/4pf/build/kokkos-kernels/serial/install > [ERROR] Configure options PETSC_DIR=/home/4pf/repos/petsc > PETSC_ARCH=arch-kokkos-serial-debug --with-debugging=1 --with-cc=mpicc > --with-cxx=mpicxx --with-fc=0 
--with-cudac=0 > --prefix=/home/4pf/build/petsc/serial-debug/install --with-64-bit-indices > --with-shared-libraries > --with-kokkos-dir=/home/4pf/build/kokkos/serial/install > --with-kokkos-kernels-dir=/home/4pf/build/kokkos-kernels/serial/install > [ERROR] [1]PETSC ERROR: > [ERROR] [0]PETSC ERROR: > [ERROR] #1 MatProductSymbolic_MPIAIJKokkos_AB() at > /home/4pf/repos/petsc/src/mat/impls/aij/mpi/kokkos/mpiaijkok.kokkos.cxx:918 > [ERROR] #1 MatProductSymbolic_MPIAIJKokkos_AB() at > /home/4pf/repos/petsc/src/mat/impls/aij/mpi/kokkos/mpiaijkok.kokkos.cxx:918 > [ERROR] [1]PETSC ERROR: > [ERROR] [0]PETSC ERROR: > [ERROR] #2 MatProductSymbolic_MPIAIJKokkos() at > /home/4pf/repos/petsc/src/mat/impls/aij/mpi/kokkos/mpiaijkok.kokkos.cxx:1138 > [ERROR] #2 MatProductSymbolic_MPIAIJKokkos() at > /home/4pf/repos/petsc/src/mat/impls/aij/mpi/kokkos/mpiaijkok.kokkos.cxx:1138 > [ERROR] [1]PETSC ERROR: > [ERROR] [0]PETSC ERROR: > [ERROR] #3 MatProductSymbolic() at > /home/4pf/repos/petsc/src/mat/interface/matproduct.c:793 > [ERROR] #3 MatProductSymbolic() at > /home/4pf/repos/petsc/src/mat/interface/matproduct.c:793 > [ERROR] [1]PETSC ERROR: > [ERROR] [0]PETSC ERROR: > [ERROR] #4 MatProduct_Private() at > /home/4pf/repos/petsc/src/mat/interface/matrix.c:9820 > [ERROR] #4 MatProduct_Private() at > /home/4pf/repos/petsc/src/mat/interface/matrix.c:9820 > [ERROR] [0]PETSC ERROR: > [ERROR] [1]PETSC ERROR: > [ERROR] #5 MatMatMult() at > /home/4pf/repos/petsc/src/mat/interface/matrix.c:9897 > [ERROR] #5 MatMatMult() at > /home/4pf/repos/petsc/src/mat/interface/matrix.c:9897 > [ERROR] [0]PETSC ERROR: > [ERROR] [1]PETSC ERROR: > [ERROR] #6 PCGAMGOptProlongator_AGG() at > /home/4pf/repos/petsc/src/ksp/pc/impls/gamg/agg.c:769 > [ERROR] #6 PCGAMGOptProlongator_AGG() at > /home/4pf/repos/petsc/src/ksp/pc/impls/gamg/agg.c:769 > [ERROR] [0]PETSC ERROR: > [ERROR] [1]PETSC ERROR: > [ERROR] #7 PCSetUp_GAMG() at > /home/4pf/repos/petsc/src/ksp/pc/impls/gamg/gamg.c:639 > [ERROR] #7 PCSetUp_GAMG() at > 
/home/4pf/repos/petsc/src/ksp/pc/impls/gamg/gamg.c:639 > [ERROR] [1]PETSC ERROR: > [ERROR] [0]PETSC ERROR: > [ERROR] #8 PCSetUp() at > /home/4pf/repos/petsc/src/ksp/pc/interface/precon.c:994 > [ERROR] #8 PCSetUp() at > /home/4pf/repos/petsc/src/ksp/pc/interface/precon.c:994 > [ERROR] [1]PETSC ERROR: > [ERROR] [0]PETSC ERROR: > [ERROR] #9 KSPSetUp() at > /home/4pf/repos/petsc/src/ksp/ksp/interface/itfunc.c:406 > [ERROR] #9 KSPSetUp() at > /home/4pf/repos/petsc/src/ksp/ksp/interface/itfunc.c:406 > [ERROR] [1]PETSC ERROR: > [ERROR] [0]PETSC ERROR: > [ERROR] #10 KSPSolve_Private() at > /home/4pf/repos/petsc/src/ksp/ksp/interface/itfunc.c:825 > [ERROR] #10 KSPSolve_Private() at > /home/4pf/repos/petsc/src/ksp/ksp/interface/itfunc.c:825 > [ERROR] [0]PETSC ERROR: > [ERROR] [1]PETSC ERROR: > [ERROR] #11 KSPSolve() at > /home/4pf/repos/petsc/src/ksp/ksp/interface/itfunc.c:1071 > [ERROR] #11 KSPSolve() at > /home/4pf/repos/petsc/src/ksp/ksp/interface/itfunc.c:1071 > [ERROR] [1]PETSC ERROR: > [ERROR] [0]PETSC ERROR: > [ERROR] #12 PCApply_FieldSplit() at > /home/4pf/repos/petsc/src/ksp/pc/impls/fieldsplit/fieldsplit.c:1246 > [ERROR] #12 PCApply_FieldSplit() at > /home/4pf/repos/petsc/src/ksp/pc/impls/fieldsplit/fieldsplit.c:1246 > [ERROR] [1]PETSC ERROR: > [ERROR] [0]PETSC ERROR: > [ERROR] #13 PCApply() at > /home/4pf/repos/petsc/src/ksp/pc/interface/precon.c:441 > [ERROR] #13 PCApply() at > /home/4pf/repos/petsc/src/ksp/pc/interface/precon.c:441 > [ERROR] [1]PETSC ERROR: > [ERROR] [0]PETSC ERROR: > [ERROR] #14 KSP_PCApply() at > /home/4pf/repos/petsc/include/petsc/private/kspimpl.h:380 > [ERROR] #14 KSP_PCApply() at > /home/4pf/repos/petsc/include/petsc/private/kspimpl.h:380 > [ERROR] [1]PETSC ERROR: > [ERROR] [0]PETSC ERROR: > [ERROR] #15 KSPFGMRESCycle() at > /home/4pf/repos/petsc/src/ksp/ksp/impls/gmres/fgmres/fgmres.c:152 > [ERROR] #15 KSPFGMRESCycle() at > /home/4pf/repos/petsc/src/ksp/ksp/impls/gmres/fgmres/fgmres.c:152 > [ERROR] [1]PETSC ERROR: > [ERROR] [0]PETSC 
ERROR: > [ERROR] #16 KSPSolve_FGMRES() at > /home/4pf/repos/petsc/src/ksp/ksp/impls/gmres/fgmres/fgmres.c:273 > [ERROR] #16 KSPSolve_FGMRES() at > /home/4pf/repos/petsc/src/ksp/ksp/impls/gmres/fgmres/fgmres.c:273 > [ERROR] [1]PETSC ERROR: > [ERROR] [0]PETSC ERROR: > [ERROR] #17 KSPSolve_Private() at > /home/4pf/repos/petsc/src/ksp/ksp/interface/itfunc.c:899 > [ERROR] #17 KSPSolve_Private() at > /home/4pf/repos/petsc/src/ksp/ksp/interface/itfunc.c:899 > [ERROR] [0]PETSC ERROR: > [ERROR] [1]PETSC ERROR: > [ERROR] #18 KSPSolve() at > /home/4pf/repos/petsc/src/ksp/ksp/interface/itfunc.c:1071 > [ERROR] #18 KSPSolve() at > /home/4pf/repos/petsc/src/ksp/ksp/interface/itfunc.c:1071 > [ERROR] [0]PETSC ERROR: > [ERROR] [1]PETSC ERROR: > [ERROR] #19 SNESSolve_NEWTONLS() at > /home/4pf/repos/petsc/src/snes/impls/ls/ls.c:210 > [ERROR] #19 SNESSolve_NEWTONLS() at > /home/4pf/repos/petsc/src/snes/impls/ls/ls.c:210 > [ERROR] [1]PETSC ERROR: > [ERROR] [0]PETSC ERROR: > [ERROR] #20 SNESSolve() at > /home/4pf/repos/petsc/src/snes/interface/snes.c:4689 > [ERROR] #20 SNESSolve() at > /home/4pf/repos/petsc/src/snes/interface/snes.c:4689 > [ERROR] [1]PETSC ERROR: > [ERROR] [0]PETSC ERROR: > [ERROR] #21 TSStep_ARKIMEX() at > /home/4pf/repos/petsc/src/ts/impls/arkimex/arkimex.c:791 > [ERROR] #21 TSStep_ARKIMEX() at > /home/4pf/repos/petsc/src/ts/impls/arkimex/arkimex.c:791 > [ERROR] [1]PETSC ERROR: > [ERROR] [0]PETSC ERROR: > [ERROR] #22 TSStep() at /home/4pf/repos/petsc/src/ts/interface/ts.c:3445 > [ERROR] #22 TSStep() at /home/4pf/repos/petsc/src/ts/interface/ts.c:3445 > [ERROR] [1]PETSC ERROR: > [ERROR] [0]PETSC ERROR: > [ERROR] #23 TSSolve() at /home/4pf/repos/petsc/src/ts/interface/ts.c:3836 > [ERROR] #23 TSSolve() at /home/4pf/repos/petsc/src/ts/interface/ts.c:3836 > [ERROR] PetscSolver::solve: TSSolve failed. > [ERROR] PetscSolver::solve: TSSolve failed. > Aborting. > Aborting. 
> > > > Thanks for the help, > > > *Philip Fackler * > Research Software Engineer, Application Engineering Group > Advanced Computing Systems Research Section > Computer Science and Mathematics Division > *Oak Ridge National Laboratory* > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From facklerpw at ornl.gov Tue Nov 22 10:14:36 2022 From: facklerpw at ornl.gov (Fackler, Philip) Date: Tue, 22 Nov 2022 16:14:36 +0000 Subject: [petsc-users] [EXTERNAL] Re: Using multiple MPI ranks with COO interface crashes in some cases In-Reply-To: References: Message-ID: Yes, that one is. I haven't updated the tests. So just build the SystemTester? target or the xolotl? target. Philip Fackler Research Software Engineer, Application Engineering Group Advanced Computing Systems Research Section Computer Science and Mathematics Division Oak Ridge National Laboratory ________________________________ From: Junchao Zhang Sent: Monday, November 21, 2022 15:36 To: Fackler, Philip Cc: petsc-users at mcs.anl.gov ; Blondel, Sophie Subject: Re: [EXTERNAL] Re: [petsc-users] Using multiple MPI ranks with COO interface crashes in some cases On Mon, Nov 21, 2022 at 9:31 AM Fackler, Philip > wrote: Not sure why. I'm using the same compiler. But you can try constructing the object explicitly on that line: idPairs.push_back(core::RowColPair{i, i}); WIth your change, I continued but met another error: /home/jczhang/xolotl/test/core/diffusion/Diffusion2DHandlerTester.cpp(79): error: class "xolotl::core::diffusion::Diffusion2DHandler" has no member "initializeOFill" it seems all these problems are related to the branch feature-petsc-kokkos, instead of the compiler etc. When I switched to origin/stable, I could build xolotl. 
Philip Fackler Research Software Engineer, Application Engineering Group Advanced Computing Systems Research Section Computer Science and Mathematics Division Oak Ridge National Laboratory ________________________________ From: Junchao Zhang > Sent: Sunday, November 20, 2022 13:25 To: Fackler, Philip > Cc: petsc-users at mcs.anl.gov >; Blondel, Sophie > Subject: Re: [EXTERNAL] Re: [petsc-users] Using multiple MPI ranks with COO interface crashes in some cases On Tue, Nov 15, 2022 at 10:55 AM Fackler, Philip > wrote: I built petsc with: $ ./configure PETSC_DIR=$PWD PETSC_ARCH=arch-kokkos-serial-debug --with-cc=mpicc --with-cxx=mpicxx --with-fc=0 --with-debugging=0 --prefix=$HOME/build/petsc/debug/install --with-64-bit-indices --with-shared-libraries --COPTFLAGS=-O3 --CXXOPTFLAGS=-O3 --download-kokkos --download-kokkos-kernels $ make PETSC_DIR=$PWD PETSC_ARCH=arch-kokkos-serial-debug all $ make PETSC_DIR=$PWD PETSC_ARCH=arch-kokkos-serial-debug install Then I build xolotl in a separate build directory (after checking out the "feature-petsc-kokkos" branch) with: $ cmake -DCMAKE_BUILD_TYPE=Debug -DKokkos_DIR=$HOME/build/petsc/debug/install -DPETSC_DIR=$HOME/build/petsc/debug/install $ make -j4 SystemTester Hi, Philip, I tried multiple times and still failed at building xolotl. I installed boost-1.74 and HDF5, and used gcc-11.3. make -j4 SystemTester ... [ 9%] Building CXX object xolotl/core/CMakeFiles/xolotlCore.dir/src/diffusion/DiffusionHandler.cpp.o /home/jczhang/xolotl/xolotl/core/src/diffusion/DiffusionHandler.cpp(55): error: no instance of overloaded function "std::vector<_Tp, _Alloc>::push_back [with _Tp=xolotl::core::RowColPair, _Alloc=std::allocator]" matches the argument list argument types are: ({...}) object type is: std::vector> 1 error detected in the compilation of "/home/jczhang/xolotl/xolotl/core/src/diffusion/DiffusionHandler.cpp". 
Then, from the xolotl build directory, run (for example): $ mpirun -n 2 ./test/system/SystemTester -t System/NE_4 -- -v Note that this test case will use the parameter file '/benchmarks/params_system_NE_4.txt' which has the command-line arguments for petsc in its "petscArgs=..." line. If you look at '/test/system/SystemTester.cpp' all the system test cases follow the same naming convention with their corresponding parameter files under '/benchmarks'. The failure happens with the NE_4 case (which is 2D) and the PSI_3 case (which is 1D). Let me know if this is still unclear. Thanks, Philip Fackler Research Software Engineer, Application Engineering Group Advanced Computing Systems Research Section Computer Science and Mathematics Division Oak Ridge National Laboratory ________________________________ From: Junchao Zhang > Sent: Tuesday, November 15, 2022 00:16 To: Fackler, Philip > Cc: petsc-users at mcs.anl.gov >; Blondel, Sophie > Subject: [EXTERNAL] Re: [petsc-users] Using multiple MPI ranks with COO interface crashes in some cases Hi, Philip, Can you tell me instructions to build Xolotl to reproduce the error? --Junchao Zhang On Mon, Nov 14, 2022 at 12:24 PM Fackler, Philip via petsc-users > wrote: In Xolotl's "feature-petsc-kokkos" branch, I have moved our code to use the COO interface for preallocating and setting values in the Jacobian matrix. I have found that with some of our test cases, using more than one MPI rank results in a crash. Way down in the preconditioner code in petsc a Mat gets computed that has "null" for the "productsymbolic" member of its "ops". It's pretty far removed from where we compute the Jacobian entries, so I haven't been able (so far) to track it back to an error in my code. I'd appreciate some help with this from someone who is more familiar with the petsc guts so we can figure out what I'm doing wrong. (I'm assuming it's a bug in Xolotl.) 
Note that this is using the kokkos backend for Mat and Vec in petsc, but with a serial-only build of kokkos and kokkos-kernels. So, it's a CPU-only multiple MPI rank run.

Here's a paste of the error output showing the relevant parts of the call stack:

[ERROR] [0]PETSC ERROR:
[ERROR] --------------------- Error Message --------------------------------------------------------------
[ERROR] [1]PETSC ERROR:
[ERROR] --------------------- Error Message --------------------------------------------------------------
[ERROR] [1]PETSC ERROR:
[ERROR] [0]PETSC ERROR:
[ERROR] No support for this operation for this object type
[ERROR] [1]PETSC ERROR:
[ERROR] No support for this operation for this object type
[ERROR] [0]PETSC ERROR:
[ERROR] No method productsymbolic for Mat of type (null)
[ERROR] No method productsymbolic for Mat of type (null)
[ERROR] [0]PETSC ERROR:
[ERROR] [1]PETSC ERROR:
[ERROR] See https://petsc.org/release/faq/ for trouble shooting.
[ERROR] See https://petsc.org/release/faq/ for trouble shooting.
[ERROR] [0]PETSC ERROR:
[ERROR] [1]PETSC ERROR:
[ERROR] Petsc Development GIT revision: v3.18.1-115-gdca010e0e9a GIT Date: 2022-10-28 14:39:41 +0000
[ERROR] Petsc Development GIT revision: v3.18.1-115-gdca010e0e9a GIT Date: 2022-10-28 14:39:41 +0000
[ERROR] [1]PETSC ERROR:
[ERROR] [0]PETSC ERROR:
[ERROR] Unknown Name on a named PC0115427 by 4pf Mon Nov 14 13:22:01 2022
[ERROR] Unknown Name on a named PC0115427 by 4pf Mon Nov 14 13:22:01 2022
[ERROR] [1]PETSC ERROR:
[ERROR] [0]PETSC ERROR:
[ERROR] Configure options PETSC_DIR=/home/4pf/repos/petsc PETSC_ARCH=arch-kokkos-serial-debug --with-debugging=1 --with-cc=mpicc --with-cxx=mpicxx --with-fc=0 --with-cudac=0 --prefix=/home/4pf/build/petsc/serial-debug/install --with-64-bit-indices --with-shared-libraries --with-kokkos-dir=/home/4pf/build/kokkos/serial/install --with-kokkos-kernels-dir=/home/4pf/build/kokkos-kernels/serial/install
[ERROR] Configure options PETSC_DIR=/home/4pf/repos/petsc PETSC_ARCH=arch-kokkos-serial-debug --with-debugging=1 --with-cc=mpicc --with-cxx=mpicxx --with-fc=0 --with-cudac=0 --prefix=/home/4pf/build/petsc/serial-debug/install --with-64-bit-indices --with-shared-libraries --with-kokkos-dir=/home/4pf/build/kokkos/serial/install --with-kokkos-kernels-dir=/home/4pf/build/kokkos-kernels/serial/install
[ERROR] [1]PETSC ERROR:
[ERROR] [0]PETSC ERROR:
[ERROR] #1 MatProductSymbolic_MPIAIJKokkos_AB() at /home/4pf/repos/petsc/src/mat/impls/aij/mpi/kokkos/mpiaijkok.kokkos.cxx:918
[ERROR] #1 MatProductSymbolic_MPIAIJKokkos_AB() at /home/4pf/repos/petsc/src/mat/impls/aij/mpi/kokkos/mpiaijkok.kokkos.cxx:918
[ERROR] [1]PETSC ERROR:
[ERROR] [0]PETSC ERROR:
[ERROR] #2 MatProductSymbolic_MPIAIJKokkos() at /home/4pf/repos/petsc/src/mat/impls/aij/mpi/kokkos/mpiaijkok.kokkos.cxx:1138
[ERROR] #2 MatProductSymbolic_MPIAIJKokkos() at /home/4pf/repos/petsc/src/mat/impls/aij/mpi/kokkos/mpiaijkok.kokkos.cxx:1138
[ERROR] [1]PETSC ERROR:
[ERROR] [0]PETSC ERROR:
[ERROR] #3 MatProductSymbolic() at /home/4pf/repos/petsc/src/mat/interface/matproduct.c:793
[ERROR] #3 MatProductSymbolic() at /home/4pf/repos/petsc/src/mat/interface/matproduct.c:793
[ERROR] [1]PETSC ERROR:
[ERROR] [0]PETSC ERROR:
[ERROR] #4 MatProduct_Private() at /home/4pf/repos/petsc/src/mat/interface/matrix.c:9820
[ERROR] #4 MatProduct_Private() at /home/4pf/repos/petsc/src/mat/interface/matrix.c:9820
[ERROR] [0]PETSC ERROR:
[ERROR] [1]PETSC ERROR:
[ERROR] #5 MatMatMult() at /home/4pf/repos/petsc/src/mat/interface/matrix.c:9897
[ERROR] #5 MatMatMult() at /home/4pf/repos/petsc/src/mat/interface/matrix.c:9897
[ERROR] [0]PETSC ERROR:
[ERROR] [1]PETSC ERROR:
[ERROR] #6 PCGAMGOptProlongator_AGG() at /home/4pf/repos/petsc/src/ksp/pc/impls/gamg/agg.c:769
[ERROR] #6 PCGAMGOptProlongator_AGG() at /home/4pf/repos/petsc/src/ksp/pc/impls/gamg/agg.c:769
[ERROR] [0]PETSC ERROR:
[ERROR] [1]PETSC ERROR:
[ERROR] #7 PCSetUp_GAMG() at /home/4pf/repos/petsc/src/ksp/pc/impls/gamg/gamg.c:639
[ERROR] #7 PCSetUp_GAMG() at /home/4pf/repos/petsc/src/ksp/pc/impls/gamg/gamg.c:639
[ERROR] [1]PETSC ERROR:
[ERROR] [0]PETSC ERROR:
[ERROR] #8 PCSetUp() at /home/4pf/repos/petsc/src/ksp/pc/interface/precon.c:994
[ERROR] #8 PCSetUp() at /home/4pf/repos/petsc/src/ksp/pc/interface/precon.c:994
[ERROR] [1]PETSC ERROR:
[ERROR] [0]PETSC ERROR:
[ERROR] #9 KSPSetUp() at /home/4pf/repos/petsc/src/ksp/ksp/interface/itfunc.c:406
[ERROR] #9 KSPSetUp() at /home/4pf/repos/petsc/src/ksp/ksp/interface/itfunc.c:406
[ERROR] [1]PETSC ERROR:
[ERROR] [0]PETSC ERROR:
[ERROR] #10 KSPSolve_Private() at /home/4pf/repos/petsc/src/ksp/ksp/interface/itfunc.c:825
[ERROR] #10 KSPSolve_Private() at /home/4pf/repos/petsc/src/ksp/ksp/interface/itfunc.c:825
[ERROR] [0]PETSC ERROR:
[ERROR] [1]PETSC ERROR:
[ERROR] #11 KSPSolve() at /home/4pf/repos/petsc/src/ksp/ksp/interface/itfunc.c:1071
[ERROR] #11 KSPSolve() at /home/4pf/repos/petsc/src/ksp/ksp/interface/itfunc.c:1071
[ERROR] [1]PETSC ERROR:
[ERROR] [0]PETSC ERROR:
[ERROR] #12 PCApply_FieldSplit() at /home/4pf/repos/petsc/src/ksp/pc/impls/fieldsplit/fieldsplit.c:1246
[ERROR] #12 PCApply_FieldSplit() at /home/4pf/repos/petsc/src/ksp/pc/impls/fieldsplit/fieldsplit.c:1246
[ERROR] [1]PETSC ERROR:
[ERROR] [0]PETSC ERROR:
[ERROR] #13 PCApply() at /home/4pf/repos/petsc/src/ksp/pc/interface/precon.c:441
[ERROR] #13 PCApply() at /home/4pf/repos/petsc/src/ksp/pc/interface/precon.c:441
[ERROR] [1]PETSC ERROR:
[ERROR] [0]PETSC ERROR:
[ERROR] #14 KSP_PCApply() at /home/4pf/repos/petsc/include/petsc/private/kspimpl.h:380
[ERROR] #14 KSP_PCApply() at /home/4pf/repos/petsc/include/petsc/private/kspimpl.h:380
[ERROR] [1]PETSC ERROR:
[ERROR] [0]PETSC ERROR:
[ERROR] #15 KSPFGMRESCycle() at /home/4pf/repos/petsc/src/ksp/ksp/impls/gmres/fgmres/fgmres.c:152
[ERROR] #15 KSPFGMRESCycle() at /home/4pf/repos/petsc/src/ksp/ksp/impls/gmres/fgmres/fgmres.c:152
[ERROR] [1]PETSC ERROR:
[ERROR] [0]PETSC ERROR:
[ERROR] #16 KSPSolve_FGMRES() at /home/4pf/repos/petsc/src/ksp/ksp/impls/gmres/fgmres/fgmres.c:273
[ERROR] #16 KSPSolve_FGMRES() at /home/4pf/repos/petsc/src/ksp/ksp/impls/gmres/fgmres/fgmres.c:273
[ERROR] [1]PETSC ERROR:
[ERROR] [0]PETSC ERROR:
[ERROR] #17 KSPSolve_Private() at /home/4pf/repos/petsc/src/ksp/ksp/interface/itfunc.c:899
[ERROR] #17 KSPSolve_Private() at /home/4pf/repos/petsc/src/ksp/ksp/interface/itfunc.c:899
[ERROR] [0]PETSC ERROR:
[ERROR] [1]PETSC ERROR:
[ERROR] #18 KSPSolve() at /home/4pf/repos/petsc/src/ksp/ksp/interface/itfunc.c:1071
[ERROR] #18 KSPSolve() at /home/4pf/repos/petsc/src/ksp/ksp/interface/itfunc.c:1071
[ERROR] [0]PETSC ERROR:
[ERROR] [1]PETSC ERROR:
[ERROR] #19 SNESSolve_NEWTONLS() at /home/4pf/repos/petsc/src/snes/impls/ls/ls.c:210
[ERROR] #19 SNESSolve_NEWTONLS() at /home/4pf/repos/petsc/src/snes/impls/ls/ls.c:210
[ERROR] [1]PETSC ERROR:
[ERROR] [0]PETSC ERROR:
[ERROR] #20 SNESSolve() at /home/4pf/repos/petsc/src/snes/interface/snes.c:4689
[ERROR] #20 SNESSolve() at /home/4pf/repos/petsc/src/snes/interface/snes.c:4689
[ERROR] [1]PETSC ERROR:
[ERROR] [0]PETSC ERROR:
[ERROR] #21 TSStep_ARKIMEX() at /home/4pf/repos/petsc/src/ts/impls/arkimex/arkimex.c:791
[ERROR] #21 TSStep_ARKIMEX() at /home/4pf/repos/petsc/src/ts/impls/arkimex/arkimex.c:791
[ERROR] [1]PETSC ERROR:
[ERROR] [0]PETSC ERROR:
[ERROR] #22 TSStep() at /home/4pf/repos/petsc/src/ts/interface/ts.c:3445
[ERROR] #22 TSStep() at /home/4pf/repos/petsc/src/ts/interface/ts.c:3445
[ERROR] [1]PETSC ERROR:
[ERROR] [0]PETSC ERROR:
[ERROR] #23 TSSolve() at /home/4pf/repos/petsc/src/ts/interface/ts.c:3836
[ERROR] #23 TSSolve() at /home/4pf/repos/petsc/src/ts/interface/ts.c:3836
[ERROR] PetscSolver::solve: TSSolve failed.
[ERROR] PetscSolver::solve: TSSolve failed.
Aborting.
Aborting.

Thanks for the help,

Philip Fackler
Research Software Engineer, Application Engineering Group
Advanced Computing Systems Research Section
Computer Science and Mathematics Division
Oak Ridge National Laboratory

-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From junchao.zhang at gmail.com Tue Nov 22 11:02:01 2022
From: junchao.zhang at gmail.com (Junchao Zhang)
Date: Tue, 22 Nov 2022 11:02:01 -0600
Subject: [petsc-users] [EXTERNAL] Re: Using multiple MPI ranks with COO interface crashes in some cases
In-Reply-To:
References:
Message-ID:

On Tue, Nov 22, 2022 at 10:14 AM Fackler, Philip wrote:
> Yes, that one is. I haven't updated the tests. So just build the
> SystemTester target or the xolotl target.

OK, I see. I reproduced the petsc error and am looking into it. Thanks a lot.
-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From facklerpw at ornl.gov Tue Nov 22 11:56:39 2022
From: facklerpw at ornl.gov (Fackler, Philip)
Date: Tue, 22 Nov 2022 17:56:39 +0000
Subject: [petsc-users] [EXTERNAL] Re: Using multiple MPI ranks with COO interface crashes in some cases
In-Reply-To:
References:
Message-ID:

Great! Thank you!
Philip Fackler
Research Software Engineer, Application Engineering Group
Advanced Computing Systems Research Section
Computer Science and Mathematics Division
Oak Ridge National Laboratory
________________________________
From: Junchao Zhang
Sent: Tuesday, November 22, 2022 12:02
To: Fackler, Philip
Cc: petsc-users at mcs.anl.gov; Blondel, Sophie
Subject: Re: [EXTERNAL] Re: [petsc-users] Using multiple MPI ranks with COO interface crashes in some cases

On Tue, Nov 22, 2022 at 10:14 AM Fackler, Philip wrote:
> Yes, that one is. I haven't updated the tests. So just build the
> SystemTester target or the xolotl target.

OK, I see. I reproduced the petsc error and am looking into it. Thanks a lot.
[0]PETSC ERROR: #21 TSStep_ARKIMEX() at /home/4pf/repos/petsc/src/ts/impls/arkimex/arkimex.c:791
[0]PETSC ERROR: #22 TSStep() at /home/4pf/repos/petsc/src/ts/interface/ts.c:3445
[0]PETSC ERROR: #23 TSSolve() at /home/4pf/repos/petsc/src/ts/interface/ts.c:3836
[ERROR] PetscSolver::solve: TSSolve failed. Aborting.

Thanks for the help,

Philip Fackler Research Software Engineer, Application Engineering Group Advanced Computing Systems Research Section Computer Science and Mathematics Division Oak Ridge National Laboratory -------------- next part -------------- An HTML attachment was scrubbed... URL: From nicola.varini at gmail.com Thu Nov 24 10:19:57 2022 From: nicola.varini at gmail.com (nicola varini) Date: Thu, 24 Nov 2022 17:19:57 +0100 Subject: [petsc-users] comparing HYPRE on CPU vs GPU Message-ID: Dear all, I am comparing the HYPRE boomeramg preconditioner on CPU and GPU. It looks like the defaults are different, therefore I tried to specify matching options. I have a few observations/questions: 1) Why are the residuals at step 0 different? 2) If the options are the same, shouldn't the number of iterations be the same? 3) Why, on the GPU side, does the PCApply not seem to be done on the GPU? I noticed that if I change certain options, like -pc_hypre_boomeramg_relax_type_all symmetric SOR/Jacobi, the KSPSolve gets slower and runs on the CPU. However, how can I diagnose whether the computations are performed on the GPU or not? Thanks very much, Nicola -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed...
Name: log_cpu Type: application/octet-stream Size: 16429 bytes Desc: not available URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: log_gpu Type: application/octet-stream Size: 17438 bytes Desc: not available URL: From bsmith at petsc.dev Thu Nov 24 13:40:14 2022 From: bsmith at petsc.dev (Barry Smith) Date: Thu, 24 Nov 2022 14:40:14 -0500 Subject: [petsc-users] comparing HYPRE on CPU vs GPU In-Reply-To: References: Message-ID: Probably some of these questions are best asked to the hypre folks, but I can answer some of them. > On Nov 24, 2022, at 11:19 AM, nicola varini wrote: > > Dear all, I am comparing the HYPRE boomeramg preconditioner on CPU and GPU. > It looks like the defaults are different, therefore I tried to specify the match the options. > I have a few observations/questions: > 1) Why are the residuals at step 0 different? By default, PETSc uses left preconditioning; thus, the norms reported are for the preconditioned residual and will be different for different preconditioner options (and possibly slightly different due to numerical differences even when run with the same preconditioner options). You can use -ksp_pc_side right to have the true residual norm printed; this should be the same except for, hopefully small, numerical effects. > 2) If the options are the same the number of iterations should be the same? Numerical changes could affect these. Hopefully not by much, if all the algorithms are truly the same. > 3) Why on the GPU side the PCApply doesn't seem to be done on GPU? > I noticed that if I change certain options, like pc_hypre_boomeramg_relax_type_all symmetric SOR/Jacobi, the KSPSolve get slower and performs on CPU. Hypre probably does not have, by default, a GPU version of SOR, so it must be run on a CPU (PETSc also does not currently have a GPU SOR). > However, how can I diagnose if the computations are perfomed on GPU or not?
Run with -log_view -log_view_gpu_time and look at the final columns, which give some information on time and communication to the GPU. But since hypre does not log flops, the flop rates for computations done by hypre are meaningless. > > Thanks very much, > > Nicola > From yuanxi at advancesoft.jp Fri Nov 25 03:04:13 2022 From: yuanxi at advancesoft.jp (=?UTF-8?B?6KKB54WV?=) Date: Fri, 25 Nov 2022 18:04:13 +0900 Subject: [petsc-users] How to introduce new external package into PETSc? Message-ID: Dear PETSc developers, I have my own linear solver and am trying to hook it up to PETSc as an external solver. I would like to know the following details: 1. How do I get PETSc's configure to download the source code (from GitHub) and then compile and install it? 2. How do I let PETSc call the solver? I think it may be accomplished by calling PCFactorSetMatSolverPackage. Is this correct? Many thanks, Yuan -------------- next part -------------- An HTML attachment was scrubbed... URL: From narnoldm at umich.edu Fri Nov 25 05:31:51 2022 From: narnoldm at umich.edu (Nicholas Arnold-Medabalimi) Date: Fri, 25 Nov 2022 06:31:51 -0500 Subject: [petsc-users] Fortran DMLoad bindings Message-ID: Good Morning I am adding some PETSc mesh management into an existing Fortran solver. I'd like to use the DMLoad() function to read in a generated DMPlex (using DMView from a companion C code I've been using to debug). It appears there isn't an existing binding for that function (or I might be making a mistake). I noticed some outdated user posts about using the more general PetscObjectView to achieve the result, but I can't seem to replicate it (and it might be outdated information). Any assistance on this would be appreciated. Happy Thanksgiving & Sincerely Nicholas -- Nicholas Arnold-Medabalimi Ph.D. Candidate Computational Aeroscience Lab University of Michigan -------------- next part -------------- An HTML attachment was scrubbed...
URL: From pierre at joliv.et Fri Nov 25 07:42:06 2022 From: pierre at joliv.et (Pierre Jolivet) Date: Fri, 25 Nov 2022 14:42:06 +0100 Subject: [petsc-users] How to introduce new external package into PETSc? In-Reply-To: References: Message-ID: <00A34A21-4E12-4ED9-B29B-7164221663E5@joliv.et> > On 25 Nov 2022, at 10:04 AM, 袁煕 wrote: > > Dear PETSc developers, > > I have my own linear solver and am trying to put it with PETSc as an external solver. I wish to know following details > > 1. How to let PETSc configure to download (from github) the source code and compile, install it afterwards. You can borrow code from one of the many files in config/BuildSystem/config/packages/*.py. Put your file in there and then use --download-name-of-your-file when configuring PETSc. > 2. How to let PETSc call the solver. I think it may be accomplished by calling PCFactorSetMatSolverPackage. Is this correct? You can borrow code from one of src/mat/impls/aij/seq/{mkl_pardiso,umfpack,superlu} or src/mat/impls/aij/mpi/mumps, to name a few. Thanks, Pierre > Many thanks, > > Yuan From mfadams at lbl.gov Fri Nov 25 11:38:08 2022 From: mfadams at lbl.gov (Mark Adams) Date: Fri, 25 Nov 2022 12:38:08 -0500 Subject: [petsc-users] comparing HYPRE on CPU vs GPU In-Reply-To: References: Message-ID: Also, the coarsening algorithms may be different on the CPU and GPU runs (hypre does coarsening on the GPU). Even if the algorithms are nominally the same, there could be differences in, for example, the vertex ordering in greedy coarsening algorithms that result in slightly different "C" point selection on the device, which will change the preconditioner slightly. On Thu, Nov 24, 2022 at 2:40 PM Barry Smith wrote: > > Probably some of these questions are best asked to the hypre folks. But > I can answer some > > > On Nov 24, 2022, at 11:19 AM, nicola varini > wrote: > > > > Dear all, I am comparing the HYPRE boomeramg preconditioner on CPU and > GPU.
> > It looks like the defaults are different, therefore I tried to specify > the match the options. > > I have a few observations/questions: > > 1) Why are the residuals at step 0 different? > > By default, PETSc uses left preconditioning; thus, the norms reported > are for the preconditioner residual and will be different for different > preconditioner options (and possibly slightly different for numerical > differences even if run with the same preconditioner options. You can use > -ksp_pc_side right to have the true residual norm printed, this should be > the same except for possibly, hopefully, small numerics effects. > > > 2) If the options are the same the number of iterations should be the > same? > > Numerical changes could affect these. Hopefully, not much if all the > algorithms are truly the same. > > > 3) Why on the GPU side the PCApply doesn't seem to be done on GPU? > > I noticed that if I change certain options, like > pc_hypre_boomeramg_relax_type_all symmetric SOR/Jacobi, the KSPSolve get > slower and performs on CPU. > > Hypre probably does not have, by default, a GPU version of SOR, so it > must be run on a CPU (Petsc also does not currently have a GPU SOR). > > > However, how can I diagnose if the computations are perfomed on GPU or > not? > > Run with -log_view -log_view_gpu_time and look at the final columns that > give some information on time and communication to the GPU. But since hypre > does not log flops, the flop rates for computations done by hypre are > meaningless. > > > > > > Thanks very much, > > > > Nicola > > > > -------------- next part -------------- An HTML attachment was scrubbed... 
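To make Barry's answer to question 1 concrete: with left preconditioning, the monitor reports ||M^-1 (b - A x)||, which depends on the preconditioner M, while the true residual ||b - A x|| does not. The following pure-Python sketch illustrates the arithmetic only; the 2x2 system, the iterate x, and the Jacobi-style diagonal preconditioner are all made-up assumptions, and this is not PETSc code.

```python
import math

# Hypothetical small system A x = b and some approximate iterate x.
A = [[4.0, 1.0],
     [1.0, 3.0]]
b = [1.0, 2.0]
x = [0.1, 0.6]  # an iterate, not the exact solution

# True residual r = b - A x (what -ksp_pc_side right effectively monitors).
r = [b[i] - sum(A[i][j] * x[j] for j in range(2)) for i in range(2)]
true_norm = math.sqrt(sum(ri * ri for ri in r))

# Left-preconditioned residual M^-1 r with a Jacobi (diagonal) M:
# this kind of norm is what a left-preconditioned monitor reports.
Minv_r = [r[i] / A[i][i] for i in range(2)]
prec_norm = math.sqrt(sum(ri * ri for ri in Minv_r))

print(true_norm, prec_norm)  # two different numbers: the monitored norm depends on M
```

The two printed norms differ even though they describe the same iterate, which is why changing preconditioner options changes the step-0 residual that the monitor shows.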
URL: From mfadams at lbl.gov Fri Nov 25 11:42:18 2022 From: mfadams at lbl.gov (Mark Adams) Date: Fri, 25 Nov 2022 12:42:18 -0500 Subject: [petsc-users] Fortran DMLoad bindings In-Reply-To: References: Message-ID: It looks like it is available with an example here: https://petsc.org/main/src/dm/impls/plex/tutorials/ex3f90.F90.html Try 'cd src/dm/impls/plex/tutorials; make ex3f90' Mark On Fri, Nov 25, 2022 at 6:32 AM Nicholas Arnold-Medabalimi < narnoldm at umich.edu> wrote: > Good Morning > > I am adding some Petsc for mesh management into an existing Fortran > Solver. I'd like to use the DMLoad() function to read in a generated DMPlex > (using DMView from a companion C code I've been using to debug). It appears > there isn't an existing binding for that function (or I might be making a > mistake.) > > I noticed some outdated user posts about using the more general > PetscObjectView to achieve the result, but I can't seem to replicate it > (and it might be outdated information). > > Any assistance on this would be appreciated. > > Happy Thanksgiving & Sincerely > Nicholas > > -- > Nicholas Arnold-Medabalimi > > Ph.D. Candidate > Computational Aeroscience Lab > University of Michigan > -------------- next part -------------- An HTML attachment was scrubbed... URL: From pierre at joliv.et Fri Nov 25 12:05:25 2022 From: pierre at joliv.et (Pierre Jolivet) Date: Fri, 25 Nov 2022 19:05:25 +0100 Subject: [petsc-users] Fortran DMLoad bindings In-Reply-To: References: Message-ID: That example has no DMLoad(), and the interface is indeed not automatically generated https://gitlab.com/petsc/petsc/-/blob/main/src/dm/interface/dm.c#L4075 I'm not sure why, though.
Thanks, Pierre > On 25 Nov 2022, at 6:42 PM, Mark Adams wrote: > > It looks like it is available with an example here: > > https://petsc.org/main/src/dm/impls/plex/tutorials/ex3f90.F90.html > > Try 'cd src/dm/impls/plex/tutorials; make ex3f90' > > Mark > > > > > On Fri, Nov 25, 2022 at 6:32 AM Nicholas Arnold-Medabalimi > wrote: >> Good Morning >> >> I am adding some Petsc for mesh management into an existing Fortran Solver. I'd like to use the DMLoad() function to read in a generated DMPlex (using DMView from a companion C code I've been using to debug). It appears there isn't an existing binding for that function (or I might be making a mistake.) >> >> I noticed some outdated user posts about using the more general PetscObjectView to achieve the result, but I can't seem to replicate it (and it might be outdated information). >> >> Any assistance on this would be appreciated. >> >> Happy Thanksgiving & Sincerely >> Nicholas >> >> -- >> Nicholas Arnold-Medabalimi >> >> Ph.D. Candidate >> Computational Aeroscience Lab >> University of Michigan -------------- next part -------------- An HTML attachment was scrubbed... URL: From bsmith at petsc.dev Fri Nov 25 12:34:30 2022 From: bsmith at petsc.dev (Barry Smith) Date: Fri, 25 Nov 2022 13:34:30 -0500 Subject: [petsc-users] Fortran DMLoad bindings In-Reply-To: References: Message-ID: Nicholas I will add the Fortran stubs for these two functions shortly in the git branch barry/2022-11-25/add-dm-view-load-fortran/release Barry > On Nov 25, 2022, at 1:05 PM, Pierre Jolivet wrote: > > That example has no DMLoad(), and the interface is indeed not automatically generated https://gitlab.com/petsc/petsc/-/blob/main/src/dm/interface/dm.c#L4075 > I'm not sure why, though.
> > Thanks, > Pierre > >> On 25 Nov 2022, at 6:42 PM, Mark Adams wrote: >> >> It looks like it is available with an example here: >> >> https://petsc.org/main/src/dm/impls/plex/tutorials/ex3f90.F90.html >> >> Try 'cd src/dm/impls/plex/tutorials; make ex3f90' >> >> Mark >> >> >> >> >> On Fri, Nov 25, 2022 at 6:32 AM Nicholas Arnold-Medabalimi > wrote: >>> Good Morning >>> >>> I am adding some Petsc for mesh management into an existing Fortran Solver. I'd like to use the DMLoad() function to read in a generated DMPlex (using DMView from a companion C code I've been using to debug). It appears there isn't an existing binding for that function (or I might be making a mistake.) >>> >>> I noticed some outdated user posts about using the more general PetscObjectView to achieve the result, but I can't seem to replicate it (and it might be outdated information). >>> >>> Any assistance on this would be appreciated. >>> >>> Happy Thanksgiving & Sincerely >>> Nicholas >>> >>> -- >>> Nicholas Arnold-Medabalimi >>> >>> Ph.D. Candidate >>> Computational Aeroscience Lab >>> University of Michigan > -------------- next part -------------- An HTML attachment was scrubbed... URL: From bsmith at petsc.dev Fri Nov 25 12:44:09 2022 From: bsmith at petsc.dev (Barry Smith) Date: Fri, 25 Nov 2022 13:44:09 -0500 Subject: [petsc-users] Fortran DMLoad bindings In-Reply-To: References: Message-ID: <4ADC8393-924C-4B2D-B18A-51693837D5F4@petsc.dev> The branch is now available for you to use DMLoad() from Fortran > On Nov 25, 2022, at 1:34 PM, Barry Smith wrote: > > > Nicholas > > I will add the Fortran stubs for these two functions shortly in the git branch barry/2022-11-25/add-dm-view-load-fortran/release > > Barry > >> On Nov 25, 2022, at 1:05 PM, Pierre Jolivet wrote: >> >> That example has no DMLoad(), and the interface is indeed not automatically generated https://gitlab.com/petsc/petsc/-/blob/main/src/dm/interface/dm.c#L4075 >> I'm not sure why, though.
>> >> Thanks, >> Pierre >> >>> On 25 Nov 2022, at 6:42 PM, Mark Adams wrote: >>> >>> It looks like it is available with an example here: >>> >>> https://petsc.org/main/src/dm/impls/plex/tutorials/ex3f90.F90.html >>> >>> Try 'cd src/dm/impls/plex/tutorials; make ex3f90' >>> >>> Mark >>> >>> >>> >>> >>> On Fri, Nov 25, 2022 at 6:32 AM Nicholas Arnold-Medabalimi > wrote: >>>> Good Morning >>>> >>>> I am adding some Petsc for mesh management into an existing Fortran Solver. I'd like to use the DMLoad() function to read in a generated DMPlex (using DMView from a companion C code I've been using to debug). It appears there isn't an existing binding for that function (or I might be making a mistake.) >>>> >>>> I noticed some outdated user posts about using the more general PetscObjectView to achieve the result, but I can't seem to replicate it (and it might be outdated information). >>>> >>>> Any assistance on this would be appreciated. >>>> >>>> Happy Thanksgiving & Sincerely >>>> Nicholas >>>> >>>> -- >>>> Nicholas Arnold-Medabalimi >>>> >>>> Ph.D. Candidate >>>> Computational Aeroscience Lab >>>> University of Michigan >> > -------------- next part -------------- An HTML attachment was scrubbed... URL: From narnoldm at umich.edu Fri Nov 25 13:05:27 2022 From: narnoldm at umich.edu (Nicholas Arnold-Medabalimi) Date: Fri, 25 Nov 2022 14:05:27 -0500 Subject: [petsc-users] Fortran DMLoad bindings In-Reply-To: <4ADC8393-924C-4B2D-B18A-51693837D5F4@petsc.dev> References: <4ADC8393-924C-4B2D-B18A-51693837D5F4@petsc.dev> Message-ID: Hi Thank you so much for the response and assistance. I appreciate it. 
Sincerely Nicholas On Fri, Nov 25, 2022 at 1:44 PM Barry Smith wrote: > > The branch is now available for you to use DMLoad() from Fortran > > > On Nov 25, 2022, at 1:34 PM, Barry Smith wrote: > > > Nicholas > > I will add the Fortran stubs for these two functions shortly in the git > branch *barry/2022-11-25/add-dm-view-load-fortran/release* > > Barry > > > On Nov 25, 2022, at 1:05 PM, Pierre Jolivet wrote: > > That example has no DMLoad(), and the interface is indeed not > automatically generated > https://gitlab.com/petsc/petsc/-/blob/main/src/dm/interface/dm.c#L4075 > I'm not sure why, though. > > Thanks, > Pierre > > On 25 Nov 2022, at 6:42 PM, Mark Adams wrote: > > It looks like it is available with an example here: > > https://petsc.org/main/src/dm/impls/plex/tutorials/ex3f90.F90.html > > Try 'cd src/dm/impls/plex/tutorials; make ex3f90' > > Mark > > > > > On Fri, Nov 25, 2022 at 6:32 AM Nicholas Arnold-Medabalimi < > narnoldm at umich.edu> wrote: > >> Good Morning >> >> I am adding some Petsc for mesh management into an existing Fortran >> Solver. I'd like to use the DMLoad() function to read in a generated DMPlex >> (using DMView from a companion C code I've been using to debug). It appears >> there isn't an existing binding for that function (or I might be making a >> mistake.) >> >> I noticed some outdated user posts about using the more general >> PetscObjectView to achieve the result, but I can't seem to replicate it >> (and it might be outdated information). >> >> Any assistance on this would be appreciated. >> >> Happy Thanksgiving & Sincerely >> Nicholas >> >> -- >> Nicholas Arnold-Medabalimi >> >> Ph.D. Candidate >> Computational Aeroscience Lab >> University of Michigan >> > > > > -- Nicholas Arnold-Medabalimi Ph.D. Candidate Computational Aeroscience Lab University of Michigan -------------- next part -------------- An HTML attachment was scrubbed...
URL: From zhan2355 at purdue.edu Fri Nov 25 23:49:41 2022 From: zhan2355 at purdue.edu (Sijie Zhang) Date: Sat, 26 Nov 2022 05:49:41 +0000 Subject: [petsc-users] make error on cluster Message-ID: Hi, When I try to install petsc, I got the following error. Can you help me with that? Thanks. Sijie ================================================================================================ Running check examples to verify correct installation Using PETSC_DIR=/home/tools/a/zhan2355/petsc-3.18.1 and PETSC_ARCH=arch-linux-c-debug *******************Error detected during compile or link!******************* See https://petsc.org/release/faq/ /home/tools/a/zhan2355/petsc-3.18.1/src/snes/tutorials ex19 ********************************************************************************* mpicc -fPIC -Wall -Wwrite-strings -Wno-unknown-pragmas -Wno-lto-type-mismatch -fstack-protector -fvisibility=hidden -g3 -O0 -I/home/tools/a/zhan2355/petsc-3.18.1/include -I/home/tools/a/zhan2355/petsc-3.18.1/arch-linux-c-debug/include -export-dynamic ex19.c -Wl,-rpath,/home/tools/a/zhan2355/petsc-3.18.1/arch-linux-c-debug/lib -L/home/tools/a/zhan2355/petsc-3.18.1/arch-linux-c-debug/lib -Wl,-rpath,/usr/lib64/openmpi3/lib -L/usr/lib64/openmpi3/lib -Wl,-rpath,/package/gcc/8.3.0/lib/gcc/x86_64-pc-linux-gnu/8.3.0 -L/package/gcc/8.3.0/lib/gcc/x86_64-pc-linux-gnu/8.3.0 -Wl,-rpath,/package/gcc/8.3.0/lib/gcc -L/package/gcc/8.3.0/lib/gcc -Wl,-rpath,/package/gcc/8.3.0/lib64 -L/package/gcc/8.3.0/lib64 -Wl,-rpath,/package/gcc/8.3.0/lib -L/package/gcc/8.3.0/lib -lpetsc -llapack -lblas -lm -lX11 -lstdc++ -ldl -lmpi_usempi -lmpi_mpifh -lmpi -lgfortran -lm -lgfortran -lm -lgcc_s -lquadmath -lpthread -lquadmath -lstdc++ -ldl -o ex19 /usr/bin/ld: warning: libgfortran.so.3, needed by /lib/../lib64/liblapack.so, may conflict with libgfortran.so.5 /usr/bin/ld: warning: libgfortran.so.3, needed by /lib/../lib64/liblapack.so, may conflict with libgfortran.so.5 Possible error running C/C++ src/snes/tutorials/ex19 
with 1 MPI process See https://petsc.org/release/faq/ [1669441660.650232] [nut:40436:0] sys.c:618 UCX ERROR shmget(size=2097152 flags=0xfb0) for mm_recv_desc failed: Operation not permitted, please check shared memory limits by 'ipcs -l' lid velocity = 0.0016, prandtl # = 1., grashof # = 1. Number of SNES iterations = 2 Possible error running C/C++ src/snes/tutorials/ex19 with 2 MPI processes See https://petsc.org/release/faq/ [1669441661.708168] [nut:40467:0] sys.c:618 UCX ERROR shmget(size=2097152 flags=0xfb0) for mm_recv_desc failed: Operation not permitted, please check shared memory limits by 'ipcs -l' [1669441661.713950] [nut:40466:0] sys.c:618 UCX ERROR shmget(size=2097152 flags=0xfb0) for mm_recv_desc failed: Operation not permitted, please check shared memory limits by 'ipcs -l' lid velocity = 0.0016, prandtl # = 1., grashof # = 1. Number of SNES iterations = 2 *******************Error detected during compile or link!******************* See https://petsc.org/release/faq/ /home/tools/a/zhan2355/petsc-3.18.1/src/snes/tutorials ex5f ********************************************************* mpif90 -fPIC -Wall -ffree-line-length-none -ffree-line-length-0 -Wno-lto-type-mismatch -Wno-unused-dummy-argument -g -O0 -I/home/tools/a/zhan2355/petsc-3.18.1/include -I/home/tools/a/zhan2355/petsc-3.18.1/arch-linux-c-debug/include ex5f.F90 -Wl,-rpath,/home/tools/a/zhan2355/petsc-3.18.1/arch-linux-c-debug/lib -L/home/tools/a/zhan2355/petsc-3.18.1/arch-linux-c-debug/lib -Wl,-rpath,/usr/lib64/openmpi3/lib -L/usr/lib64/openmpi3/lib -Wl,-rpath,/package/gcc/8.3.0/lib/gcc/x86_64-pc-linux-gnu/8.3.0 -L/package/gcc/8.3.0/lib/gcc/x86_64-pc-linux-gnu/8.3.0 -Wl,-rpath,/package/gcc/8.3.0/lib/gcc -L/package/gcc/8.3.0/lib/gcc -Wl,-rpath,/package/gcc/8.3.0/lib64 -L/package/gcc/8.3.0/lib64 -Wl,-rpath,/package/gcc/8.3.0/lib -L/package/gcc/8.3.0/lib -lpetsc -llapack -lblas -lm -lX11 -lstdc++ -ldl -lmpi_usempi -lmpi_mpifh -lmpi -lgfortran -lm -lgfortran -lm -lgcc_s -lquadmath -lpthread 
-lquadmath -lstdc++ -ldl -o ex5f /usr/bin/ld: warning: libgfortran.so.3, needed by /lib/../lib64/liblapack.so, may conflict with libgfortran.so.5 /usr/bin/ld: warning: libgfortran.so.3, needed by /lib/../lib64/liblapack.so, may conflict with libgfortran.so.5 /usr/bin/ld: warning: libgfortran.so.3, needed by /lib/../lib64/liblapack.so, may conflict with libgfortran.so.5 /usr/bin/ld: warning: libgfortran.so.3, needed by /lib/../lib64/liblapack.so, may conflict with libgfortran.so.5 Possible error running Fortran example src/snes/tutorials/ex5f with 1 MPI process See https://petsc.org/release/faq/ [1669441665.341407] [nut:40669:0] sys.c:618 UCX ERROR shmget(size=2097152 flags=0xfb0) for mm_recv_desc failed: Operation not permitted, please check shared memory limits by 'ipcs -l' Number of SNES iterations = 3 Completed test examples Error while running make check make[1]: *** [check] Error 1 make: *** [check] Error 2 -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: configure.log.bkp Type: application/octet-stream Size: 53719 bytes Desc: configure.log.bkp URL: From knepley at gmail.com Sat Nov 26 07:13:39 2022 From: knepley at gmail.com (Matthew Knepley) Date: Sat, 26 Nov 2022 07:13:39 -0600 Subject: [petsc-users] make error on cluster In-Reply-To: References: Message-ID: On Fri, Nov 25, 2022 at 11:49 PM Sijie Zhang wrote: > Hi, > > > > When I try to install petsc, I got the following error. Can you help me > with that? > This is an issue with your MPI, which I am guessing was installed by the admin. I would send them this output. However, I did find a suggestion on StackExchange: https://stackoverflow.com/questions/45910849/shmget-operation-not-permitted Thanks, Matt > > > Thanks. 
> > > > Sijie > > > > > ================================================================================================ > > > > Running check examples to verify correct installation > > Using PETSC_DIR=/home/tools/a/zhan2355/petsc-3.18.1 and > PETSC_ARCH=arch-linux-c-debug > > *******************Error detected during compile or > link!******************* > > See https://petsc.org/release/faq/ > > /home/tools/a/zhan2355/petsc-3.18.1/src/snes/tutorials ex19 > > > ********************************************************************************* > > mpicc -fPIC -Wall -Wwrite-strings -Wno-unknown-pragmas > -Wno-lto-type-mismatch -fstack-protector -fvisibility=hidden -g3 -O0 > -I/home/tools/a/zhan2355/petsc-3.18.1/include > -I/home/tools/a/zhan2355/petsc-3.18.1/arch-linux-c-debug/include > -export-dynamic ex19.c > -Wl,-rpath,/home/tools/a/zhan2355/petsc-3.18.1/arch-linux-c-debug/lib > -L/home/tools/a/zhan2355/petsc-3.18.1/arch-linux-c-debug/lib > -Wl,-rpath,/usr/lib64/openmpi3/lib -L/usr/lib64/openmpi3/lib > -Wl,-rpath,/package/gcc/8.3.0/lib/gcc/x86_64-pc-linux-gnu/8.3.0 > -L/package/gcc/8.3.0/lib/gcc/x86_64-pc-linux-gnu/8.3.0 > -Wl,-rpath,/package/gcc/8.3.0/lib/gcc -L/package/gcc/8.3.0/lib/gcc > -Wl,-rpath,/package/gcc/8.3.0/lib64 -L/package/gcc/8.3.0/lib64 > -Wl,-rpath,/package/gcc/8.3.0/lib -L/package/gcc/8.3.0/lib -lpetsc -llapack > -lblas -lm -lX11 -lstdc++ -ldl -lmpi_usempi -lmpi_mpifh -lmpi -lgfortran > -lm -lgfortran -lm -lgcc_s -lquadmath -lpthread -lquadmath -lstdc++ -ldl -o > ex19 > > /usr/bin/ld: warning: libgfortran.so.3, needed by > /lib/../lib64/liblapack.so, may conflict with libgfortran.so.5 > > /usr/bin/ld: warning: libgfortran.so.3, needed by > /lib/../lib64/liblapack.so, may conflict with libgfortran.so.5 > > Possible error running C/C++ src/snes/tutorials/ex19 with 1 MPI process > > See https://petsc.org/release/faq/ > > [1669441660.650232] [nut:40436:0] sys.c:618 UCX ERROR > shmget(size=2097152 flags=0xfb0) for mm_recv_desc failed: Operation not > 
permitted, please check shared memory limits by 'ipcs -l' > > lid velocity = 0.0016, prandtl # = 1., grashof # = 1. > > Number of SNES iterations = 2 > > Possible error running C/C++ src/snes/tutorials/ex19 with 2 MPI processes > > See https://petsc.org/release/faq/ > > [1669441661.708168] [nut:40467:0] sys.c:618 UCX ERROR > shmget(size=2097152 flags=0xfb0) for mm_recv_desc failed: Operation not > permitted, please check shared memory limits by 'ipcs -l' > > [1669441661.713950] [nut:40466:0] sys.c:618 UCX ERROR > shmget(size=2097152 flags=0xfb0) for mm_recv_desc failed: Operation not > permitted, please check shared memory limits by 'ipcs -l' > > lid velocity = 0.0016, prandtl # = 1., grashof # = 1. > > Number of SNES iterations = 2 > > *******************Error detected during compile or > link!******************* > > See https://petsc.org/release/faq/ > > /home/tools/a/zhan2355/petsc-3.18.1/src/snes/tutorials ex5f > > ********************************************************* > > mpif90 -fPIC -Wall -ffree-line-length-none -ffree-line-length-0 > -Wno-lto-type-mismatch -Wno-unused-dummy-argument -g -O0 > -I/home/tools/a/zhan2355/petsc-3.18.1/include > -I/home/tools/a/zhan2355/petsc-3.18.1/arch-linux-c-debug/include > ex5f.F90 > -Wl,-rpath,/home/tools/a/zhan2355/petsc-3.18.1/arch-linux-c-debug/lib > -L/home/tools/a/zhan2355/petsc-3.18.1/arch-linux-c-debug/lib > -Wl,-rpath,/usr/lib64/openmpi3/lib -L/usr/lib64/openmpi3/lib > -Wl,-rpath,/package/gcc/8.3.0/lib/gcc/x86_64-pc-linux-gnu/8.3.0 > -L/package/gcc/8.3.0/lib/gcc/x86_64-pc-linux-gnu/8.3.0 > -Wl,-rpath,/package/gcc/8.3.0/lib/gcc -L/package/gcc/8.3.0/lib/gcc > -Wl,-rpath,/package/gcc/8.3.0/lib64 -L/package/gcc/8.3.0/lib64 > -Wl,-rpath,/package/gcc/8.3.0/lib -L/package/gcc/8.3.0/lib -lpetsc -llapack > -lblas -lm -lX11 -lstdc++ -ldl -lmpi_usempi -lmpi_mpifh -lmpi -lgfortran > -lm -lgfortran -lm -lgcc_s -lquadmath -lpthread -lquadmath -lstdc++ -ldl -o > ex5f > > /usr/bin/ld: warning: libgfortran.so.3, needed by > 
/lib/../lib64/liblapack.so, may conflict with libgfortran.so.5 > > /usr/bin/ld: warning: libgfortran.so.3, needed by > /lib/../lib64/liblapack.so, may conflict with libgfortran.so.5 > > /usr/bin/ld: warning: libgfortran.so.3, needed by > /lib/../lib64/liblapack.so, may conflict with libgfortran.so.5 > > /usr/bin/ld: warning: libgfortran.so.3, needed by > /lib/../lib64/liblapack.so, may conflict with libgfortran.so.5 > > Possible error running Fortran example src/snes/tutorials/ex5f with 1 MPI > process > > See https://petsc.org/release/faq/ > > [1669441665.341407] [nut:40669:0] sys.c:618 UCX ERROR > shmget(size=2097152 flags=0xfb0) for mm_recv_desc failed: Operation not > permitted, please check shared memory limits by 'ipcs -l' > > Number of SNES iterations = 3 > > Completed test examples > > Error while running make check > > make[1]: *** [check] Error 1 > > make: *** [check] Error 2 > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Sun Nov 27 16:43:00 2022 From: knepley at gmail.com (Matthew Knepley) Date: Sun, 27 Nov 2022 17:43:00 -0500 Subject: [petsc-users] Different solution while running in parallel In-Reply-To: References: Message-ID: On Thu, Nov 17, 2022 at 2:34 PM Karthikeyan Chockalingam - STFC UKRI < karthikeyan.chockalingam at stfc.ac.uk> wrote: > I have tried, and am not sure how to set "-pc_svd_monitor" since I have not > yet set up the command line option. (I am currently using PETSc within > another framework). > Sorry this email got lost for me. 
You just call PetscOptionsSetValue(NULL, "-pc_svd_monitor", "1"); > > KSP ksp; > > PC pc; > > KSPCreate(PETSC_COMM_WORLD, &ksp); > > KSPSetOperators(ksp, A, A); > > > > ierr = KSPSetType(ksp,KSPPREONLY);CHKERRQ(ierr); > > ierr = KSPGetPC(ksp,&pc);CHKERRQ(ierr); > > ierr = PCSetType(pc,PCSVD);CHKERRQ(ierr); > > ierr = PetscOptionsSetValue(*NULL*,"-pc_svd_monitor", *NULL*); CHKERRQ(ierr); > ?-- is this right? > > ierr = PCFactorSetMatSolverType(pc,MATSOLVERMUMPS);CHKERRQ(ierr); > > ierr = KSPView(ksp, PETSC_VIEWER_STDOUT_SELF);CHKERRQ(ierr); > > > > KSPSolve(ksp, b, x); > > > > ierr = KSPMonitorTrueResidual(ksp,its,norm,vf);CHKERRQ(ierr); > > ierr = KSPMonitorSingularValue(ksp, its, norm, vf);CHKERRQ(ierr); > > > > > > Yes, now using PCSVD both the serial and parallel version produce the same > result. > > > > (i) What does this imply? > Your matrix is singular > (ii) Would I be able to solve using CG preconditioned > using Hypre as I scale the problem? > Only if Hypre uses SVD on the coarse grid. I am not sure they can do that. > (iii) I have not built PETSc with SLEPc ? can I still use > PCSVD? > Yes. > (iv) Can I set ksp type, pc type, ksp monitor etc using > PETScOptionsSetValue instead of code? In that case how would the above code > translate to? That will be very helpful. > Yes, you can use SetValue(). I do not understand the rest of the question. Thanks, Matt > Many thanks. > > > > Best, > > Karthik. 
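For the record, a minimal standalone sketch (not code from this thread, and untested against your framework) of driving the whole solver setup through the options database might look like the following. The 4x4 matrix is the one posted earlier in this thread (diagonal 32/3, off-diagonal -4/3, b = 4), and KSPSetFromOptions() is the call that actually reads the values in, so the options must be set before it:

```c
/* Sketch: configure the solver via PetscOptionsSetValue() instead of
 * explicit KSPSetType()/PCSetType() calls, then let KSPSetFromOptions()
 * pick the values up. Assumes a working PETSc (>= 3.18) build. */
#include <petscksp.h>

int main(int argc, char **argv)
{
  Mat         A;
  Vec         b, x;
  KSP         ksp;
  PetscInt    i, j, n = 4;
  PetscScalar v;

  PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));

  /* Equivalent of -ksp_type preonly -pc_type svd -pc_svd_monitor
     -ksp_view on the command line; must happen before KSPSetFromOptions() */
  PetscCall(PetscOptionsSetValue(NULL, "-ksp_type", "preonly"));
  PetscCall(PetscOptionsSetValue(NULL, "-pc_type", "svd"));
  PetscCall(PetscOptionsSetValue(NULL, "-pc_svd_monitor", NULL));
  PetscCall(PetscOptionsSetValue(NULL, "-ksp_view", NULL));

  /* The 4x4 system from the thread: A = (32/3) I - (4/3) (J - I) */
  PetscCall(MatCreateSeqAIJ(PETSC_COMM_SELF, n, n, n, NULL, &A));
  for (i = 0; i < n; i++) {
    for (j = 0; j < n; j++) {
      v = (i == j) ? 32.0 / 3.0 : -4.0 / 3.0;
      PetscCall(MatSetValue(A, i, j, v, INSERT_VALUES));
    }
  }
  PetscCall(MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY));
  PetscCall(MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY));

  PetscCall(MatCreateVecs(A, &b, &x));
  PetscCall(VecSet(b, 4.0));

  PetscCall(KSPCreate(PETSC_COMM_SELF, &ksp));
  PetscCall(KSPSetOperators(ksp, A, A));
  PetscCall(KSPSetFromOptions(ksp)); /* reads the options set above */
  PetscCall(KSPSolve(ksp, b, x));    /* x should come out as all 0.6 */

  PetscCall(VecDestroy(&b));
  PetscCall(VecDestroy(&x));
  PetscCall(MatDestroy(&A));
  PetscCall(KSPDestroy(&ksp));
  PetscCall(PetscFinalize());
  return 0;
}
```

The same options can of course also be passed on the command line, in which case the PetscOptionsSetValue() calls are unnecessary.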
> > > > *From: *Matthew Knepley > *Date: *Thursday, 17 November 2022 at 17:48 > *To: *Chockalingam, Karthikeyan (STFC,DL,HC) < > karthikeyan.chockalingam at stfc.ac.uk> > *Cc: *Zhang, Hong , petsc-users at mcs.anl.gov < > petsc-users at mcs.anl.gov> > *Subject: *Re: [petsc-users] Different solution while running in parallel > > On Thu, Nov 17, 2022 at 11:13 AM Karthikeyan Chockalingam - STFC UKRI < > karthikeyan.chockalingam at stfc.ac.uk> wrote: > > Hi Matt, > > > > I tested two sizes manually for the Poisson problem with homogeneous > Dirichlet boundary conditions (on all nodes on the boundary) and they both > produced the right result when run serially using PCLU. > > > > 1. 2 elements x 2 elements (total nodes 9 but 1 dof) > > A = 10.6667 b = 4 x = 0.375 > > 2. 3 elements x 3 elements (total nodes 16 but 4 dof) > > A = 10.6667 -1.33333 -1.33333 -1.33333 > > -1.33333 10.6667 -1.33333 -1.33333 > > -1.33333 -1.33333 10.6667 -1.33333 > > -1.33333 -1.33333 -1.33333 10.6667 > > > > b = {4 4 4 4}^T > > x = (0.6 0.6 0.6 0.6) > > > > Since it is solvable, I am not sure how the system can be singular. > > > > I have attached the runs for case (2) run on one and two cores. The parallel > run produces a zero vector for x. > > > > I used MatZeroRowsColumns to set the Dirichlet boundary conditions by > zeroing the entries in the matrix corresponding to the boundary nodes. > > > > > > Please please please run the original thing with the options I suggested: > > > > -pc_type svd -pc_svd_monitor > > > > This will print out all the singular values of the matrix and solve it > using SVD. > > > > Thanks, > > > > Matt > > > > Best, > > Karthik. 
> > > > > > *From: *Matthew Knepley > *Date: *Thursday, 17 November 2022 at 15:16 > *To: *Chockalingam, Karthikeyan (STFC,DL,HC) < > karthikeyan.chockalingam at stfc.ac.uk> > *Cc: *Zhang, Hong , petsc-users at mcs.anl.gov < > petsc-users at mcs.anl.gov> > *Subject: *Re: [petsc-users] Different solution while running in parallel > > Using options instead of code will make your life much easier. > > > > Two things are wrong here: > > > > 1) Your solver is doing no iterates because the initial residual is very > small, 5.493080158227e-15. The LU does not matter. > > In order to check the condition number of your system, run with > -pc_type svd -pc_svd_monitor > > > > 2) Your parallel run also does no iterates > > > > 0 KSP none resid norm 6.951601853367e-310 true resid norm > 1.058300524426e+01 ||r(i)||/||b|| 8.819171036882e-01 > > > > but the true residual is not small. That means that your system is > singular, but you have given a consistent RHS. > > > > Thanks, > > > > Matt > > > > On Thu, Nov 17, 2022 at 9:37 AM Karthikeyan Chockalingam - STFC UKRI < > karthikeyan.chockalingam at stfc.ac.uk> wrote: > > Hi Matt and Hong, > > > > Thank you for your response. > > I made the following changes to get the desired output: > > > > PetscReal norm; /* norm of solution error */ > > PetscInt its; > > KSPConvergedReason reason; > > PetscViewerAndFormat *vf; > > PetscViewerAndFormatCreate(PETSC_VIEWER_STDOUT_WORLD, > PETSC_VIEWER_DEFAULT, &vf); > > > > ierr = KSPView(ksp, PETSC_VIEWER_STDOUT_SELF);CHKERRQ(ierr); > > > > KSPSolve(ksp, b, x); > > > > ierr = KSPMonitorTrueResidual(ksp,its,norm,vf);CHKERRQ(ierr); > > ierr = KSPMonitorSingularValue(ksp, its, norm, vf);CHKERRQ(ierr); > > > > I have attached the outputs from both the runs. As before, I am also > printing A, b, and x. > > > > I wonder if it is a memory issue related to the MPI library employed. I am > currently using OpenMPI - should I instead use MPICH? > > > > Kind regards, > > Karthik. 
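If monitors do have to be attached in code rather than via options, the supported pattern (a sketch assuming a PETSc 3.18 build, not code from this thread) is to register them with KSPMonitorSet() before KSPSolve(); calling KSPMonitorTrueResidual() manually after the solve, as in the snippet above, does not report anything meaningful because its iteration-number and norm arguments are then uninitialized:

```c
/* Sketch: register the true-residual monitor with KSPMonitorSet()
 * so PETSc invokes it at every iteration of KSPSolve(). The casts
 * follow the pattern in the PETSc manual pages; the context is a
 * PetscViewerAndFormat, destroyed automatically when the KSP is. */
#include <petscksp.h>

static PetscErrorCode AttachTrueResidualMonitor(KSP ksp)
{
  PetscViewerAndFormat *vf;

  PetscFunctionBeginUser;
  PetscCall(PetscViewerAndFormatCreate(PETSC_VIEWER_STDOUT_WORLD, PETSC_VIEWER_DEFAULT, &vf));
  PetscCall(KSPMonitorSet(ksp,
                          (PetscErrorCode (*)(KSP, PetscInt, PetscReal, void *))KSPMonitorTrueResidual,
                          vf,
                          (PetscErrorCode (*)(void **))PetscViewerAndFormatDestroy));
  PetscFunctionReturn(0);
}
```

Call AttachTrueResidualMonitor(ksp) anywhere before KSPSolve(); the simpler route, as noted above, is just -ksp_monitor_true_residual, set either on the command line or with PetscOptionsSetValue() before KSPSetFromOptions().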
> > > > *From: *Matthew Knepley > *Date: *Thursday, 17 November 2022 at 12:19 > *To: *Zhang, Hong > *Cc: *petsc-users at mcs.anl.gov , Chockalingam, > Karthikeyan (STFC,DL,HC) > *Subject: *Re: [petsc-users] Different solution while running in parallel > > On Wed, Nov 16, 2022 at 9:07 PM Zhang, Hong via petsc-users < > petsc-users at mcs.anl.gov> wrote: > > Karthik, > > Can you find out the condition number of your matrix? > > > > Also, run using > > > > -ksp_view -ksp_monitor_true_residual -ksp_converged_reason > > > > and send the two outputs. > > > > Thanks, > > > > Matt > > > > Hong > > > ------------------------------ > > *From:* petsc-users on behalf of > Karthikeyan Chockalingam - STFC UKRI via petsc-users < > petsc-users at mcs.anl.gov> > *Sent:* Wednesday, November 16, 2022 6:04 PM > *To:* petsc-users at mcs.anl.gov > *Subject:* [petsc-users] Different solution while running in parallel > > > > Hello, > > > > I tried to solve a (FE discretized) Poisson equation using PCLU. For > some reason I am getting different solutions while running the problem on > one and two cores. I have attached the output file (out.txt) from both the > runs. I am printing A, b and x from both the runs - while A and b are the > same, the solutions seem to be different. > > > > I am not sure what I am doing wrong. > > > > Below is my matrix, vector, and solve setup. 
> > > > > > Mat A; > > Vec b, x; > > > > ierr = MatCreate(PETSC_COMM_WORLD, &A); CHKERRQ(ierr); > > ierr = MatSetType(A, MATMPIAIJ); CHKERRQ(ierr); > > ierr = MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, N, N); CHKERRQ > (ierr); > > ierr = MatMPIAIJSetPreallocation(A,d_nz, *NULL*, o_nz, *NULL*); > CHKERRQ(ierr); > > ierr = MatSetOption(A,MAT_SYMMETRIC,PETSC_TRUE); CHKERRQ(ierr); > > ierr = MatCreateVecs(A, &b, &x); CHKERRQ(ierr); > > > > KSP ksp; > > PC pc; > > KSPCreate(PETSC_COMM_WORLD, &ksp); > > KSPSetOperators(ksp, A, A); > > ierr = KSPSetType(ksp,KSPPREONLY);CHKERRQ(ierr); > > ierr = KSPGetPC(ksp,&pc);CHKERRQ(ierr); > > ierr = PCSetType(pc,PCLU);CHKERRQ(ierr); > > ierr = PCFactorSetMatSolverType(pc,MATSOLVERMUMPS);CHKERRQ(ierr); > > KSPSolve(ksp, b, x); > > > > Thank you for your help. > > > > Karhik. > > > > This email and any attachments are intended solely for the use of the > named recipients. If you are not the intended recipient you must not use, > disclose, copy or distribute this email or any of its attachments and > should notify the sender immediately and delete this email from your > system. UK Research and Innovation (UKRI) has taken every reasonable > precaution to minimise risk of this email or any attachments containing > viruses or malware but the recipient should carry out its own virus and > malware checks before opening the attachments. UKRI does not accept any > liability for any losses or damages which the recipient may sustain due to > presence of any viruses. > > > > > -- > > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > > > > https://www.cse.buffalo.edu/~knepley/ > > > > > > -- > > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. 
> -- Norbert Wiener > > > > https://www.cse.buffalo.edu/~knepley/ > > > > > > -- > > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > > > > https://www.cse.buffalo.edu/~knepley/ > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From narnoldm at umich.edu Sun Nov 27 21:21:43 2022 From: narnoldm at umich.edu (Nicholas Arnold-Medabalimi) Date: Sun, 27 Nov 2022 22:21:43 -0500 Subject: [petsc-users] Petsc Section in DMPlex Message-ID: Hi Petsc Users I have a question about properly using PetscSection to assign state variables to a DM. I have an existing DMPlex mesh distributed on 2 processors. My goal is to have state variables set to the cell centers. I then want to call DMPlexDistribute, which I hope will balance the mesh elements and hopefully transport the state variables to the hosting processors as the cells are distributed to a different processor count or simply just redistributing after doing mesh adaption. Looking at the DMPlex User guide, I should be able to achieve this with a single field section using SetDof and assigning the DOF to the points corresponding to cells. 
call DMPlexGetHeightStratum(dm,0,c0,c1,ierr) call DMPlexGetChart(dm,p0,p1,ierr) call PetscSectionCreate(PETSC_COMM_WORLD,section,ierr) call PetscSectionSetNumFields(section,1,ierr) call PetscSectionSetChart(section,p0,p1,ierr) do i = c0, (c1-1) call PetscSectionSetDof(section,i,nvar,ierr) end do call PetscSectionSetup(section,ierr) call DMSetLocalSection(dm,section,ierr) From here, it looks like I can access and set the state vars using call DMGetGlobalVector(dmplex,state,ierr) call DMGetGlobalSection(dmplex,section,ierr) call VecGetArrayF90(state,stateVec,ierr) do i = c0, (c1-1) call PetscSectionGetOffset(section,i,offset,ierr) stateVec(offset:(offset+nvar))=state_i(:) !simplified assignment end do call VecRestoreArrayF90(state,stateVec,ierr) call DMRestoreGlobalVector(dmplex,state,ierr) To my understanding, I should be using a global vector since this is a pure assignment operation and I don't need the ghost cells. But the behavior I am seeing isn't exactly what I'd expect. To be honest, I'm somewhat unclear on a few things: 1) Should I be using nvar fields with 1 DOF each or 1 field with nvar DOFs, and what is the distinction between the two methods? 2) Adding a print statement after the offset assignment I get (on rank 0 of 2) cell 1 offset 0 cell 2 offset 18 cell 3 offset 36 which is expected and works but on rank 1 I get cell 1 offset 9000 cell 2 offset 9018 cell 3 offset 9036 which isn't exactly what I would expect. Shouldn't the offsets reset at 0 for the next rank? 3) Does calling DMPlexDistribute also distribute the section data associated with the DOFs? Based on the description of DMPlexDistribute, it looks like it should. I'd appreciate any insight into the specifics of this usage. I expect I have a misconception on the local vs global section. Thank you. Sincerely Nicholas -- Nicholas Arnold-Medabalimi Ph.D. Candidate Computational Aeroscience Lab University of Michigan -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From yuanxi at advancesoft.jp Sun Nov 27 22:29:40 2022 From: yuanxi at advancesoft.jp (Yuan Xi) Date: Mon, 28 Nov 2022 13:29:40 +0900 Subject: [petsc-users] How to introduce new external package into PETSc? In-Reply-To: <00A34A21-4E12-4ED9-B29B-7164221663E5@joliv.et> References: <00A34A21-4E12-4ED9-B29B-7164221663E5@joliv.et> Message-ID: Thanks a lot! I have tried it, and PETSc now downloads my source code successfully. On Fri, 25 Nov 2022 at 22:42 Pierre Jolivet wrote: > > > On 25 Nov 2022, at 10:04 AM, Yuan Xi wrote: > > > > Dear PETSc developers, > > > > I have my own linear solver and am trying to put it with PETSc as an > external solver. I wish to know the following details: > > > > 1. How to let PETSc's configure download the source code (from GitHub) > and compile and install it afterwards. > > You can borrow code from one of the many files in > config/BuildSystem/config/packages/*.py > Put your file in there and then use --download-name-of-your-file when > configuring PETSc. > > > 2. How to let PETSc call the solver. I think it may be accomplished by > calling PCFactorSetMatSolverPackage. Is this correct? > > You can borrow code from one of > src/mat/impls/aij/seq/{mkl_pardiso,umfpack,superlu} or > src/mat/impls/aij/mpi/mumps to name a few. > > Thanks, > Pierre > > > Many thanks, > > > > Yuan > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Mon Nov 28 05:18:36 2022 From: knepley at gmail.com (Matthew Knepley) Date: Mon, 28 Nov 2022 06:18:36 -0500 Subject: [petsc-users] Petsc Section in DMPlex In-Reply-To: References: Message-ID: On Sun, Nov 27, 2022 at 10:22 PM Nicholas Arnold-Medabalimi < narnoldm at umich.edu> wrote: > Hi Petsc Users > > I have a question about properly using PetscSection to assign state > variables to a DM. I have an existing DMPlex mesh distributed on 2 > processors. My goal is to have state variables set to the cell centers. 
I > then want to call DMPlexDistribute, which I hope will balance the mesh > elements and hopefully transport the state variables to the hosting > processors as the cells are distributed to a different processor count or > simply just redistributing after doing mesh adaption. > > Looking at the DMPlex User guide, I should be able to achieve this with a > single field section using SetDof and assigning the DOF to the points > corresponding to cells. > Note that if you want several different fields, you can clone the DM first for this field call DMClone(dm,dmState,ierr) and use dmState in your calls below. > > call DMPlexGetHeightStratum(dm,0,c0,c1,ierr) > call DMPlexGetChart(dm,p0,p1,ierr) > call PetscSectionCreate(PETSC_COMM_WORLD,section,ierr) > call PetscSectionSetNumFields(section,1,ierr) call > PetscSectionSetChart(section,p0,p1,ierr) > do i = c0, (c1-1) > call PetscSectionSetDof(section,i,nvar,ierr) > end do > call PetscSectionSetup(section,ierr) > call DMSetLocalSection(dm,section,ierr) > In the loop, I would add a call to call PetscSectionSetFieldDof(section,i,0,nvar,ierr) This also puts in the field breakdown. It is not essential, but nicer. > From here, it looks like I can access and set the state vars using > > call DMGetGlobalVector(dmplex,state,ierr) > call DMGetGlobalSection(dmplex,section,ierr) > call VecGetArrayF90(state,stateVec,ierr) > do i = c0, (c1-1) > call PetscSectionGetOffset(section,i,offset,ierr) > stateVec(offset:(offset+nvar))=state_i(:) !simplified assignment > end do > call VecRestoreArrayF90(state,stateVec,ierr) > call DMRestoreGlobalVector(dmplex,state,ierr) > > To my understanding, I should be using Global vector since this is a pure > assignment operation and I don't need the ghost cells. > Yes. But the behavior I am seeing isn't exactly what I'd expect. 
> > To be honest, I'm somewhat unclear on a few things > > 1) Should be using nvar fields with 1 DOF each or 1 field with nvar > DOFs or what the distinction between the two methods are? > We have two divisions in a Section. A field can have a number of components. This is intended to model a vector or tensor field. Then a Section can have a number of fields, such as velocity and pressure for a Stokes problem. The division is mainly to help the user, so I would use the most natural one. > 2) Adding a print statement after the offset assignment I get (on rank 0 > of 2) > cell 1 offset 0 > cell 2 offset 18 > cell 3 offset 36 > which is expected and works but on rank 1 I get > cell 1 offset 9000 > cell 2 offset 9018 > cell 3 offset 9036 > > which isn't exactly what I would expect. Shouldn't the offsets reset at 0 > for the next rank? > The local and global sections hold different information. This is the source of the confusion. The local section does describe a local vector, and thus includes overlap or "ghost" dofs. The global section describes a global vector. However, it is intended to deliver global indices, and thus the offsets give back global indices. When you use VecGetArray*() you are getting out the local array, and thus you have to subtract the first index on this process. You can get that from VecGetOwnershipRange(v, &rstart, &rEnd); This is the same whether you are using DMDA or DMPlex or any other DM. > 3) Does calling DMPlexDistribute also distribute the section data > associated with the DOF, based on the description in DMPlexDistribute it > looks like it should? > No. By default, DMPlexDistribute() only distributes coordinate data. 
If you want to distribute your field, it would look something like this: DMPlexDistribute(dm, 0, &sfDist, &dmDist); VecCreate(comm, &stateDist); VecSetDM(stateDist, dmDist); PetscSectionCreate(comm, &sectionDist); DMSetLocalSection(dmDist, sectionDist); DMPlexDistributeField(dmDist, sfDist, section, state, sectionDist, stateDist); We do this in src/dm/impls/plex/tests/ex36.c Thanks, Matt I'd appreciate any insight into the specifics of this usage. I expect I > have a misconception on the local vs global section. Thank you. > > Sincerely > Nicholas > > -- > Nicholas Arnold-Medabalimi > > Ph.D. Candidate > Computational Aeroscience Lab > University of Michigan > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From mfadams at lbl.gov Mon Nov 28 07:18:30 2022 From: mfadams at lbl.gov (Mark Adams) Date: Mon, 28 Nov 2022 08:18:30 -0500 Subject: [petsc-users] Different solution while running in parallel In-Reply-To: References: Message-ID: Maybe I am missing something but I don't see MUMPS in the ksp_view. Just LU. The built-in LU is not parallel and so with > 1 MPI process you get one iteration of block Jacobi, so these results look fine to me, at least up to this point. Mark On Thu, Nov 17, 2022 at 9:37 AM Karthikeyan Chockalingam - STFC UKRI via petsc-users wrote: > Hi Matt and Hong, > > Thank you for your response. 
> > I made the following changes, to get the desired output > > > > PetscReal norm; /* norm of solution error */ > > PetscInt its; > > KSPConvergedReason reason; > > PetscViewerAndFormat *vf; > > PetscViewerAndFormatCreate(PETSC_VIEWER_STDOUT_WORLD, > PETSC_VIEWER_DEFAULT, &vf); > > > > ierr = KSPView(ksp, PETSC_VIEWER_STDOUT_SELF);CHKERRQ(ierr); > > > > KSPSolve(ksp, b, x); > > > > ierr = KSPMonitorTrueResidual(ksp,its,norm,vf);CHKERRQ(ierr); > > ierr = KSPMonitorSingularValue(ksp, its, norm, vf);CHKERRQ(ierr); > > > > I have attached the outputs from both the runs. As before, I am also > printing A, b, and x. > > > > I wonder if it is a memory issue related to mpi library employed. I am > currently using openmpi ? should I instead use mpich? > > > > Kind regards, > > Karthik. > > > > *From: *Matthew Knepley > *Date: *Thursday, 17 November 2022 at 12:19 > *To: *Zhang, Hong > *Cc: *petsc-users at mcs.anl.gov , Chockalingam, > Karthikeyan (STFC,DL,HC) > *Subject: *Re: [petsc-users] Different solution while running in parallel > > On Wed, Nov 16, 2022 at 9:07 PM Zhang, Hong via petsc-users < > petsc-users at mcs.anl.gov> wrote: > > Karhik, > > Can you find out the condition number of your matrix? > > > > Also, run using > > > > -ksp_view -ksp_monitor_true_residual -ksp_converged_reason > > > > and send the two outputs. > > > > Thanks, > > > > Matt > > > > Hong > > > ------------------------------ > > *From:* petsc-users on behalf of > Karthikeyan Chockalingam - STFC UKRI via petsc-users < > petsc-users at mcs.anl.gov> > *Sent:* Wednesday, November 16, 2022 6:04 PM > *To:* petsc-users at mcs.anl.gov > *Subject:* [petsc-users] Different solution while running in parallel > > > > Hello, > > > > I tried to solve a (FE discretized) Poisson equation using PCLU. For > some reason I am getting different solutions while running the problem on > one and two cores. I have attached the output file (out.txt) from both the > runs. I am printing A, b and x from both the runs ? 
while A and b are the > same but the solution seems is different. > > > > I am not sure what I doing wrong. > > > > Below is my matrix, vector, and solve setup. > > > > > > Mat A; > > Vec b, x; > > > > ierr = MatCreate(PETSC_COMM_WORLD, &A); CHKERRQ(ierr); > > ierr = MatSetType(A, MATMPIAIJ); CHKERRQ(ierr); > > ierr = MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, N, N); CHKERRQ > (ierr); > > ierr = MatMPIAIJSetPreallocation(A,d_nz, *NULL*, o_nz, *NULL*); > CHKERRQ(ierr); > > ierr = MatSetOption(A,MAT_SYMMETRIC,PETSC_TRUE); CHKERRQ(ierr); > > ierr = MatCreateVecs(A, &b, &x); CHKERRQ(ierr); > > > > KSP ksp; > > PC pc; > > KSPCreate(PETSC_COMM_WORLD, &ksp); > > KSPSetOperators(ksp, A, A); > > ierr = KSPSetType(ksp,KSPPREONLY);CHKERRQ(ierr); > > ierr = KSPGetPC(ksp,&pc);CHKERRQ(ierr); > > ierr = PCSetType(pc,PCLU);CHKERRQ(ierr); > > ierr = PCFactorSetMatSolverType(pc,MATSOLVERMUMPS);CHKERRQ(ierr); > > KSPSolve(ksp, b, x); > > > > Thank you for your help. > > > > Karhik. > > > > This email and any attachments are intended solely for the use of the > named recipients. If you are not the intended recipient you must not use, > disclose, copy or distribute this email or any of its attachments and > should notify the sender immediately and delete this email from your > system. UK Research and Innovation (UKRI) has taken every reasonable > precaution to minimise risk of this email or any attachments containing > viruses or malware but the recipient should carry out its own virus and > malware checks before opening the attachments. UKRI does not accept any > liability for any losses or damages which the recipient may sustain due to > presence of any viruses. > > > > > -- > > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > > > > https://www.cse.buffalo.edu/~knepley/ > > -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From mfadams at lbl.gov Mon Nov 28 07:24:40 2022 From: mfadams at lbl.gov (Mark Adams) Date: Mon, 28 Nov 2022 08:24:40 -0500 Subject: [petsc-users] Different solution while running in parallel In-Reply-To: References: Message-ID: This is odd: 0 KSP none resid norm 6.951602688343e-310 true resid norm 8.000000000000e+00 ||r(i)||/||b|| 1.000000000000e+00 0 KSP Residual norm 6.951602688343e-310 Maybe there is a bug in the "KSP none resid norm". Try KSPCG Mark On Thu, Nov 17, 2022 at 11:13 AM Karthikeyan Chockalingam - STFC UKRI via petsc-users wrote: > Hi Matt, > > > > I tested two sizes manually for the Poisson problem with homogenous > Dirichlet boundary conditions (on all nodes on the boundary) and they both > produced the right result when run serially using PCLU > > > > 1. 2 elements x 2 elements (total nodes 9 but 1 dof) > > A = 10.6667 b = 4 x = 0.375 > > 1. 3 elements x 3 elements (total nodes 16 but 4 dof) > > A = 10.6667 -1.33333 -1.33333 -1.33333 > > -1.33333 10.6667 -1.33333 -1.33333 > > -1.33333 -1.33333 10.6667 -1.33333 > > -1.33333 -1.33333 -1.33333 10.6667 > > > > b = {4 4 4 4}^T > > x = (0.6 0.6 0.6 0.6) > > > > Since, it is solvable not sure if the system can be singular? > > > > I have attached the runs for case (2) run on one and two cores. Parallel > run produces zero vector for x. > > > > I used MatZeroRowsColumns to set the Dirichlet boundary conditions by > zeroing the entries in the matrix corresponding to the boundary nodes. > > > > Best, > > Karthik. > > > > > > > > *From: *Matthew Knepley > *Date: *Thursday, 17 November 2022 at 15:16 > *To: *Chockalingam, Karthikeyan (STFC,DL,HC) < > karthikeyan.chockalingam at stfc.ac.uk> > *Cc: *Zhang, Hong , petsc-users at mcs.anl.gov < > petsc-users at mcs.anl.gov> > *Subject: *Re: [petsc-users] Different solution while running in parallel > > Using options instead of code will make your life much easier. 
> > > > Two thing are wrong here: > > > > 1) Your solver is doing no iterates because the initial residual is very > small, 5.493080158227e-15. The LU does not matter. > > In order to check the condition number of your system, run with > -pc_type svd -pc_svd_monitor > > > > 2) Your parallel run also does no iterates > > > > 0 KSP none resid norm 6.951601853367e-310 true resid norm > 1.058300524426e+01 ||r(i)||/||b|| 8.819171036882e-01 > > > > but the true residual is not small. That means that your system is > singular, but you have given a consistent RHS. > > > > Thanks, > > > > Matt > > > > On Thu, Nov 17, 2022 at 9:37 AM Karthikeyan Chockalingam - STFC UKRI < > karthikeyan.chockalingam at stfc.ac.uk> wrote: > > Hi Matt and Hong, > > > > Thank you for your response. > > I made the following changes, to get the desired output > > > > PetscReal norm; /* norm of solution error */ > > PetscInt its; > > KSPConvergedReason reason; > > PetscViewerAndFormat *vf; > > PetscViewerAndFormatCreate(PETSC_VIEWER_STDOUT_WORLD, > PETSC_VIEWER_DEFAULT, &vf); > > > > ierr = KSPView(ksp, PETSC_VIEWER_STDOUT_SELF);CHKERRQ(ierr); > > > > KSPSolve(ksp, b, x); > > > > ierr = KSPMonitorTrueResidual(ksp,its,norm,vf);CHKERRQ(ierr); > > ierr = KSPMonitorSingularValue(ksp, its, norm, vf);CHKERRQ(ierr); > > > > I have attached the outputs from both the runs. As before, I am also > printing A, b, and x. > > > > I wonder if it is a memory issue related to mpi library employed. I am > currently using openmpi ? should I instead use mpich? > > > > Kind regards, > > Karthik. > > > > *From: *Matthew Knepley > *Date: *Thursday, 17 November 2022 at 12:19 > *To: *Zhang, Hong > *Cc: *petsc-users at mcs.anl.gov , Chockalingam, > Karthikeyan (STFC,DL,HC) > *Subject: *Re: [petsc-users] Different solution while running in parallel > > On Wed, Nov 16, 2022 at 9:07 PM Zhang, Hong via petsc-users < > petsc-users at mcs.anl.gov> wrote: > > Karhik, > > Can you find out the condition number of your matrix? 
> > > > Also, run using > > > > -ksp_view -ksp_monitor_true_residual -ksp_converged_reason > > > > and send the two outputs. > > > > Thanks, > > > > Matt > > > > Hong > > > ------------------------------ > > *From:* petsc-users on behalf of > Karthikeyan Chockalingam - STFC UKRI via petsc-users < > petsc-users at mcs.anl.gov> > *Sent:* Wednesday, November 16, 2022 6:04 PM > *To:* petsc-users at mcs.anl.gov > *Subject:* [petsc-users] Different solution while running in parallel > > > > Hello, > > > > I tried to solve a (FE discretized) Poisson equation using PCLU. For > some reason I am getting different solutions while running the problem on > one and two cores. I have attached the output file (out.txt) from both the > runs. I am printing A, b and x from both the runs ? while A and b are the > same but the solution seems is different. > > > > I am not sure what I doing wrong. > > > > Below is my matrix, vector, and solve setup. > > > > > > Mat A; > > Vec b, x; > > > > ierr = MatCreate(PETSC_COMM_WORLD, &A); CHKERRQ(ierr); > > ierr = MatSetType(A, MATMPIAIJ); CHKERRQ(ierr); > > ierr = MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, N, N); CHKERRQ > (ierr); > > ierr = MatMPIAIJSetPreallocation(A,d_nz, *NULL*, o_nz, *NULL*); > CHKERRQ(ierr); > > ierr = MatSetOption(A,MAT_SYMMETRIC,PETSC_TRUE); CHKERRQ(ierr); > > ierr = MatCreateVecs(A, &b, &x); CHKERRQ(ierr); > > > > KSP ksp; > > PC pc; > > KSPCreate(PETSC_COMM_WORLD, &ksp); > > KSPSetOperators(ksp, A, A); > > ierr = KSPSetType(ksp,KSPPREONLY);CHKERRQ(ierr); > > ierr = KSPGetPC(ksp,&pc);CHKERRQ(ierr); > > ierr = PCSetType(pc,PCLU);CHKERRQ(ierr); > > ierr = PCFactorSetMatSolverType(pc,MATSOLVERMUMPS);CHKERRQ(ierr); > > KSPSolve(ksp, b, x); > > > > Thank you for your help. > > > > Karhik. > > > > This email and any attachments are intended solely for the use of the > named recipients. 
If you are not the intended recipient you must not use, > disclose, copy or distribute this email or any of its attachments and > should notify the sender immediately and delete this email from your > system. UK Research and Innovation (UKRI) has taken every reasonable > precaution to minimise risk of this email or any attachments containing > viruses or malware but the recipient should carry out its own virus and > malware checks before opening the attachments. UKRI does not accept any > liability for any losses or damages which the recipient may sustain due to > presence of any viruses. > > > > > -- > > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > > > > https://www.cse.buffalo.edu/~knepley/ > > > > > > -- > > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > > > > https://www.cse.buffalo.edu/~knepley/ > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From karthikeyan.chockalingam at stfc.ac.uk Mon Nov 28 07:26:03 2022 From: karthikeyan.chockalingam at stfc.ac.uk (Karthikeyan Chockalingam - STFC UKRI) Date: Mon, 28 Nov 2022 13:26:03 +0000 Subject: [petsc-users] Different solution while running in parallel In-Reply-To: References: Message-ID: Thank you Mark and Matt for your response. @Mark Adams that sounds right because when I used cg and preconditioned with jacobi it worked in parallel. Best, Karthik. From: Mark Adams Date: Monday, 28 November 2022 at 13:18 To: Chockalingam, Karthikeyan (STFC,DL,HC) Cc: Matthew Knepley , Zhang, Hong , petsc-users at mcs.anl.gov Subject: Re: [petsc-users] Different solution while running in parallel Maybe I am missing something but I don't see MUMPS in the ksp_view. Just LU. 
The built-in LU is not parallel and so with > 1 MPI process you get one iteration of block Jacobi, so these results look fine to me, at least up to this point. Mark On Thu, Nov 17, 2022 at 9:37 AM Karthikeyan Chockalingam - STFC UKRI via petsc-users > wrote: Hi Matt and Hong, Thank you for your response. I made the following changes, to get the desired output PetscReal norm; /* norm of solution error */ PetscInt its; KSPConvergedReason reason; PetscViewerAndFormat *vf; PetscViewerAndFormatCreate(PETSC_VIEWER_STDOUT_WORLD, PETSC_VIEWER_DEFAULT, &vf); ierr = KSPView(ksp, PETSC_VIEWER_STDOUT_SELF);CHKERRQ(ierr); KSPSolve(ksp, b, x); ierr = KSPMonitorTrueResidual(ksp,its,norm,vf);CHKERRQ(ierr); ierr = KSPMonitorSingularValue(ksp, its, norm, vf);CHKERRQ(ierr); I have attached the outputs from both the runs. As before, I am also printing A, b, and x. I wonder if it is a memory issue related to mpi library employed. I am currently using openmpi ? should I instead use mpich? Kind regards, Karthik. From: Matthew Knepley > Date: Thursday, 17 November 2022 at 12:19 To: Zhang, Hong > Cc: petsc-users at mcs.anl.gov >, Chockalingam, Karthikeyan (STFC,DL,HC) > Subject: Re: [petsc-users] Different solution while running in parallel On Wed, Nov 16, 2022 at 9:07 PM Zhang, Hong via petsc-users > wrote: Karhik, Can you find out the condition number of your matrix? Also, run using -ksp_view -ksp_monitor_true_residual -ksp_converged_reason and send the two outputs. Thanks, Matt Hong ________________________________ From: petsc-users > on behalf of Karthikeyan Chockalingam - STFC UKRI via petsc-users > Sent: Wednesday, November 16, 2022 6:04 PM To: petsc-users at mcs.anl.gov > Subject: [petsc-users] Different solution while running in parallel Hello, I tried to solve a (FE discretized) Poisson equation using PCLU. For some reason I am getting different solutions while running the problem on one and two cores. I have attached the output file (out.txt) from both the runs. 
I am printing A, b and x from both the runs - while A and b are the same, the solution is different. I am not sure what I am doing wrong. Below is my matrix, vector, and solve setup. Mat A; Vec b, x; ierr = MatCreate(PETSC_COMM_WORLD, &A); CHKERRQ(ierr); ierr = MatSetType(A, MATMPIAIJ); CHKERRQ(ierr); ierr = MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, N, N); CHKERRQ(ierr); ierr = MatMPIAIJSetPreallocation(A,d_nz, NULL, o_nz, NULL); CHKERRQ(ierr); ierr = MatSetOption(A,MAT_SYMMETRIC,PETSC_TRUE); CHKERRQ(ierr); ierr = MatCreateVecs(A, &b, &x); CHKERRQ(ierr); KSP ksp; PC pc; KSPCreate(PETSC_COMM_WORLD, &ksp); KSPSetOperators(ksp, A, A); ierr = KSPSetType(ksp,KSPPREONLY);CHKERRQ(ierr); ierr = KSPGetPC(ksp,&pc);CHKERRQ(ierr); ierr = PCSetType(pc,PCLU);CHKERRQ(ierr); ierr = PCFactorSetMatSolverType(pc,MATSOLVERMUMPS);CHKERRQ(ierr); KSPSolve(ksp, b, x); Thank you for your help. Karthik. -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed...
URL: From bourdin at mcmaster.ca Mon Nov 28 10:09:27 2022 From: bourdin at mcmaster.ca (Blaise Bourdin) Date: Mon, 28 Nov 2022 16:09:27 +0000 Subject: [petsc-users] Postdoctoral openings at McMaster University Message-ID: Dear PETSc users and developers, I am looking for several postdocs for various projects on phase-field models including but not limited to: - Large scale numerical simulation of crack propagation in polycrystals and comparison with experiments - New algorithms and discretization schemes for phase-field fracture - Study of crack nucleation in nominally brittle materials Candidates should have a background in applied mathematics or solid mechanics and familiarity with PETSc. Start dates are flexible but I need to fill at least one position immediately. Interested candidates should reach out to me by email (bourdin at mcmaster.ca ) and attach a curriculum vitae and list of publications. Feel free to forward this posting. Blaise Bourdin - Canada Research Chair in Mathematical and Computational Aspects of Solid Mechanics (Tier 1) Professor, Department of Mathematics & Statistics Hamilton Hall room 409A, McMaster University 1280 Main Street West, Hamilton, Ontario L8S 4K1, Canada https://www.math.mcmaster.ca/bourdin | +1 (905) 525 9140 ext. 27243 From ksi2443 at gmail.com Tue Nov 29 23:52:11 2022 From: ksi2443 at gmail.com (=?UTF-8?B?6rmA7ISx7J21?=) Date: Wed, 30 Nov 2022 14:52:11 +0900 Subject: [petsc-users] About MatMumpsSetIcntl function Message-ID: Hello, I tried to adopt the METIS ordering option in MUMPS by using ' PetscCall(MatMumpsSetIcntl(Mat, 7, 5));' However, there is an error as follows [0]PETSC ERROR: Object is in wrong state [0]PETSC ERROR: Only for factored matrix [0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
[0]PETSC ERROR: Petsc Release Version 3.18.1, unknown [0]PETSC ERROR: ./app on a arch-linux-c-debug named ubuntu by ksi2443 Tue Nov 29 21:12:41 2022 [0]PETSC ERROR: Configure options -download-mumps -download-scalapack -download-parmetis -download-metis [0]PETSC ERROR: #1 MatMumpsSetIcntl() at /home/ksi2443/petsc/src/mat/impls/aij/mpi/mumps/mumps.c:2478 [0]PETSC ERROR: #2 main() at /home/ksi2443/Downloads/coding/a1.c:149 [0]PETSC ERROR: No PETSc Option Table entries How can I fix this error? Thank you for your help. Hyung Kim -------------- next part -------------- An HTML attachment was scrubbed... URL: From jroman at dsic.upv.es Wed Nov 30 01:18:02 2022 From: jroman at dsic.upv.es (Jose E. Roman) Date: Wed, 30 Nov 2022 08:18:02 +0100 Subject: [petsc-users] About MatMumpsSetIcntl function In-Reply-To: References: Message-ID: <8AA60F74-2328-4948-9A62-77E94214293A@dsic.upv.es> You have to call PCFactorGetMatrix() first. See any of the examples that use MatMumpsSetIcntl(), for instance https://petsc.org/release/src/ksp/ksp/tutorials/ex52.c.html Jose > El 30 nov 2022, a las 6:52, ??? escribi?: > > Hello, > > > I tried to adopt METIS option in MUMPS by using > ' PetscCall(MatMumpsSetIcntl(Mat, 7, 5));' > > However, there is an error as follows > > [0]PETSC ERROR: Object is in wrong state > [0]PETSC ERROR: Only for factored matrix > [0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting. > [0]PETSC ERROR: Petsc Release Version 3.18.1, unknown > [0]PETSC ERROR: ./app on a arch-linux-c-debug named ubuntu by ksi2443 Tue Nov 29 21:12:41 2022 > [0]PETSC ERROR: Configure options -download-mumps -download-scalapack -download-parmetis -download-metis > [0]PETSC ERROR: #1 MatMumpsSetIcntl() at /home/ksi2443/petsc/src/mat/impls/aij/mpi/mumps/mumps.c:2478 > [0]PETSC ERROR: #2 main() at /home/ksi2443/Downloads/coding/a1.c:149 > [0]PETSC ERROR: No PETSc Option Table entries > > How can I fix this error? > > Thank you for your help. 
> > Hyung Kim From ksi2443 at gmail.com Wed Nov 30 04:07:56 2022 From: ksi2443 at gmail.com (=?UTF-8?B?6rmA7ISx7J21?=) Date: Wed, 30 Nov 2022 19:07:56 +0900 Subject: [petsc-users] Question About Assembly matrix and declaration of KSP & pc Message-ID: Hello, I'm working on FEM using PETSc. As everyone knows, it is necessary to repeatedly solve Ax=B. Regarding this, I have 4 questions. 1.
There are many steps for preparing KSPSolve. For example > KSPcreate, KSPSetOperators, KSPGetPC, PCSetType, PCFactorSetMatSolverType, > KSPSetFromOptions? > In Nonlinear FEM, there are repeatedly kspsolve for getting answer vector. > Is it correct to do all of the aforementioned processes (KSPcreate, > KSPSetOperators ~~~) for each KSPSolve? Or should I declare it only once at > the beginning and not call it again? > You just do these once at setup but for nonlinear problems KSPSetOperators tells the solver that you have a new matrix and so "matrix setup" work needs to be done. > > 2. If the answer to question 1 is that it must be repeated every > time, should this work be done right before kspsolve, that is, when the > global matrix assembly is finished, or is it irrelevant to performance at > any time? > KSPSetOperators should be set after the new matrix values are set but it might work before. It just sets a pointer to the matrix and flags it as not setup. > > > 3. When performing FEM, local matrices are often scattered in global > matrices depending on connectivity. In this case, which is better in terms > of performance: adding the values one by one with MatSetValue or adding > them all at once with MatSetValues even if they are scattered? > You want to add one element matrix at a time, generally. > > > > > 4. I would like to measure the time of each section of the process. > Which method is recommended? > PETSc methods are timed separateluy, but setup gets folded into KSPSolve unless you call SNESSetUp before the SNES[KSP]Solve. You can add you own timers also https://petsc.org/release/docs/manualpages/Profiling/PetscLogEventRegister/ Mark > > > Thank you for your help. > > > > Hyung Kim > -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From ksi2443 at gmail.com Wed Nov 30 06:51:10 2022 From: ksi2443 at gmail.com (=?UTF-8?B?6rmA7ISx7J21?=) Date: Wed, 30 Nov 2022 21:51:10 +0900 Subject: [petsc-users] Question About Assembly matrix and declaration of KSP & pc In-Reply-To: References: Message-ID: Thank you for your comments. However, I have more questions. 1. Generally, (KSPCreate, KSPSetOperators, KSPGetPC, PCSetType, PCFactorSetMatSolverType, KSPSetFromOptions ) above functions are should be called after each "MatassemblyEnd??" 2. Though reading the user guide, I don't fully understand under what circumstances the functions mentioned above should be called again. Can you explain when each function should be called? Thanks, Hyung Kim 2022? 11? 30? (?) ?? 8:37, Mark Adams ?? ??: > > > On Wed, Nov 30, 2022 at 5:08 AM ??? wrote: > >> Hello, >> >> >> >> I?m working on FEM using PETSc. >> >> As everyone knows, it is necessary to repeatedly solve Ax=B. >> >> Regarding this, I have 4 questions. >> >> >> >> 1. There are many steps for preparing KSPSolve. For example >> KSPcreate, KSPSetOperators, KSPGetPC, PCSetType, PCFactorSetMatSolverType, >> KSPSetFromOptions? >> In Nonlinear FEM, there are repeatedly kspsolve for getting answer >> vector. Is it correct to do all of the aforementioned processes (KSPcreate, >> KSPSetOperators ~~~) for each KSPSolve? Or should I declare it only once at >> the beginning and not call it again? >> > > You just do these once at setup but for nonlinear problems KSPSetOperators > tells the solver that you have a new matrix and so "matrix setup" work > needs to be done. > > >> >> 2. If the answer to question 1 is that it must be repeated every >> time, should this work be done right before kspsolve, that is, when the >> global matrix assembly is finished, or is it irrelevant to performance at >> any time? >> > > KSPSetOperators should be set after the new matrix values are set but it > might work before. It just sets a pointer to the matrix and flags it as not > setup. 
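The create-once, reuse-every-step pattern described in the answers above can be sketched as a hedged PETSc fragment (not a complete program; assemble_jacobian() and assemble_residual() are hypothetical application routines, and error checking is omitted):

```c
/* done once, before the nonlinear loop */
KSP ksp;
PC  pc;
KSPCreate(PETSC_COMM_WORLD, &ksp);
KSPGetPC(ksp, &pc);
PCSetType(pc, PCLU);
PCFactorSetMatSolverType(pc, MATSOLVERMUMPS);
KSPSetFromOptions(ksp);

/* done every Newton step */
for (PetscInt step = 0; step < nsteps; ++step) {
  assemble_jacobian(A);        /* hypothetical: MatSetValues() one element matrix at a
                                  time, then MatAssemblyBegin/End(A, MAT_FINAL_ASSEMBLY) */
  assemble_residual(b);        /* hypothetical */
  KSPSetOperators(ksp, A, A);  /* flags the reassembled matrix; setup happens in the solve */
  KSPSolve(ksp, b, x);
}
```

Only the assembly, KSPSetOperators(), and KSPSolve() calls repeat; the creation and option-setting calls do not.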
> > >> >> >> 3. When performing FEM, local matrices are often scattered in global >> matrices depending on connectivity. In this case, which is better in terms >> of performance: adding the values one by one with MatSetValue or adding >> them all at once with MatSetValues even if they are scattered? >> > > You want to add one element matrix at a time, generally. > > >> >> >> >> >> 4. I would like to measure the time of each section of the process. >> Which method is recommended? >> > > PETSc methods are timed separateluy, but setup gets folded into KSPSolve > unless you call SNESSetUp before the SNES[KSP]Solve. > You can add you own timers also > https://petsc.org/release/docs/manualpages/Profiling/PetscLogEventRegister/ > > Mark > > > >> >> >> Thank you for your help. >> >> >> >> Hyung Kim >> > -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Wed Nov 30 06:59:48 2022 From: knepley at gmail.com (Matthew Knepley) Date: Wed, 30 Nov 2022 07:59:48 -0500 Subject: [petsc-users] Question About Assembly matrix and declaration of KSP & pc In-Reply-To: References: Message-ID: On Wed, Nov 30, 2022 at 7:51 AM ??? wrote: > Thank you for your comments. > However, I have more questions. > > 1. Generally, (KSPCreate, KSPSetOperators, KSPGetPC, PCSetType, > PCFactorSetMatSolverType, KSPSetFromOptions ) > above functions are should be called after each "MatassemblyEnd??" > KSPCreate is called once. You do not need PCSetType, PCFactorSetMatSolverType, KSPSetFromOptions more than once, unless you want to change the solver type. KSPSetOperators is called if you want to change the system matrix. KSPSolve is called when you want to change the rhs. Thanks, Matt > 2. Though reading the user guide, I don't fully understand under what > circumstances the functions mentioned above should be called again. Can you > explain when each function should be called? > > Thanks, > > Hyung Kim > > 2022? 11? 30? (?) ?? 8:37, Mark Adams ?? 
??: > >> >> >> On Wed, Nov 30, 2022 at 5:08 AM ??? wrote: >> >>> Hello, >>> >>> >>> >>> I?m working on FEM using PETSc. >>> >>> As everyone knows, it is necessary to repeatedly solve Ax=B. >>> >>> Regarding this, I have 4 questions. >>> >>> >>> >>> 1. There are many steps for preparing KSPSolve. For example >>> KSPcreate, KSPSetOperators, KSPGetPC, PCSetType, PCFactorSetMatSolverType, >>> KSPSetFromOptions? >>> In Nonlinear FEM, there are repeatedly kspsolve for getting answer >>> vector. Is it correct to do all of the aforementioned processes (KSPcreate, >>> KSPSetOperators ~~~) for each KSPSolve? Or should I declare it only once at >>> the beginning and not call it again? >>> >> >> You just do these once at setup but for nonlinear problems >> KSPSetOperators tells the solver that you have a new matrix and so "matrix >> setup" work needs to be done. >> >> >>> >>> 2. If the answer to question 1 is that it must be repeated every >>> time, should this work be done right before kspsolve, that is, when the >>> global matrix assembly is finished, or is it irrelevant to performance at >>> any time? >>> >> >> KSPSetOperators should be set after the new matrix values are set but it >> might work before. It just sets a pointer to the matrix and flags it as not >> setup. >> >> >>> >>> >>> 3. When performing FEM, local matrices are often scattered in >>> global matrices depending on connectivity. In this case, which is better in >>> terms of performance: adding the values one by one with MatSetValue or >>> adding them all at once with MatSetValues even if they are scattered? >>> >> >> You want to add one element matrix at a time, generally. >> >> >>> >>> >>> >>> >>> 4. I would like to measure the time of each section of the process. >>> Which method is recommended? >>> >> >> PETSc methods are timed separateluy, but setup gets folded into KSPSolve >> unless you call SNESSetUp before the SNES[KSP]Solve. 
>> You can add you own timers also >> https://petsc.org/release/docs/manualpages/Profiling/PetscLogEventRegister/ >> >> Mark >> >> >> >>> >>> >>> Thank you for your help. >>> >>> >>> >>> Hyung Kim >>> >> -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From matteo.semplice at uninsubria.it Wed Nov 30 07:22:33 2022 From: matteo.semplice at uninsubria.it (Matteo Semplice) Date: Wed, 30 Nov 2022 14:22:33 +0100 Subject: [petsc-users] localToGlobal with MIN_VALUES ? Message-ID: Hi. In DMLocalToGlobal only INSERT_VALUES or ADD_VALUES appear to be allowed. Is there a way to perform localToGlobal (or localtolocal) communications inserting the minimum value instead? Best ??? Matteo From ksi2443 at gmail.com Wed Nov 30 07:25:16 2022 From: ksi2443 at gmail.com (=?UTF-8?B?6rmA7ISx7J21?=) Date: Wed, 30 Nov 2022 22:25:16 +0900 Subject: [petsc-users] Question About Assembly matrix and declaration of KSP & pc In-Reply-To: References: Message-ID: In your comments, KSPSetOperators is called if you want to change the system matrix. "change the system matrix" means the components of matrix are changed? I mean the values of some components of matrix are changed. Thanks, Hyung Kim 2022? 11? 30? (?) ?? 10:00, Matthew Knepley ?? ??: > On Wed, Nov 30, 2022 at 7:51 AM ??? wrote: > >> Thank you for your comments. >> However, I have more questions. >> >> 1. Generally, (KSPCreate, KSPSetOperators, KSPGetPC, PCSetType, >> PCFactorSetMatSolverType, KSPSetFromOptions ) >> above functions are should be called after each "MatassemblyEnd??" >> > > KSPCreate is called once. > > You do not need PCSetType, PCFactorSetMatSolverType, KSPSetFromOptions > more than once, unless you want to change the solver type. 
> > KSPSetOperators is called if you want to change the system matrix. > > KSPSolve is called when you want to change the rhs. > > Thanks, > > Matt > > >> 2. Though reading the user guide, I don't fully understand under what >> circumstances the functions mentioned above should be called again. Can you >> explain when each function should be called? >> >> Thanks, >> >> Hyung Kim >> >> 2022? 11? 30? (?) ?? 8:37, Mark Adams ?? ??: >> >>> >>> >>> On Wed, Nov 30, 2022 at 5:08 AM ??? wrote: >>> >>>> Hello, >>>> >>>> >>>> >>>> I?m working on FEM using PETSc. >>>> >>>> As everyone knows, it is necessary to repeatedly solve Ax=B. >>>> >>>> Regarding this, I have 4 questions. >>>> >>>> >>>> >>>> 1. There are many steps for preparing KSPSolve. For example >>>> KSPcreate, KSPSetOperators, KSPGetPC, PCSetType, PCFactorSetMatSolverType, >>>> KSPSetFromOptions? >>>> In Nonlinear FEM, there are repeatedly kspsolve for getting answer >>>> vector. Is it correct to do all of the aforementioned processes (KSPcreate, >>>> KSPSetOperators ~~~) for each KSPSolve? Or should I declare it only once at >>>> the beginning and not call it again? >>>> >>> >>> You just do these once at setup but for nonlinear problems >>> KSPSetOperators tells the solver that you have a new matrix and so "matrix >>> setup" work needs to be done. >>> >>> >>>> >>>> 2. If the answer to question 1 is that it must be repeated every >>>> time, should this work be done right before kspsolve, that is, when the >>>> global matrix assembly is finished, or is it irrelevant to performance at >>>> any time? >>>> >>> >>> KSPSetOperators should be set after the new matrix values are set but it >>> might work before. It just sets a pointer to the matrix and flags it as not >>> setup. >>> >>> >>>> >>>> >>>> 3. When performing FEM, local matrices are often scattered in >>>> global matrices depending on connectivity. 
In this case, which is better in >>>> terms of performance: adding the values one by one with MatSetValue or >>>> adding them all at once with MatSetValues even if they are scattered? >>>> >>> >>> You want to add one element matrix at a time, generally. >>> >>> >>>> >>>> >>>> >>>> >>>> 4. I would like to measure the time of each section of the >>>> process. Which method is recommended? >>>> >>> >>> PETSc methods are timed separateluy, but setup gets folded into KSPSolve >>> unless you call SNESSetUp before the SNES[KSP]Solve. >>> You can add you own timers also >>> https://petsc.org/release/docs/manualpages/Profiling/PetscLogEventRegister/ >>> >>> Mark >>> >>> >>> >>>> >>>> >>>> Thank you for your help. >>>> >>>> >>>> >>>> Hyung Kim >>>> >>> > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > > https://www.cse.buffalo.edu/~knepley/ > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Wed Nov 30 07:31:11 2022 From: knepley at gmail.com (Matthew Knepley) Date: Wed, 30 Nov 2022 08:31:11 -0500 Subject: [petsc-users] Question About Assembly matrix and declaration of KSP & pc In-Reply-To: References: Message-ID: On Wed, Nov 30, 2022 at 8:25 AM ??? wrote: > > In your comments, > KSPSetOperators is called if you want to change the system matrix. > > "change the system matrix" means the components of matrix are changed? > I mean the values of some components of matrix are changed. > If you just change values in the matrix, you do not have to call it again. Thanks, Matt > Thanks, > Hyung Kim > > > 2022? 11? 30? (?) ?? 10:00, Matthew Knepley ?? ??: > >> On Wed, Nov 30, 2022 at 7:51 AM ??? wrote: >> >>> Thank you for your comments. >>> However, I have more questions. >>> >>> 1. 
Generally, (KSPCreate, KSPSetOperators, KSPGetPC, PCSetType, >>> PCFactorSetMatSolverType, KSPSetFromOptions ) >>> above functions are should be called after each "MatassemblyEnd??" >>> >> >> KSPCreate is called once. >> >> You do not need PCSetType, PCFactorSetMatSolverType, KSPSetFromOptions >> more than once, unless you want to change the solver type. >> >> KSPSetOperators is called if you want to change the system matrix. >> >> KSPSolve is called when you want to change the rhs. >> >> Thanks, >> >> Matt >> >> >>> 2. Though reading the user guide, I don't fully understand under what >>> circumstances the functions mentioned above should be called again. Can you >>> explain when each function should be called? >>> >>> Thanks, >>> >>> Hyung Kim >>> >>> 2022? 11? 30? (?) ?? 8:37, Mark Adams ?? ??: >>> >>>> >>>> >>>> On Wed, Nov 30, 2022 at 5:08 AM ??? wrote: >>>> >>>>> Hello, >>>>> >>>>> >>>>> >>>>> I?m working on FEM using PETSc. >>>>> >>>>> As everyone knows, it is necessary to repeatedly solve Ax=B. >>>>> >>>>> Regarding this, I have 4 questions. >>>>> >>>>> >>>>> >>>>> 1. There are many steps for preparing KSPSolve. For example >>>>> KSPcreate, KSPSetOperators, KSPGetPC, PCSetType, PCFactorSetMatSolverType, >>>>> KSPSetFromOptions? >>>>> In Nonlinear FEM, there are repeatedly kspsolve for getting answer >>>>> vector. Is it correct to do all of the aforementioned processes (KSPcreate, >>>>> KSPSetOperators ~~~) for each KSPSolve? Or should I declare it only once at >>>>> the beginning and not call it again? >>>>> >>>> >>>> You just do these once at setup but for nonlinear problems >>>> KSPSetOperators tells the solver that you have a new matrix and so "matrix >>>> setup" work needs to be done. >>>> >>>> >>>>> >>>>> 2. If the answer to question 1 is that it must be repeated every >>>>> time, should this work be done right before kspsolve, that is, when the >>>>> global matrix assembly is finished, or is it irrelevant to performance at >>>>> any time? 
>>>>> >>>> >>>> KSPSetOperators should be set after the new matrix values are set but >>>> it might work before. It just sets a pointer to the matrix and flags it as >>>> not setup. >>>> >>>> >>>>> >>>>> >>>>> 3. When performing FEM, local matrices are often scattered in >>>>> global matrices depending on connectivity. In this case, which is better in >>>>> terms of performance: adding the values one by one with MatSetValue or >>>>> adding them all at once with MatSetValues even if they are scattered? >>>>> >>>> >>>> You want to add one element matrix at a time, generally. >>>> >>>> >>>>> >>>>> >>>>> >>>>> >>>>> 4. I would like to measure the time of each section of the >>>>> process. Which method is recommended? >>>>> >>>> >>>> PETSc methods are timed separateluy, but setup gets folded into >>>> KSPSolve unless you call SNESSetUp before the SNES[KSP]Solve. >>>> You can add you own timers also >>>> https://petsc.org/release/docs/manualpages/Profiling/PetscLogEventRegister/ >>>> >>>> Mark >>>> >>>> >>>> >>>>> >>>>> >>>>> Thank you for your help. >>>>> >>>>> >>>>> >>>>> Hyung Kim >>>>> >>>> >> >> -- >> What most experimenters take for granted before they begin their >> experiments is infinitely more interesting than any results to which their >> experiments lead. >> -- Norbert Wiener >> >> https://www.cse.buffalo.edu/~knepley/ >> >> > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Wed Nov 30 07:35:17 2022 From: knepley at gmail.com (Matthew Knepley) Date: Wed, 30 Nov 2022 08:35:17 -0500 Subject: [petsc-users] localToGlobal with MIN_VALUES ? In-Reply-To: References: Message-ID: On Wed, Nov 30, 2022 at 8:22 AM Matteo Semplice < matteo.semplice at uninsubria.it> wrote: > Hi. 
> > In DMLocalToGlobal only INSERT_VALUES or ADD_VALUES appear to be allowed. > > Is there a way to perform localToGlobal (or localtolocal) communications > inserting the minimum value instead? > It could be added. What is it for? Thanks, Matt > Best > > Matteo > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From ksi2443 at gmail.com Wed Nov 30 07:39:55 2022 From: ksi2443 at gmail.com (=?UTF-8?B?6rmA7ISx7J21?=) Date: Wed, 30 Nov 2022 22:39:55 +0900 Subject: [petsc-users] About MatMumpsSetIcntl function In-Reply-To: <8AA60F74-2328-4948-9A62-77E94214293A@dsic.upv.es> References: <8AA60F74-2328-4948-9A62-77E94214293A@dsic.upv.es> Message-ID: Following your comments, After matrix assembly end, PetscCall(KSPGetPC(ksp,&pc)); PetscCall(KSPSetFromOptions(ksp)); PetscCall(KSPSetUp(ksp)); PetscCall(PCFactorGetMatrix(pc,&xGK)); However there is another error as below. [0]PETSC ERROR: Object is in wrong state [0]PETSC ERROR: Not for factored matrix [0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting. [0]PETSC ERROR: Petsc Release Version 3.18.1, unknown [0]PETSC ERROR: ./app on a arch-linux-c-debug named ubuntu by ksi2443 Wed Nov 30 05:37:52 2022 [0]PETSC ERROR: Configure options -download-mumps -download-scalapack -download-parmetis -download-metis [0]PETSC ERROR: #1 MatZeroEntries() at /home/ksi2443/petsc/src/mat/interface/matrix.c:6024 [0]PETSC ERROR: #2 main() at /home/ksi2443/Downloads/coding/a1.c:339 [0]PETSC ERROR: No PETSc Option Table entries How can I fix this? Thanks, Hyung Kim 2022? 11? 30? (?) ?? 4:18, Jose E. Roman ?? ??: > You have to call PCFactorGetMatrix() first. 
See any of the examples that > use MatMumpsSetIcntl(), for instance > https://petsc.org/release/src/ksp/ksp/tutorials/ex52.c.html > > Jose > > > > El 30 nov 2022, a las 6:52, ??? escribi?: > > > > Hello, > > > > > > I tried to adopt METIS option in MUMPS by using > > ' PetscCall(MatMumpsSetIcntl(Mat, 7, 5));' > > > > However, there is an error as follows > > > > [0]PETSC ERROR: Object is in wrong state > > [0]PETSC ERROR: Only for factored matrix > > [0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting. > > [0]PETSC ERROR: Petsc Release Version 3.18.1, unknown > > [0]PETSC ERROR: ./app on a arch-linux-c-debug named ubuntu by ksi2443 > Tue Nov 29 21:12:41 2022 > > [0]PETSC ERROR: Configure options -download-mumps -download-scalapack > -download-parmetis -download-metis > > [0]PETSC ERROR: #1 MatMumpsSetIcntl() at > /home/ksi2443/petsc/src/mat/impls/aij/mpi/mumps/mumps.c:2478 > > [0]PETSC ERROR: #2 main() at /home/ksi2443/Downloads/coding/a1.c:149 > > [0]PETSC ERROR: No PETSc Option Table entries > > > > How can I fix this error? > > > > Thank you for your help. > > > > > > Hyung Kim > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Wed Nov 30 07:44:03 2022 From: knepley at gmail.com (Matthew Knepley) Date: Wed, 30 Nov 2022 08:44:03 -0500 Subject: [petsc-users] About MatMumpsSetIcntl function In-Reply-To: References: <8AA60F74-2328-4948-9A62-77E94214293A@dsic.upv.es> Message-ID: On Wed, Nov 30, 2022 at 8:40 AM ??? wrote: > Following your comments, > > After matrix assembly end, > PetscCall(KSPGetPC(ksp,&pc)); > PetscCall(KSPSetFromOptions(ksp)); > PetscCall(KSPSetUp(ksp)); > PetscCall(PCFactorGetMatrix(pc,&xGK)); > > However there is another error as below. > [0]PETSC ERROR: Object is in wrong state > [0]PETSC ERROR: Not for factored matrix > The error message is telling you that you cannot alter values in the factored matrix. 
This is because the direct solvers use their own internal storage formats which we cannot alter, and you should probably not alter either. What are you trying to do? Thanks, Matt > [0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting. > [0]PETSC ERROR: Petsc Release Version 3.18.1, unknown > [0]PETSC ERROR: ./app on a arch-linux-c-debug named ubuntu by ksi2443 Wed > Nov 30 05:37:52 2022 > [0]PETSC ERROR: Configure options -download-mumps -download-scalapack > -download-parmetis -download-metis > [0]PETSC ERROR: #1 MatZeroEntries() at > /home/ksi2443/petsc/src/mat/interface/matrix.c:6024 > [0]PETSC ERROR: #2 main() at /home/ksi2443/Downloads/coding/a1.c:339 > [0]PETSC ERROR: No PETSc Option Table entries > > How can I fix this? > > > Thanks, > Hyung Kim > > > 2022? 11? 30? (?) ?? 4:18, Jose E. Roman ?? ??: > >> You have to call PCFactorGetMatrix() first. See any of the examples that >> use MatMumpsSetIcntl(), for instance >> https://petsc.org/release/src/ksp/ksp/tutorials/ex52.c.html >> >> Jose >> >> >> > El 30 nov 2022, a las 6:52, ??? escribi?: >> > >> > Hello, >> > >> > >> > I tried to adopt METIS option in MUMPS by using >> > ' PetscCall(MatMumpsSetIcntl(Mat, 7, 5));' >> > >> > However, there is an error as follows >> > >> > [0]PETSC ERROR: Object is in wrong state >> > [0]PETSC ERROR: Only for factored matrix >> > [0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble >> shooting. >> > [0]PETSC ERROR: Petsc Release Version 3.18.1, unknown >> > [0]PETSC ERROR: ./app on a arch-linux-c-debug named ubuntu by ksi2443 >> Tue Nov 29 21:12:41 2022 >> > [0]PETSC ERROR: Configure options -download-mumps -download-scalapack >> -download-parmetis -download-metis >> > [0]PETSC ERROR: #1 MatMumpsSetIcntl() at >> /home/ksi2443/petsc/src/mat/impls/aij/mpi/mumps/mumps.c:2478 >> > [0]PETSC ERROR: #2 main() at /home/ksi2443/Downloads/coding/a1.c:149 >> > [0]PETSC ERROR: No PETSc Option Table entries >> > >> > How can I fix this error? 
>> > >> > Thank you for your help. >> > >> > >> > Hyung Kim >> >> -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Wed Nov 30 07:50:59 2022 From: knepley at gmail.com (Matthew Knepley) Date: Wed, 30 Nov 2022 08:50:59 -0500 Subject: [petsc-users] About Q3 tensor product Hermite element In-Reply-To: References: Message-ID: On Wed, Oct 5, 2022 at 4:39 PM Duan Junming via petsc-users < petsc-users at mcs.anl.gov> wrote: > Dear all, > I'm sorry. This email got lost in my inbox. > I need to use Q3 tensor product Hermite element in 2D (point value, > gradient, and mixed derivative at 4 vertices in a cell as unknowns). > > Is it available in PETSc FEM module now? I found that only Lagrange > element is available. > We have elements that have moments and evaluations as dofs, so also Hdiv, Hcurl, etc. However, you are right that we do not have derivative dofs builtin. I have created them by hand before. > If not, what is the correct path to implement Q3 tensor product Hermite > element? > > I think I should create my own petscspace and petscdualspace? > You can probably use the PetscSpace polynomial class. You would need a PetscDualSpace that used derivative dofs. When I did this before, I just made an FD approximation with a quadrature rule. Toby might have a nicer way to differentiate the basis now. > Or is there any package that has already provided this? > Firedrake has them, but only on simplices. Thanks, Matt > Thanks for any suggestions! > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. 
-- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Wed Nov 30 07:55:43 2022 From: knepley at gmail.com (Matthew Knepley) Date: Wed, 30 Nov 2022 08:55:43 -0500 Subject: [petsc-users] Loading labels from a hdf5 mesh file in parallel In-Reply-To: References: Message-ID: On Thu, Mar 3, 2022 at 10:33 PM Yi Jiang wrote: > Dear Petsc developers, > We are trying to use the HDF5_XDMF parallel I/O feature to read/write > unstructured meshes. By trying some tests, we found that the topology and > geometry data (i.e., cells and vertices) can be efficiently loaded in a > scalable way, which is very impressive! However, we also found that all > labels (such as `cell sets', `face sets') in the .h5 file are ignored (the > .h5 file was converted from an EXODUSII mesh, by using a PetscViewer with > PETSC_VIEWER_HDF5_XDMF format). Hence, we are wondering, does the latest > Petsc also support to import these labels in parallel? In particular, > we would like to redistribute the mesh after it is parallelly loaded by the > naive partition. If so, could you please show me where to find an example > to learn the techniques? > Thank you very much for devoting the continuous efforts to the community > and keep developing these wonderful features. > Sorry, this mail got lost in my inbox. I believe we have fixed this now, but I would be willing to incorporate any tests that are needed for the operations you want. Thanks, Matt > Best regards, > YJ > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... 
URL: 

From ksi2443 at gmail.com  Wed Nov 30 07:58:12 2022
From: ksi2443 at gmail.com (=?UTF-8?B?6rmA7ISx7J21?=)
Date: Wed, 30 Nov 2022 22:58:12 +0900
Subject: [petsc-users] About MatMumpsSetIcntl function
In-Reply-To: 
References: <8AA60F74-2328-4948-9A62-77E94214293A@dsic.upv.es>
Message-ID: 

I'm working on FEM.
When I used MUMPS alone, I found it efficient to use MUMPS with METIS.
So my purpose is using the MUMPS solver with METIS.

I tried to set METIS (by icntl_7 : 5) after global matrix assembly and
just before KSPSolve. However, there is an error because of
'PCFactorGetMatrix' and 'MatMumpsSetIcntl'.

How can I fix this?

Thanks,
Hyung Kim

On Wed, Nov 30, 2022 at 10:44 PM, Matthew Knepley wrote:

> On Wed, Nov 30, 2022 at 8:40 AM 김성익 wrote:
>
>> Following your comments,
>>
>> After matrix assembly end,
>> PetscCall(KSPGetPC(ksp,&pc));
>> PetscCall(KSPSetFromOptions(ksp));
>> PetscCall(KSPSetUp(ksp));
>> PetscCall(PCFactorGetMatrix(pc,&xGK));
>>
>> However there is another error as below.
>> [0]PETSC ERROR: Object is in wrong state
>> [0]PETSC ERROR: Not for factored matrix
>>
> The error message is telling you that you cannot alter values in the
> factored matrix. This is because the direct solvers use their own
> internal storage formats which we cannot alter, and you should probably
> not alter either.
>
> What are you trying to do?
>
> Thanks,
>
>   Matt
>
>> [0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
>> [0]PETSC ERROR: Petsc Release Version 3.18.1, unknown
>> [0]PETSC ERROR: ./app on a arch-linux-c-debug named ubuntu by ksi2443 Wed
>> Nov 30 05:37:52 2022
>> [0]PETSC ERROR: Configure options -download-mumps -download-scalapack
>> -download-parmetis -download-metis
>> [0]PETSC ERROR: #1 MatZeroEntries() at
>> /home/ksi2443/petsc/src/mat/interface/matrix.c:6024
>> [0]PETSC ERROR: #2 main() at /home/ksi2443/Downloads/coding/a1.c:339
>> [0]PETSC ERROR: No PETSc Option Table entries
>>
>> How can I fix this?
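
The purely options-based route that Matt suggests later in this thread avoids touching the factored matrix in user code at all: configure the PC in code, call `KSPSetFromOptions()` afterwards, and pass everything on the command line. A hedged sketch of such a run (the `./app` executable name and option values mirror the thread and are illustrative, not prescriptive):

```
# Select LU via MUMPS and request ICNTL(7)=5 (METIS ordering) entirely
# from the command line; no MatMumpsSetIcntl() call is needed.
./app -ksp_type preonly -pc_type lu \
      -pc_factor_mat_solver_type mumps \
      -mat_mumps_icntl_7 5 \
      -ksp_view
```

If `-mat_mumps_icntl_7` is reported as an unused option, that usually means `KSPSetFromOptions()` was not called, or the PC was not set to LU/MUMPS before the options were processed.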
>> >> >> Thanks, >> Hyung Kim >> >> >> 2022? 11? 30? (?) ?? 4:18, Jose E. Roman ?? ??: >> >>> You have to call PCFactorGetMatrix() first. See any of the examples that >>> use MatMumpsSetIcntl(), for instance >>> https://petsc.org/release/src/ksp/ksp/tutorials/ex52.c.html >>> >>> Jose >>> >>> >>> > El 30 nov 2022, a las 6:52, ??? escribi?: >>> > >>> > Hello, >>> > >>> > >>> > I tried to adopt METIS option in MUMPS by using >>> > ' PetscCall(MatMumpsSetIcntl(Mat, 7, 5));' >>> > >>> > However, there is an error as follows >>> > >>> > [0]PETSC ERROR: Object is in wrong state >>> > [0]PETSC ERROR: Only for factored matrix >>> > [0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble >>> shooting. >>> > [0]PETSC ERROR: Petsc Release Version 3.18.1, unknown >>> > [0]PETSC ERROR: ./app on a arch-linux-c-debug named ubuntu by ksi2443 >>> Tue Nov 29 21:12:41 2022 >>> > [0]PETSC ERROR: Configure options -download-mumps -download-scalapack >>> -download-parmetis -download-metis >>> > [0]PETSC ERROR: #1 MatMumpsSetIcntl() at >>> /home/ksi2443/petsc/src/mat/impls/aij/mpi/mumps/mumps.c:2478 >>> > [0]PETSC ERROR: #2 main() at /home/ksi2443/Downloads/coding/a1.c:149 >>> > [0]PETSC ERROR: No PETSc Option Table entries >>> > >>> > How can I fix this error? >>> > >>> > Thank you for your help. >>> > >>> > >>> > Hyung Kim >>> >>> > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > > https://www.cse.buffalo.edu/~knepley/ > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Wed Nov 30 08:03:52 2022 From: knepley at gmail.com (Matthew Knepley) Date: Wed, 30 Nov 2022 09:03:52 -0500 Subject: [petsc-users] About MatMumpsSetIcntl function In-Reply-To: References: <8AA60F74-2328-4948-9A62-77E94214293A@dsic.upv.es> Message-ID: On Wed, Nov 30, 2022 at 8:58 AM ??? wrote: > I'm working on FEM. 
> When I used mumps alone, I fount it efficient to use mumps with metis. > So my purpose is using MUMPSsolver with METIS. > > I tried to set metis (by icntl_7 : 5) after global matrix assembly and > just before kspsolve. > However there is error because of 'pcfactorgetmatrix' and > 'matmumpsseticntl'. > > How can I fix this? > Give the Icntrl as an option. Thanks, Matt > Thanks, > Hyung Kim > > 2022? 11? 30? (?) ?? 10:44, Matthew Knepley ?? ??: > >> On Wed, Nov 30, 2022 at 8:40 AM ??? wrote: >> >>> Following your comments, >>> >>> After matrix assembly end, >>> PetscCall(KSPGetPC(ksp,&pc)); >>> PetscCall(KSPSetFromOptions(ksp)); >>> PetscCall(KSPSetUp(ksp)); >>> PetscCall(PCFactorGetMatrix(pc,&xGK)); >>> >>> However there is another error as below. >>> [0]PETSC ERROR: Object is in wrong state >>> [0]PETSC ERROR: Not for factored matrix >>> >> >> The error message is telling you that you cannot alter values in the >> factored matrix. This is because >> the direct solvers use their own internal storage formats which we cannot >> alter, and you should probably >> not alter either. >> >> What are you trying to do? >> >> Thanks, >> >> Matt >> >> >>> [0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting. >>> [0]PETSC ERROR: Petsc Release Version 3.18.1, unknown >>> [0]PETSC ERROR: ./app on a arch-linux-c-debug named ubuntu by ksi2443 >>> Wed Nov 30 05:37:52 2022 >>> [0]PETSC ERROR: Configure options -download-mumps -download-scalapack >>> -download-parmetis -download-metis >>> [0]PETSC ERROR: #1 MatZeroEntries() at >>> /home/ksi2443/petsc/src/mat/interface/matrix.c:6024 >>> [0]PETSC ERROR: #2 main() at /home/ksi2443/Downloads/coding/a1.c:339 >>> [0]PETSC ERROR: No PETSc Option Table entries >>> >>> How can I fix this? >>> >>> >>> Thanks, >>> Hyung Kim >>> >>> >>> 2022? 11? 30? (?) ?? 4:18, Jose E. Roman ?? ??: >>> >>>> You have to call PCFactorGetMatrix() first. 
See any of the examples >>>> that use MatMumpsSetIcntl(), for instance >>>> https://petsc.org/release/src/ksp/ksp/tutorials/ex52.c.html >>>> >>>> Jose >>>> >>>> >>>> > El 30 nov 2022, a las 6:52, ??? escribi?: >>>> > >>>> > Hello, >>>> > >>>> > >>>> > I tried to adopt METIS option in MUMPS by using >>>> > ' PetscCall(MatMumpsSetIcntl(Mat, 7, 5));' >>>> > >>>> > However, there is an error as follows >>>> > >>>> > [0]PETSC ERROR: Object is in wrong state >>>> > [0]PETSC ERROR: Only for factored matrix >>>> > [0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble >>>> shooting. >>>> > [0]PETSC ERROR: Petsc Release Version 3.18.1, unknown >>>> > [0]PETSC ERROR: ./app on a arch-linux-c-debug named ubuntu by ksi2443 >>>> Tue Nov 29 21:12:41 2022 >>>> > [0]PETSC ERROR: Configure options -download-mumps -download-scalapack >>>> -download-parmetis -download-metis >>>> > [0]PETSC ERROR: #1 MatMumpsSetIcntl() at >>>> /home/ksi2443/petsc/src/mat/impls/aij/mpi/mumps/mumps.c:2478 >>>> > [0]PETSC ERROR: #2 main() at /home/ksi2443/Downloads/coding/a1.c:149 >>>> > [0]PETSC ERROR: No PETSc Option Table entries >>>> > >>>> > How can I fix this error? >>>> > >>>> > Thank you for your help. >>>> > >>>> > >>>> > Hyung Kim >>>> >>>> >> >> -- >> What most experimenters take for granted before they begin their >> experiments is infinitely more interesting than any results to which their >> experiments lead. >> -- Norbert Wiener >> >> https://www.cse.buffalo.edu/~knepley/ >> >> > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... 
URL: 

From ksi2443 at gmail.com  Wed Nov 30 08:10:36 2022
From: ksi2443 at gmail.com (=?UTF-8?B?6rmA7ISx7J21?=)
Date: Wed, 30 Nov 2022 23:10:36 +0900
Subject: [petsc-users] About MatMumpsSetIcntl function
In-Reply-To: 
References: <8AA60F74-2328-4948-9A62-77E94214293A@dsic.upv.es>
Message-ID: 

When I adopt the ICNTL by using an option, the outputs are as below.

WARNING! There are options you set that were not used!
WARNING! could be spelling mistake, etc!
There is one unused database option. It is:
Option left: name:-mat_mumps_icntl_7 value: 5

Does it work?

Thanks,
Hyung Kim

On Wed, Nov 30, 2022 at 11:04 PM, Matthew Knepley wrote:

> On Wed, Nov 30, 2022 at 8:58 AM 김성익 wrote:
>
>> I'm working on FEM.
>> When I used MUMPS alone, I found it efficient to use MUMPS with METIS.
>> So my purpose is using the MUMPS solver with METIS.
>>
>> I tried to set METIS (by icntl_7 : 5) after global matrix assembly and
>> just before KSPSolve. However, there is an error because of
>> 'PCFactorGetMatrix' and 'MatMumpsSetIcntl'.
>>
>> How can I fix this?
>>
> Give the ICNTL as an option.
>
> Thanks,
>
>   Matt
>
>> Thanks,
>> Hyung Kim
>>
>> On Wed, Nov 30, 2022 at 10:44 PM, Matthew Knepley wrote:
>>
>>> On Wed, Nov 30, 2022 at 8:40 AM 김성익 wrote:
>>>
>>>> Following your comments,
>>>>
>>>> After matrix assembly end,
>>>> PetscCall(KSPGetPC(ksp,&pc));
>>>> PetscCall(KSPSetFromOptions(ksp));
>>>> PetscCall(KSPSetUp(ksp));
>>>> PetscCall(PCFactorGetMatrix(pc,&xGK));
>>>>
>>>> However there is another error as below.
>>>> [0]PETSC ERROR: Object is in wrong state
>>>> [0]PETSC ERROR: Not for factored matrix
>>>>
>>> The error message is telling you that you cannot alter values in the
>>> factored matrix. This is because the direct solvers use their own
>>> internal storage formats which we cannot alter, and you should probably
>>> not alter either.
>>>
>>> What are you trying to do?
>>>
>>> Thanks,
>>>
>>>   Matt
>>>
>>>> [0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
>>>> [0]PETSC ERROR: Petsc Release Version 3.18.1, unknown >>>> [0]PETSC ERROR: ./app on a arch-linux-c-debug named ubuntu by ksi2443 >>>> Wed Nov 30 05:37:52 2022 >>>> [0]PETSC ERROR: Configure options -download-mumps -download-scalapack >>>> -download-parmetis -download-metis >>>> [0]PETSC ERROR: #1 MatZeroEntries() at >>>> /home/ksi2443/petsc/src/mat/interface/matrix.c:6024 >>>> [0]PETSC ERROR: #2 main() at /home/ksi2443/Downloads/coding/a1.c:339 >>>> [0]PETSC ERROR: No PETSc Option Table entries >>>> >>>> How can I fix this? >>>> >>>> >>>> Thanks, >>>> Hyung Kim >>>> >>>> >>>> 2022? 11? 30? (?) ?? 4:18, Jose E. Roman ?? ??: >>>> >>>>> You have to call PCFactorGetMatrix() first. See any of the examples >>>>> that use MatMumpsSetIcntl(), for instance >>>>> https://petsc.org/release/src/ksp/ksp/tutorials/ex52.c.html >>>>> >>>>> Jose >>>>> >>>>> >>>>> > El 30 nov 2022, a las 6:52, ??? escribi?: >>>>> > >>>>> > Hello, >>>>> > >>>>> > >>>>> > I tried to adopt METIS option in MUMPS by using >>>>> > ' PetscCall(MatMumpsSetIcntl(Mat, 7, 5));' >>>>> > >>>>> > However, there is an error as follows >>>>> > >>>>> > [0]PETSC ERROR: Object is in wrong state >>>>> > [0]PETSC ERROR: Only for factored matrix >>>>> > [0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble >>>>> shooting. >>>>> > [0]PETSC ERROR: Petsc Release Version 3.18.1, unknown >>>>> > [0]PETSC ERROR: ./app on a arch-linux-c-debug named ubuntu by >>>>> ksi2443 Tue Nov 29 21:12:41 2022 >>>>> > [0]PETSC ERROR: Configure options -download-mumps >>>>> -download-scalapack -download-parmetis -download-metis >>>>> > [0]PETSC ERROR: #1 MatMumpsSetIcntl() at >>>>> /home/ksi2443/petsc/src/mat/impls/aij/mpi/mumps/mumps.c:2478 >>>>> > [0]PETSC ERROR: #2 main() at /home/ksi2443/Downloads/coding/a1.c:149 >>>>> > [0]PETSC ERROR: No PETSc Option Table entries >>>>> > >>>>> > How can I fix this error? >>>>> > >>>>> > Thank you for your help. 
>>>>> > >>>>> > >>>>> > Hyung Kim >>>>> >>>>> >>> >>> -- >>> What most experimenters take for granted before they begin their >>> experiments is infinitely more interesting than any results to which their >>> experiments lead. >>> -- Norbert Wiener >>> >>> https://www.cse.buffalo.edu/~knepley/ >>> >>> >> > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > > https://www.cse.buffalo.edu/~knepley/ > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Wed Nov 30 08:16:07 2022 From: knepley at gmail.com (Matthew Knepley) Date: Wed, 30 Nov 2022 09:16:07 -0500 Subject: [petsc-users] About MatMumpsSetIcntl function In-Reply-To: References: <8AA60F74-2328-4948-9A62-77E94214293A@dsic.upv.es> Message-ID: On Wed, Nov 30, 2022 at 9:10 AM ??? wrote: > When I adopt icntl by using option, the outputs are as below. > > WARNING! There are options you set that were not used! > WARNING! could be spelling mistake, etc! > There is one unused database option. It is: > Option left: name:-mat_mumps_icntl_7 value: 5 > > Is it work?? > Are you calling KSPSetFromOptions() after the PC is created? -pc_type lu -pc_factor_mat_solver_type mumps -mat_mumps_icntl_7 3 Thanks, Matt > Thanks, > Hyung Kim > > 2022? 11? 30? (?) ?? 11:04, Matthew Knepley ?? ??: > >> On Wed, Nov 30, 2022 at 8:58 AM ??? wrote: >> >>> I'm working on FEM. >>> When I used mumps alone, I fount it efficient to use mumps with >>> metis. >>> So my purpose is using MUMPSsolver with METIS. >>> >>> I tried to set metis (by icntl_7 : 5) after global matrix assembly and >>> just before kspsolve. >>> However there is error because of 'pcfactorgetmatrix' and >>> 'matmumpsseticntl'. >>> >>> How can I fix this? >>> >> >> Give the Icntrl as an option. >> >> Thanks, >> >> Matt >> >> >>> Thanks, >>> Hyung Kim >>> >>> 2022? 11? 30? (?) ?? 
10:44, Matthew Knepley ?? ??: >>> >>>> On Wed, Nov 30, 2022 at 8:40 AM ??? wrote: >>>> >>>>> Following your comments, >>>>> >>>>> After matrix assembly end, >>>>> PetscCall(KSPGetPC(ksp,&pc)); >>>>> PetscCall(KSPSetFromOptions(ksp)); >>>>> PetscCall(KSPSetUp(ksp)); >>>>> PetscCall(PCFactorGetMatrix(pc,&xGK)); >>>>> >>>>> However there is another error as below. >>>>> [0]PETSC ERROR: Object is in wrong state >>>>> [0]PETSC ERROR: Not for factored matrix >>>>> >>>> >>>> The error message is telling you that you cannot alter values in the >>>> factored matrix. This is because >>>> the direct solvers use their own internal storage formats which we >>>> cannot alter, and you should probably >>>> not alter either. >>>> >>>> What are you trying to do? >>>> >>>> Thanks, >>>> >>>> Matt >>>> >>>> >>>>> [0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble >>>>> shooting. >>>>> [0]PETSC ERROR: Petsc Release Version 3.18.1, unknown >>>>> [0]PETSC ERROR: ./app on a arch-linux-c-debug named ubuntu by ksi2443 >>>>> Wed Nov 30 05:37:52 2022 >>>>> [0]PETSC ERROR: Configure options -download-mumps -download-scalapack >>>>> -download-parmetis -download-metis >>>>> [0]PETSC ERROR: #1 MatZeroEntries() at >>>>> /home/ksi2443/petsc/src/mat/interface/matrix.c:6024 >>>>> [0]PETSC ERROR: #2 main() at /home/ksi2443/Downloads/coding/a1.c:339 >>>>> [0]PETSC ERROR: No PETSc Option Table entries >>>>> >>>>> How can I fix this? >>>>> >>>>> >>>>> Thanks, >>>>> Hyung Kim >>>>> >>>>> >>>>> 2022? 11? 30? (?) ?? 4:18, Jose E. Roman ?? ??: >>>>> >>>>>> You have to call PCFactorGetMatrix() first. See any of the examples >>>>>> that use MatMumpsSetIcntl(), for instance >>>>>> https://petsc.org/release/src/ksp/ksp/tutorials/ex52.c.html >>>>>> >>>>>> Jose >>>>>> >>>>>> >>>>>> > El 30 nov 2022, a las 6:52, ??? 
escribi?: >>>>>> > >>>>>> > Hello, >>>>>> > >>>>>> > >>>>>> > I tried to adopt METIS option in MUMPS by using >>>>>> > ' PetscCall(MatMumpsSetIcntl(Mat, 7, 5));' >>>>>> > >>>>>> > However, there is an error as follows >>>>>> > >>>>>> > [0]PETSC ERROR: Object is in wrong state >>>>>> > [0]PETSC ERROR: Only for factored matrix >>>>>> > [0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble >>>>>> shooting. >>>>>> > [0]PETSC ERROR: Petsc Release Version 3.18.1, unknown >>>>>> > [0]PETSC ERROR: ./app on a arch-linux-c-debug named ubuntu by >>>>>> ksi2443 Tue Nov 29 21:12:41 2022 >>>>>> > [0]PETSC ERROR: Configure options -download-mumps >>>>>> -download-scalapack -download-parmetis -download-metis >>>>>> > [0]PETSC ERROR: #1 MatMumpsSetIcntl() at >>>>>> /home/ksi2443/petsc/src/mat/impls/aij/mpi/mumps/mumps.c:2478 >>>>>> > [0]PETSC ERROR: #2 main() at /home/ksi2443/Downloads/coding/a1.c:149 >>>>>> > [0]PETSC ERROR: No PETSc Option Table entries >>>>>> > >>>>>> > How can I fix this error? >>>>>> > >>>>>> > Thank you for your help. >>>>>> > >>>>>> > >>>>>> > Hyung Kim >>>>>> >>>>>> >>>> >>>> -- >>>> What most experimenters take for granted before they begin their >>>> experiments is infinitely more interesting than any results to which their >>>> experiments lead. >>>> -- Norbert Wiener >>>> >>>> https://www.cse.buffalo.edu/~knepley/ >>>> >>>> >>> >> >> -- >> What most experimenters take for granted before they begin their >> experiments is infinitely more interesting than any results to which their >> experiments lead. >> -- Norbert Wiener >> >> https://www.cse.buffalo.edu/~knepley/ >> >> > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From knepley at gmail.com Wed Nov 30 08:26:23 2022 From: knepley at gmail.com (Matthew Knepley) Date: Wed, 30 Nov 2022 09:26:23 -0500 Subject: [petsc-users] About MatMumpsSetIcntl function In-Reply-To: References: <8AA60F74-2328-4948-9A62-77E94214293A@dsic.upv.es> Message-ID: On Wed, Nov 30, 2022 at 9:20 AM ??? wrote: > In my code there are below. > PetscCall(KSPCreate(PETSC_COMM_WORLD, &ksp)); > PetscCall(KSPSetOperators(ksp, xGK, xGK)); > PetscCall(KSPGetPC(ksp, &pc)); > PetscCall(PCSetType(pc, PCLU)); > PetscCall(PCFactorSetMatSolverType(pc, MATSOLVERMUMPS)); > PetscCall(KSPSetFromOptions(ksp)); > > and my runtime options are as below. > mpirun -np 3 ./app -mpi_linear_solver_server > -mpi_linear_solver_server_view -pc_type mpi -ksp_type preonly > -mpi_ksp_monitor -mpi_ksp_converged_reason -mpi_pc_type lu > -pc_mpi_always_use_server -mat_mumps_icntl_7 5 > 1) Get rid of the all server stuff until we see what is wrong with your code 2) Always run in serial until it works ./app -pc_type lu -ksp_type preonly -ksp_monitor_true_residual -ksp_converged_reason -ksp_view -mat_mumps_icntl_7 5 Send the output so we can see what the solver is. Thanks, Matt 2022? 11? 30? (?) ?? 11:16, Matthew Knepley ?? ??: > >> On Wed, Nov 30, 2022 at 9:10 AM ??? wrote: >> >>> When I adopt icntl by using option, the outputs are as below. >>> >>> WARNING! There are options you set that were not used! >>> WARNING! could be spelling mistake, etc! >>> There is one unused database option. It is: >>> Option left: name:-mat_mumps_icntl_7 value: 5 >>> >>> Is it work?? >>> >> >> Are you calling KSPSetFromOptions() after the PC is created? >> >> -pc_type lu -pc_factor_mat_solver_type mumps -mat_mumps_icntl_7 3 >> >> Thanks, >> >> Matt >> >> >>> Thanks, >>> Hyung Kim >>> >>> 2022? 11? 30? (?) ?? 11:04, Matthew Knepley ?? ??: >>> >>>> On Wed, Nov 30, 2022 at 8:58 AM ??? wrote: >>>> >>>>> I'm working on FEM. >>>>> When I used mumps alone, I fount it efficient to use mumps with >>>>> metis. 
>>>>> So my purpose is using MUMPSsolver with METIS. >>>>> >>>>> I tried to set metis (by icntl_7 : 5) after global matrix assembly and >>>>> just before kspsolve. >>>>> However there is error because of 'pcfactorgetmatrix' and >>>>> 'matmumpsseticntl'. >>>>> >>>>> How can I fix this? >>>>> >>>> >>>> Give the Icntrl as an option. >>>> >>>> Thanks, >>>> >>>> Matt >>>> >>>> >>>>> Thanks, >>>>> Hyung Kim >>>>> >>>>> 2022? 11? 30? (?) ?? 10:44, Matthew Knepley ?? ??: >>>>> >>>>>> On Wed, Nov 30, 2022 at 8:40 AM ??? wrote: >>>>>> >>>>>>> Following your comments, >>>>>>> >>>>>>> After matrix assembly end, >>>>>>> PetscCall(KSPGetPC(ksp,&pc)); >>>>>>> PetscCall(KSPSetFromOptions(ksp)); >>>>>>> PetscCall(KSPSetUp(ksp)); >>>>>>> PetscCall(PCFactorGetMatrix(pc,&xGK)); >>>>>>> >>>>>>> However there is another error as below. >>>>>>> [0]PETSC ERROR: Object is in wrong state >>>>>>> [0]PETSC ERROR: Not for factored matrix >>>>>>> >>>>>> >>>>>> The error message is telling you that you cannot alter values in the >>>>>> factored matrix. This is because >>>>>> the direct solvers use their own internal storage formats which we >>>>>> cannot alter, and you should probably >>>>>> not alter either. >>>>>> >>>>>> What are you trying to do? >>>>>> >>>>>> Thanks, >>>>>> >>>>>> Matt >>>>>> >>>>>> >>>>>>> [0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble >>>>>>> shooting. >>>>>>> [0]PETSC ERROR: Petsc Release Version 3.18.1, unknown >>>>>>> [0]PETSC ERROR: ./app on a arch-linux-c-debug named ubuntu by >>>>>>> ksi2443 Wed Nov 30 05:37:52 2022 >>>>>>> [0]PETSC ERROR: Configure options -download-mumps >>>>>>> -download-scalapack -download-parmetis -download-metis >>>>>>> [0]PETSC ERROR: #1 MatZeroEntries() at >>>>>>> /home/ksi2443/petsc/src/mat/interface/matrix.c:6024 >>>>>>> [0]PETSC ERROR: #2 main() at /home/ksi2443/Downloads/coding/a1.c:339 >>>>>>> [0]PETSC ERROR: No PETSc Option Table entries >>>>>>> >>>>>>> How can I fix this? 
>>>>>>> >>>>>>> >>>>>>> Thanks, >>>>>>> Hyung Kim >>>>>>> >>>>>>> >>>>>>> 2022? 11? 30? (?) ?? 4:18, Jose E. Roman ?? ??: >>>>>>> >>>>>>>> You have to call PCFactorGetMatrix() first. See any of the examples >>>>>>>> that use MatMumpsSetIcntl(), for instance >>>>>>>> https://petsc.org/release/src/ksp/ksp/tutorials/ex52.c.html >>>>>>>> >>>>>>>> Jose >>>>>>>> >>>>>>>> >>>>>>>> > El 30 nov 2022, a las 6:52, ??? escribi?: >>>>>>>> > >>>>>>>> > Hello, >>>>>>>> > >>>>>>>> > >>>>>>>> > I tried to adopt METIS option in MUMPS by using >>>>>>>> > ' PetscCall(MatMumpsSetIcntl(Mat, 7, 5));' >>>>>>>> > >>>>>>>> > However, there is an error as follows >>>>>>>> > >>>>>>>> > [0]PETSC ERROR: Object is in wrong state >>>>>>>> > [0]PETSC ERROR: Only for factored matrix >>>>>>>> > [0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble >>>>>>>> shooting. >>>>>>>> > [0]PETSC ERROR: Petsc Release Version 3.18.1, unknown >>>>>>>> > [0]PETSC ERROR: ./app on a arch-linux-c-debug named ubuntu by >>>>>>>> ksi2443 Tue Nov 29 21:12:41 2022 >>>>>>>> > [0]PETSC ERROR: Configure options -download-mumps >>>>>>>> -download-scalapack -download-parmetis -download-metis >>>>>>>> > [0]PETSC ERROR: #1 MatMumpsSetIcntl() at >>>>>>>> /home/ksi2443/petsc/src/mat/impls/aij/mpi/mumps/mumps.c:2478 >>>>>>>> > [0]PETSC ERROR: #2 main() at >>>>>>>> /home/ksi2443/Downloads/coding/a1.c:149 >>>>>>>> > [0]PETSC ERROR: No PETSc Option Table entries >>>>>>>> > >>>>>>>> > How can I fix this error? >>>>>>>> > >>>>>>>> > Thank you for your help. >>>>>>>> > >>>>>>>> > >>>>>>>> > Hyung Kim >>>>>>>> >>>>>>>> >>>>>> >>>>>> -- >>>>>> What most experimenters take for granted before they begin their >>>>>> experiments is infinitely more interesting than any results to which their >>>>>> experiments lead. 
>>>>>> -- Norbert Wiener >>>>>> >>>>>> https://www.cse.buffalo.edu/~knepley/ >>>>>> >>>>>> >>>>> >>>> >>>> -- >>>> What most experimenters take for granted before they begin their >>>> experiments is infinitely more interesting than any results to which their >>>> experiments lead. >>>> -- Norbert Wiener >>>> >>>> https://www.cse.buffalo.edu/~knepley/ >>>> >>>> >>> >> >> -- >> What most experimenters take for granted before they begin their >> experiments is infinitely more interesting than any results to which their >> experiments lead. >> -- Norbert Wiener >> >> https://www.cse.buffalo.edu/~knepley/ >> >> > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From ksi2443 at gmail.com Wed Nov 30 08:20:13 2022 From: ksi2443 at gmail.com (=?UTF-8?B?6rmA7ISx7J21?=) Date: Wed, 30 Nov 2022 23:20:13 +0900 Subject: [petsc-users] About MatMumpsSetIcntl function In-Reply-To: References: <8AA60F74-2328-4948-9A62-77E94214293A@dsic.upv.es> Message-ID: In my code there are below. PetscCall(KSPCreate(PETSC_COMM_WORLD, &ksp)); PetscCall(KSPSetOperators(ksp, xGK, xGK)); PetscCall(KSPGetPC(ksp, &pc)); PetscCall(PCSetType(pc, PCLU)); PetscCall(PCFactorSetMatSolverType(pc, MATSOLVERMUMPS)); PetscCall(KSPSetFromOptions(ksp)); and my runtime options are as below. mpirun -np 3 ./app -mpi_linear_solver_server -mpi_linear_solver_server_view -pc_type mpi -ksp_type preonly -mpi_ksp_monitor -mpi_ksp_converged_reason -mpi_pc_type lu -pc_mpi_always_use_server -mat_mumps_icntl_7 5 2022? 11? 30? (?) ?? 11:16, Matthew Knepley ?? ??: > On Wed, Nov 30, 2022 at 9:10 AM ??? wrote: > >> When I adopt icntl by using option, the outputs are as below. >> >> WARNING! There are options you set that were not used! >> WARNING! could be spelling mistake, etc! 
>> There is one unused database option. It is: >> Option left: name:-mat_mumps_icntl_7 value: 5 >> >> Is it work?? >> > > Are you calling KSPSetFromOptions() after the PC is created? > > -pc_type lu -pc_factor_mat_solver_type mumps -mat_mumps_icntl_7 3 > > Thanks, > > Matt > > >> Thanks, >> Hyung Kim >> >> 2022? 11? 30? (?) ?? 11:04, Matthew Knepley ?? ??: >> >>> On Wed, Nov 30, 2022 at 8:58 AM ??? wrote: >>> >>>> I'm working on FEM. >>>> When I used mumps alone, I fount it efficient to use mumps with >>>> metis. >>>> So my purpose is using MUMPSsolver with METIS. >>>> >>>> I tried to set metis (by icntl_7 : 5) after global matrix assembly and >>>> just before kspsolve. >>>> However there is error because of 'pcfactorgetmatrix' and >>>> 'matmumpsseticntl'. >>>> >>>> How can I fix this? >>>> >>> >>> Give the Icntrl as an option. >>> >>> Thanks, >>> >>> Matt >>> >>> >>>> Thanks, >>>> Hyung Kim >>>> >>>> 2022? 11? 30? (?) ?? 10:44, Matthew Knepley ?? ??: >>>> >>>>> On Wed, Nov 30, 2022 at 8:40 AM ??? wrote: >>>>> >>>>>> Following your comments, >>>>>> >>>>>> After matrix assembly end, >>>>>> PetscCall(KSPGetPC(ksp,&pc)); >>>>>> PetscCall(KSPSetFromOptions(ksp)); >>>>>> PetscCall(KSPSetUp(ksp)); >>>>>> PetscCall(PCFactorGetMatrix(pc,&xGK)); >>>>>> >>>>>> However there is another error as below. >>>>>> [0]PETSC ERROR: Object is in wrong state >>>>>> [0]PETSC ERROR: Not for factored matrix >>>>>> >>>>> >>>>> The error message is telling you that you cannot alter values in the >>>>> factored matrix. This is because >>>>> the direct solvers use their own internal storage formats which we >>>>> cannot alter, and you should probably >>>>> not alter either. >>>>> >>>>> What are you trying to do? >>>>> >>>>> Thanks, >>>>> >>>>> Matt >>>>> >>>>> >>>>>> [0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble >>>>>> shooting. 
>>>>>> [0]PETSC ERROR: Petsc Release Version 3.18.1, unknown >>>>>> [0]PETSC ERROR: ./app on a arch-linux-c-debug named ubuntu by ksi2443 >>>>>> Wed Nov 30 05:37:52 2022 >>>>>> [0]PETSC ERROR: Configure options -download-mumps -download-scalapack >>>>>> -download-parmetis -download-metis >>>>>> [0]PETSC ERROR: #1 MatZeroEntries() at >>>>>> /home/ksi2443/petsc/src/mat/interface/matrix.c:6024 >>>>>> [0]PETSC ERROR: #2 main() at /home/ksi2443/Downloads/coding/a1.c:339 >>>>>> [0]PETSC ERROR: No PETSc Option Table entries >>>>>> >>>>>> How can I fix this? >>>>>> >>>>>> >>>>>> Thanks, >>>>>> Hyung Kim >>>>>> >>>>>> >>>>>> 2022? 11? 30? (?) ?? 4:18, Jose E. Roman ?? ??: >>>>>> >>>>>>> You have to call PCFactorGetMatrix() first. See any of the examples >>>>>>> that use MatMumpsSetIcntl(), for instance >>>>>>> https://petsc.org/release/src/ksp/ksp/tutorials/ex52.c.html >>>>>>> >>>>>>> Jose >>>>>>> >>>>>>> >>>>>>> > El 30 nov 2022, a las 6:52, ??? escribi?: >>>>>>> > >>>>>>> > Hello, >>>>>>> > >>>>>>> > >>>>>>> > I tried to adopt METIS option in MUMPS by using >>>>>>> > ' PetscCall(MatMumpsSetIcntl(Mat, 7, 5));' >>>>>>> > >>>>>>> > However, there is an error as follows >>>>>>> > >>>>>>> > [0]PETSC ERROR: Object is in wrong state >>>>>>> > [0]PETSC ERROR: Only for factored matrix >>>>>>> > [0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble >>>>>>> shooting. >>>>>>> > [0]PETSC ERROR: Petsc Release Version 3.18.1, unknown >>>>>>> > [0]PETSC ERROR: ./app on a arch-linux-c-debug named ubuntu by >>>>>>> ksi2443 Tue Nov 29 21:12:41 2022 >>>>>>> > [0]PETSC ERROR: Configure options -download-mumps >>>>>>> -download-scalapack -download-parmetis -download-metis >>>>>>> > [0]PETSC ERROR: #1 MatMumpsSetIcntl() at >>>>>>> /home/ksi2443/petsc/src/mat/impls/aij/mpi/mumps/mumps.c:2478 >>>>>>> > [0]PETSC ERROR: #2 main() at >>>>>>> /home/ksi2443/Downloads/coding/a1.c:149 >>>>>>> > [0]PETSC ERROR: No PETSc Option Table entries >>>>>>> > >>>>>>> > How can I fix this error? 
>>>>>>> > >>>>>>> > Thank you for your help. >>>>>>> > >>>>>>> > >>>>>>> > Hyung Kim >>>>>>> >>>>>>> >>>>> >>>>> -- >>>>> What most experimenters take for granted before they begin their >>>>> experiments is infinitely more interesting than any results to which their >>>>> experiments lead. >>>>> -- Norbert Wiener >>>>> >>>>> https://www.cse.buffalo.edu/~knepley/ >>>>> >>>>> >>>> >>> >>> -- >>> What most experimenters take for granted before they begin their >>> experiments is infinitely more interesting than any results to which their >>> experiments lead. >>> -- Norbert Wiener >>> >>> https://www.cse.buffalo.edu/~knepley/ >>> >>> >> > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > > https://www.cse.buffalo.edu/~knepley/ > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From ksi2443 at gmail.com Wed Nov 30 08:31:25 2022 From: ksi2443 at gmail.com (=?UTF-8?B?6rmA7ISx7J21?=) Date: Wed, 30 Nov 2022 23:31:25 +0900 Subject: [petsc-users] About MatMumpsSetIcntl function In-Reply-To: References: <8AA60F74-2328-4948-9A62-77E94214293A@dsic.upv.es> Message-ID: After following the comment, ./app -pc_type lu -ksp_type preonly -ksp_monitor_true_residual -ksp_converged_reason -ksp_view -mat_mumps_icntl_7 5 The outputs are as below. 0 KSP none resid norm 2.000000000000e+00 true resid norm 4.241815708566e-16 ||r(i)||/||b|| 2.120907854283e-16 1 KSP none resid norm 4.241815708566e-16 true resid norm 4.241815708566e-16 ||r(i)||/||b|| 2.120907854283e-16 Linear solve converged due to CONVERGED_ITS iterations 1 KSP Object: 1 MPI process type: preonly maximum iterations=10000, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000. 
left preconditioning using NONE norm type for convergence test PC Object: 1 MPI process type: lu out-of-place factorization tolerance for zero pivot 2.22045e-14 matrix ordering: external factor fill ratio given 0., needed 0. Factored matrix follows: Mat Object: 1 MPI process type: mumps rows=24, cols=24 package used to perform factorization: mumps total: nonzeros=576, allocated nonzeros=576 MUMPS run parameters: Use -ksp_view ::ascii_info_detail to display information for all processes RINFOG(1) (global estimated flops for the elimination after analysis): 8924. RINFOG(2) (global estimated flops for the assembly after factorization): 0. RINFOG(3) (global estimated flops for the elimination after factorization): 8924. (RINFOG(12) RINFOG(13))*2^INFOG(34) (determinant): (0.,0.)*(2^0) INFOG(3) (estimated real workspace for factors on all processors after analysis): 576 INFOG(4) (estimated integer workspace for factors on all processors after analysis): 68 INFOG(5) (estimated maximum front size in the complete tree): 24 INFOG(6) (number of nodes in the complete tree): 1 INFOG(7) (ordering option effectively used after analysis): 5 INFOG(8) (structural symmetry in percent of the permuted matrix after analysis): 100 INFOG(9) (total real/complex workspace to store the matrix factors after factorization): 576 INFOG(10) (total integer space store the matrix factors after factorization): 68 INFOG(11) (order of largest frontal matrix after factorization): 24 INFOG(12) (number of off-diagonal pivots): 0 INFOG(13) (number of delayed pivots after factorization): 0 INFOG(14) (number of memory compress after factorization): 0 INFOG(15) (number of steps of iterative refinement after solution): 0 INFOG(16) (estimated size (in MB) of all MUMPS internal data for factorization after analysis: value on the most memory consuming processor): 0 INFOG(17) (estimated size of all MUMPS internal data for factorization after analysis: sum over all processors): 0 INFOG(18) (size of all MUMPS 
internal data allocated during factorization: value on the most memory consuming processor): 0 INFOG(19) (size of all MUMPS internal data allocated during factorization: sum over all processors): 0 INFOG(20) (estimated number of entries in the factors): 576 INFOG(21) (size in MB of memory effectively used during factorization - value on the most memory consuming processor): 0 INFOG(22) (size in MB of memory effectively used during factorization - sum over all processors): 0 INFOG(23) (after analysis: value of ICNTL(6) effectively used): 0 INFOG(24) (after analysis: value of ICNTL(12) effectively used): 1 INFOG(25) (after factorization: number of pivots modified by static pivoting): 0 INFOG(28) (after factorization: number of null pivots encountered): 0 INFOG(29) (after factorization: effective number of entries in the factors (sum over all processors)): 576 INFOG(30, 31) (after solution: size in Mbytes of memory used during solution phase): 0, 0 INFOG(32) (after analysis: type of analysis done): 1 INFOG(33) (value used for ICNTL(8)): 7 INFOG(34) (exponent of the determinant if determinant is requested): 0 INFOG(35) (after factorization: number of entries taking into account BLR factor compression - sum over all processors): 576 INFOG(36) (after analysis: estimated size of all MUMPS internal data for running BLR in-core - value on the most memory consuming processor): 0 INFOG(37) (after analysis: estimated size of all MUMPS internal data for running BLR in-core - sum over all processors): 0 INFOG(38) (after analysis: estimated size of all MUMPS internal data for running BLR out-of-core - value on the most memory consuming processor): 0 INFOG(39) (after analysis: estimated size of all MUMPS internal data for running BLR out-of-core - sum over all processors): 0 linear system matrix = precond matrix: Mat Object: 1 MPI process type: seqaij rows=24, cols=24 total: nonzeros=576, allocated nonzeros=840 total number of mallocs used during MatSetValues calls=48 using I-node 
routines: found 5 nodes, limit used is 5 On Wed, Nov 30, 2022 at 11:26 PM, Matthew Knepley wrote: > On Wed, Nov 30, 2022 at 9:20 AM Hyung Kim wrote: > >> In my code, there are the lines below. >> PetscCall(KSPCreate(PETSC_COMM_WORLD, &ksp)); >> PetscCall(KSPSetOperators(ksp, xGK, xGK)); >> PetscCall(KSPGetPC(ksp, &pc)); >> PetscCall(PCSetType(pc, PCLU)); >> PetscCall(PCFactorSetMatSolverType(pc, MATSOLVERMUMPS)); >> PetscCall(KSPSetFromOptions(ksp)); >> >> and my runtime options are as below. >> mpirun -np 3 ./app -mpi_linear_solver_server >> -mpi_linear_solver_server_view -pc_type mpi -ksp_type preonly >> -mpi_ksp_monitor -mpi_ksp_converged_reason -mpi_pc_type lu >> -pc_mpi_always_use_server -mat_mumps_icntl_7 5 >> > > 1) Get rid of all the server stuff until we see what is wrong with your > code > > 2) Always run in serial until it works > > ./app -pc_type lu -ksp_type preonly -ksp_monitor_true_residual > -ksp_converged_reason -ksp_view -mat_mumps_icntl_7 5 > > Send the output so we can see what the solver is. > > Thanks, > > Matt > > On Wed, Nov 30, 2022 at 11:16 PM, Matthew Knepley wrote: >> >>> On Wed, Nov 30, 2022 at 9:10 AM Hyung Kim wrote: >>> >>>> When I set the icntl by using the option, the outputs are as below. >>>> >>>> WARNING! There are options you set that were not used! >>>> WARNING! could be spelling mistake, etc! >>>> There is one unused database option. It is: >>>> Option left: name:-mat_mumps_icntl_7 value: 5 >>>> >>>> Does it work? >>>> >>> >>> Are you calling KSPSetFromOptions() after the PC is created? >>> >>> -pc_type lu -pc_factor_mat_solver_type mumps -mat_mumps_icntl_7 3 >>> >>> Thanks, >>> >>> Matt >>> >>> >>>> Thanks, >>>> Hyung Kim >>>> >>>> On Wed, Nov 30, 2022 at 11:04 PM, Matthew Knepley wrote: >>>> >>>>> On Wed, Nov 30, 2022 at 8:58 AM Hyung Kim wrote: >>>>> >>>>>> I'm working on FEM. >>>>>> When I used mumps alone, I found it efficient to use mumps with >>>>>> metis. >>>>>> So my purpose is using the MUMPS solver with METIS. 
>>>>>> >>>>>> I tried to set metis (by icntl_7 : 5) after global matrix assembly >>>>>> and just before kspsolve. >>>>>> However there is error because of 'pcfactorgetmatrix' and >>>>>> 'matmumpsseticntl'. >>>>>> >>>>>> How can I fix this? >>>>>> >>>>> >>>>> Give the Icntrl as an option. >>>>> >>>>> Thanks, >>>>> >>>>> Matt >>>>> >>>>> >>>>>> Thanks, >>>>>> Hyung Kim >>>>>> >>>>>> 2022? 11? 30? (?) ?? 10:44, Matthew Knepley ?? ??: >>>>>> >>>>>>> On Wed, Nov 30, 2022 at 8:40 AM ??? wrote: >>>>>>> >>>>>>>> Following your comments, >>>>>>>> >>>>>>>> After matrix assembly end, >>>>>>>> PetscCall(KSPGetPC(ksp,&pc)); >>>>>>>> PetscCall(KSPSetFromOptions(ksp)); >>>>>>>> PetscCall(KSPSetUp(ksp)); >>>>>>>> PetscCall(PCFactorGetMatrix(pc,&xGK)); >>>>>>>> >>>>>>>> However there is another error as below. >>>>>>>> [0]PETSC ERROR: Object is in wrong state >>>>>>>> [0]PETSC ERROR: Not for factored matrix >>>>>>>> >>>>>>> >>>>>>> The error message is telling you that you cannot alter values in the >>>>>>> factored matrix. This is because >>>>>>> the direct solvers use their own internal storage formats which we >>>>>>> cannot alter, and you should probably >>>>>>> not alter either. >>>>>>> >>>>>>> What are you trying to do? >>>>>>> >>>>>>> Thanks, >>>>>>> >>>>>>> Matt >>>>>>> >>>>>>> >>>>>>>> [0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble >>>>>>>> shooting. >>>>>>>> [0]PETSC ERROR: Petsc Release Version 3.18.1, unknown >>>>>>>> [0]PETSC ERROR: ./app on a arch-linux-c-debug named ubuntu by >>>>>>>> ksi2443 Wed Nov 30 05:37:52 2022 >>>>>>>> [0]PETSC ERROR: Configure options -download-mumps >>>>>>>> -download-scalapack -download-parmetis -download-metis >>>>>>>> [0]PETSC ERROR: #1 MatZeroEntries() at >>>>>>>> /home/ksi2443/petsc/src/mat/interface/matrix.c:6024 >>>>>>>> [0]PETSC ERROR: #2 main() at /home/ksi2443/Downloads/coding/a1.c:339 >>>>>>>> [0]PETSC ERROR: No PETSc Option Table entries >>>>>>>> >>>>>>>> How can I fix this? 
>>>>>>>> >>>>>>>> >>>>>>>> Thanks, >>>>>>>> Hyung Kim >>>>>>>> >>>>>>>> >>>>>>>> 2022? 11? 30? (?) ?? 4:18, Jose E. Roman ?? ??: >>>>>>>> >>>>>>>>> You have to call PCFactorGetMatrix() first. See any of the >>>>>>>>> examples that use MatMumpsSetIcntl(), for instance >>>>>>>>> https://petsc.org/release/src/ksp/ksp/tutorials/ex52.c.html >>>>>>>>> >>>>>>>>> Jose >>>>>>>>> >>>>>>>>> >>>>>>>>> > El 30 nov 2022, a las 6:52, ??? escribi?: >>>>>>>>> > >>>>>>>>> > Hello, >>>>>>>>> > >>>>>>>>> > >>>>>>>>> > I tried to adopt METIS option in MUMPS by using >>>>>>>>> > ' PetscCall(MatMumpsSetIcntl(Mat, 7, 5));' >>>>>>>>> > >>>>>>>>> > However, there is an error as follows >>>>>>>>> > >>>>>>>>> > [0]PETSC ERROR: Object is in wrong state >>>>>>>>> > [0]PETSC ERROR: Only for factored matrix >>>>>>>>> > [0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble >>>>>>>>> shooting. >>>>>>>>> > [0]PETSC ERROR: Petsc Release Version 3.18.1, unknown >>>>>>>>> > [0]PETSC ERROR: ./app on a arch-linux-c-debug named ubuntu by >>>>>>>>> ksi2443 Tue Nov 29 21:12:41 2022 >>>>>>>>> > [0]PETSC ERROR: Configure options -download-mumps >>>>>>>>> -download-scalapack -download-parmetis -download-metis >>>>>>>>> > [0]PETSC ERROR: #1 MatMumpsSetIcntl() at >>>>>>>>> /home/ksi2443/petsc/src/mat/impls/aij/mpi/mumps/mumps.c:2478 >>>>>>>>> > [0]PETSC ERROR: #2 main() at >>>>>>>>> /home/ksi2443/Downloads/coding/a1.c:149 >>>>>>>>> > [0]PETSC ERROR: No PETSc Option Table entries >>>>>>>>> > >>>>>>>>> > How can I fix this error? >>>>>>>>> > >>>>>>>>> > Thank you for your help. >>>>>>>>> > >>>>>>>>> > >>>>>>>>> > Hyung Kim >>>>>>>>> >>>>>>>>> >>>>>>> >>>>>>> -- >>>>>>> What most experimenters take for granted before they begin their >>>>>>> experiments is infinitely more interesting than any results to which their >>>>>>> experiments lead. 
>>>>>>> -- Norbert Wiener >>>>>>> >>>>>>> https://www.cse.buffalo.edu/~knepley/ >>>>>>> >>>>>>> >>>>>> >>>>> >>>>> -- >>>>> What most experimenters take for granted before they begin their >>>>> experiments is infinitely more interesting than any results to which their >>>>> experiments lead. >>>>> -- Norbert Wiener >>>>> >>>>> https://www.cse.buffalo.edu/~knepley/ >>>>> >>>>> >>>> >>> >>> -- >>> What most experimenters take for granted before they begin their >>> experiments is infinitely more interesting than any results to which their >>> experiments lead. >>> -- Norbert Wiener >>> >>> https://www.cse.buffalo.edu/~knepley/ >>> >>> >> > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > > https://www.cse.buffalo.edu/~knepley/ > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From mfadams at lbl.gov Wed Nov 30 08:44:52 2022 From: mfadams at lbl.gov (Mark Adams) Date: Wed, 30 Nov 2022 09:44:52 -0500 Subject: [petsc-users] Question About Assembly matrix and declaration of KSP & pc In-Reply-To: References: Message-ID: On Wed, Nov 30, 2022 at 8:31 AM Matthew Knepley wrote: > On Wed, Nov 30, 2022 at 8:25 AM ??? wrote: > >> >> In your comments, >> KSPSetOperators is called if you want to change the system matrix. >> >> "change the system matrix" means the components of matrix are changed? >> I mean the values of some components of matrix are changed. >> > > If you just change values in the matrix, you do not have to call it again. > You do need to call KSPSetOperators before the solve if you want the KSP to redo the (PC) setup. In GAMG this is substantial. If the matrix does not change much you can try not doing this and experiment. > > Thanks, > > Matt > > >> Thanks, >> Hyung Kim >> >> >> 2022? 11? 30? (?) ?? 10:00, Matthew Knepley ?? ??: >> >>> On Wed, Nov 30, 2022 at 7:51 AM ??? 
wrote: >>> >>>> Thank you for your comments. >>>> However, I have more questions. >>>> >>>> 1. Generally, should the above functions (KSPCreate, KSPSetOperators, KSPGetPC, PCSetType, >>>> PCFactorSetMatSolverType, KSPSetFromOptions) >>>> be called after each "MatAssemblyEnd"? >>>> >>> >>> KSPCreate is called once. >>> >>> You do not need PCSetType, PCFactorSetMatSolverType, KSPSetFromOptions >>> more than once, unless you want to change the solver type. >>> >>> KSPSetOperators is called if you want to change the system matrix. >>> >>> KSPSolve is called when you want to change the rhs. >>> >>> Thanks, >>> >>> Matt >>> >>> >>>> 2. Though reading the user guide, I don't fully understand under what >>>> circumstances the functions mentioned above should be called again. Can you >>>> explain when each function should be called? >>>> >>>> Thanks, >>>> >>>> Hyung Kim >>>> >>>> On Wed, Nov 30, 2022 at 8:37 PM, Mark Adams wrote: >>>>> >>>>> >>>>> >>>>> On Wed, Nov 30, 2022 at 5:08 AM Hyung Kim wrote: >>>>> >>>>>> Hello, >>>>>> >>>>>> >>>>>> >>>>>> I'm working on FEM using PETSc. >>>>>> >>>>>> As everyone knows, it is necessary to repeatedly solve Ax=B. >>>>>> >>>>>> Regarding this, I have 4 questions. >>>>>> >>>>>> >>>>>> >>>>>> 1. There are many steps for preparing KSPSolve, for example >>>>>> KSPCreate, KSPSetOperators, KSPGetPC, PCSetType, PCFactorSetMatSolverType, >>>>>> KSPSetFromOptions... >>>>>> In nonlinear FEM, KSPSolve is called repeatedly to get the answer >>>>>> vector. Is it correct to do all of the aforementioned processes (KSPCreate, >>>>>> KSPSetOperators, etc.) for each KSPSolve? Or should I declare it only once at >>>>>> the beginning and not call it again? >>>>>> >>>>> >>>>> You just do these once at setup, but for nonlinear problems >>>>> KSPSetOperators tells the solver that you have a new matrix and so "matrix >>>>> setup" work needs to be done. >>>>> >>>>> >>>>>> >>>>>> 2. If the answer to question 1 is that it must be repeated every >>>>>> time, should this work be done right before kspsolve, that is, when the >>>>>> global matrix assembly is finished, or is it irrelevant to performance at >>>>>> any time? >>>>>> >>>>> >>>>> KSPSetOperators should be set after the new matrix values are set, but >>>>> it might work before. It just sets a pointer to the matrix and flags it as >>>>> not set up. >>>>> >>>>> >>>>>> >>>>>> >>>>>> 3. When performing FEM, local matrices are often scattered in >>>>>> global matrices depending on connectivity. In this case, which is better in >>>>>> terms of performance: adding the values one by one with MatSetValue or >>>>>> adding them all at once with MatSetValues even if they are scattered? >>>>>> >>>>> >>>>> You want to add one element matrix at a time, generally. >>>>> >>>>> >>>>>> >>>>>> >>>>>> >>>>>> >>>>>> 4. I would like to measure the time of each section of the >>>>>> process. Which method is recommended? >>>>>> >>>>> >>>>> PETSc methods are timed separately, but setup gets folded into >>>>> KSPSolve unless you call SNESSetUp before the SNES[KSP]Solve. >>>>> You can also add your own timers: >>>>> https://petsc.org/release/docs/manualpages/Profiling/PetscLogEventRegister/ >>>>> >>>>> Mark >>>>> >>>>> >>>>> >>>>>> >>>>>> >>>>>> Thank you for your help. >>>>>> >>>>>> >>>>>> >>>>>> Hyung Kim >>>>>> >>>>> >>> >>> -- >>> What most experimenters take for granted before they begin their >>> experiments is infinitely more interesting than any results to which their >>> experiments lead. >>> -- Norbert Wiener >>> >>> https://www.cse.buffalo.edu/~knepley/ >>> >>> >> > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > > https://www.cse.buffalo.edu/~knepley/ > > -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From knepley at gmail.com Wed Nov 30 08:54:00 2022 From: knepley at gmail.com (Matthew Knepley) Date: Wed, 30 Nov 2022 09:54:00 -0500 Subject: [petsc-users] About MatMumpsSetIcntl function In-Reply-To: References: <8AA60F74-2328-4948-9A62-77E94214293A@dsic.upv.es> Message-ID: On Wed, Nov 30, 2022 at 9:31 AM ??? wrote: > After folloing the comment, ./app -pc_type lu -ksp_type preonly > -ksp_monitor_true_residual -ksp_converged_reason -ksp_view > -mat_mumps_icntl_7 5 > Okay, you can see that it is using METIS: INFOG(7) (ordering option effectively used after analysis): 5 It looks like the server stuff was not seeing the option. Put it back in and send the output. Thanks, Matt The outputs are as below. > > 0 KSP none resid norm 2.000000000000e+00 true resid norm > 4.241815708566e-16 ||r(i)||/||b|| 2.120907854283e-16 > 1 KSP none resid norm 4.241815708566e-16 true resid norm > 4.241815708566e-16 ||r(i)||/||b|| 2.120907854283e-16 > Linear solve converged due to CONVERGED_ITS iterations 1 > KSP Object: 1 MPI process > type: preonly > maximum iterations=10000, initial guess is zero > tolerances: relative=1e-05, absolute=1e-50, divergence=10000. > left preconditioning > using NONE norm type for convergence test > PC Object: 1 MPI process > type: lu > out-of-place factorization > tolerance for zero pivot 2.22045e-14 > matrix ordering: external > factor fill ratio given 0., needed 0. > Factored matrix follows: > Mat Object: 1 MPI process > type: mumps > rows=24, cols=24 > package used to perform factorization: mumps > total: nonzeros=576, allocated nonzeros=576 > MUMPS run parameters: > Use -ksp_view ::ascii_info_detail to display information for > all processes > RINFOG(1) (global estimated flops for the elimination after > analysis): 8924. > RINFOG(2) (global estimated flops for the assembly after > factorization): 0. > RINFOG(3) (global estimated flops for the elimination after > factorization): 8924. 
> (RINFOG(12) RINFOG(13))*2^INFOG(34) (determinant): > (0.,0.)*(2^0) > INFOG(3) (estimated real workspace for factors on all > processors after analysis): 576 > INFOG(4) (estimated integer workspace for factors on all > processors after analysis): 68 > INFOG(5) (estimated maximum front size in the complete > tree): 24 > INFOG(6) (number of nodes in the complete tree): 1 > INFOG(7) (ordering option effectively used after analysis): 5 > INFOG(8) (structural symmetry in percent of the permuted > matrix after analysis): 100 > INFOG(9) (total real/complex workspace to store the matrix > factors after factorization): 576 > INFOG(10) (total integer space store the matrix factors > after factorization): 68 > INFOG(11) (order of largest frontal matrix after > factorization): 24 > INFOG(12) (number of off-diagonal pivots): 0 > INFOG(13) (number of delayed pivots after factorization): 0 > INFOG(14) (number of memory compress after factorization): 0 > INFOG(15) (number of steps of iterative refinement after > solution): 0 > INFOG(16) (estimated size (in MB) of all MUMPS internal data > for factorization after analysis: value on the most memory consuming > processor): 0 > INFOG(17) (estimated size of all MUMPS internal data for > factorization after analysis: sum over all processors): 0 > INFOG(18) (size of all MUMPS internal data allocated during > factorization: value on the most memory consuming processor): 0 > INFOG(19) (size of all MUMPS internal data allocated during > factorization: sum over all processors): 0 > INFOG(20) (estimated number of entries in the factors): 576 > INFOG(21) (size in MB of memory effectively used during > factorization - value on the most memory consuming processor): 0 > INFOG(22) (size in MB of memory effectively used during > factorization - sum over all processors): 0 > INFOG(23) (after analysis: value of ICNTL(6) effectively > used): 0 > INFOG(24) (after analysis: value of ICNTL(12) effectively > used): 1 > INFOG(25) (after factorization: 
number of pivots modified by > static pivoting): 0 > INFOG(28) (after factorization: number of null pivots > encountered): 0 > INFOG(29) (after factorization: effective number of entries > in the factors (sum over all processors)): 576 > INFOG(30, 31) (after solution: size in Mbytes of memory used > during solution phase): 0, 0 > INFOG(32) (after analysis: type of analysis done): 1 > INFOG(33) (value used for ICNTL(8)): 7 > INFOG(34) (exponent of the determinant if determinant is > requested): 0 > INFOG(35) (after factorization: number of entries taking > into account BLR factor compression - sum over all processors): 576 > INFOG(36) (after analysis: estimated size of all MUMPS > internal data for running BLR in-core - value on the most memory consuming > processor): 0 > INFOG(37) (after analysis: estimated size of all MUMPS > internal data for running BLR in-core - sum over all processors): 0 > INFOG(38) (after analysis: estimated size of all MUMPS > internal data for running BLR out-of-core - value on the most memory > consuming processor): 0 > INFOG(39) (after analysis: estimated size of all MUMPS > internal data for running BLR out-of-core - sum over all processors): 0 > linear system matrix = precond matrix: > Mat Object: 1 MPI process > type: seqaij > rows=24, cols=24 > total: nonzeros=576, allocated nonzeros=840 > total number of mallocs used during MatSetValues calls=48 > using I-node routines: found 5 nodes, limit used is 5 > > > > 2022? 11? 30? (?) ?? 11:26, Matthew Knepley ?? ??: > >> On Wed, Nov 30, 2022 at 9:20 AM ??? wrote: >> >>> In my code there are below. >>> PetscCall(KSPCreate(PETSC_COMM_WORLD, &ksp)); >>> PetscCall(KSPSetOperators(ksp, xGK, xGK)); >>> PetscCall(KSPGetPC(ksp, &pc)); >>> PetscCall(PCSetType(pc, PCLU)); >>> PetscCall(PCFactorSetMatSolverType(pc, MATSOLVERMUMPS)); >>> PetscCall(KSPSetFromOptions(ksp)); >>> >>> and my runtime options are as below. 
>>> mpirun -np 3 ./app -mpi_linear_solver_server >>> -mpi_linear_solver_server_view -pc_type mpi -ksp_type preonly >>> -mpi_ksp_monitor -mpi_ksp_converged_reason -mpi_pc_type lu >>> -pc_mpi_always_use_server -mat_mumps_icntl_7 5 >>> >> >> 1) Get rid of the all server stuff until we see what is wrong with your >> code >> >> 2) Always run in serial until it works >> >> ./app -pc_type lu -ksp_type preonly -ksp_monitor_true_residual >> -ksp_converged_reason -ksp_view -mat_mumps_icntl_7 5 >> >> Send the output so we can see what the solver is. >> >> Thanks, >> >> Matt >> >> 2022? 11? 30? (?) ?? 11:16, Matthew Knepley ?? ??: >>> >>>> On Wed, Nov 30, 2022 at 9:10 AM ??? wrote: >>>> >>>>> When I adopt icntl by using option, the outputs are as below. >>>>> >>>>> WARNING! There are options you set that were not used! >>>>> WARNING! could be spelling mistake, etc! >>>>> There is one unused database option. It is: >>>>> Option left: name:-mat_mumps_icntl_7 value: 5 >>>>> >>>>> Is it work?? >>>>> >>>> >>>> Are you calling KSPSetFromOptions() after the PC is created? >>>> >>>> -pc_type lu -pc_factor_mat_solver_type mumps -mat_mumps_icntl_7 3 >>>> >>>> Thanks, >>>> >>>> Matt >>>> >>>> >>>>> Thanks, >>>>> Hyung Kim >>>>> >>>>> 2022? 11? 30? (?) ?? 11:04, Matthew Knepley ?? ??: >>>>> >>>>>> On Wed, Nov 30, 2022 at 8:58 AM ??? wrote: >>>>>> >>>>>>> I'm working on FEM. >>>>>>> When I used mumps alone, I fount it efficient to use mumps with >>>>>>> metis. >>>>>>> So my purpose is using MUMPSsolver with METIS. >>>>>>> >>>>>>> I tried to set metis (by icntl_7 : 5) after global matrix assembly >>>>>>> and just before kspsolve. >>>>>>> However there is error because of 'pcfactorgetmatrix' and >>>>>>> 'matmumpsseticntl'. >>>>>>> >>>>>>> How can I fix this? >>>>>>> >>>>>> >>>>>> Give the Icntrl as an option. >>>>>> >>>>>> Thanks, >>>>>> >>>>>> Matt >>>>>> >>>>>> >>>>>>> Thanks, >>>>>>> Hyung Kim >>>>>>> >>>>>>> 2022? 11? 30? (?) ?? 10:44, Matthew Knepley ?? 
>>>>>>> ??: >>>>>>> >>>>>>>> On Wed, Nov 30, 2022 at 8:40 AM ??? wrote: >>>>>>>> >>>>>>>>> Following your comments, >>>>>>>>> >>>>>>>>> After matrix assembly end, >>>>>>>>> PetscCall(KSPGetPC(ksp,&pc)); >>>>>>>>> PetscCall(KSPSetFromOptions(ksp)); >>>>>>>>> PetscCall(KSPSetUp(ksp)); >>>>>>>>> PetscCall(PCFactorGetMatrix(pc,&xGK)); >>>>>>>>> >>>>>>>>> However there is another error as below. >>>>>>>>> [0]PETSC ERROR: Object is in wrong state >>>>>>>>> [0]PETSC ERROR: Not for factored matrix >>>>>>>>> >>>>>>>> >>>>>>>> The error message is telling you that you cannot alter values in >>>>>>>> the factored matrix. This is because >>>>>>>> the direct solvers use their own internal storage formats which we >>>>>>>> cannot alter, and you should probably >>>>>>>> not alter either. >>>>>>>> >>>>>>>> What are you trying to do? >>>>>>>> >>>>>>>> Thanks, >>>>>>>> >>>>>>>> Matt >>>>>>>> >>>>>>>> >>>>>>>>> [0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble >>>>>>>>> shooting. >>>>>>>>> [0]PETSC ERROR: Petsc Release Version 3.18.1, unknown >>>>>>>>> [0]PETSC ERROR: ./app on a arch-linux-c-debug named ubuntu by >>>>>>>>> ksi2443 Wed Nov 30 05:37:52 2022 >>>>>>>>> [0]PETSC ERROR: Configure options -download-mumps >>>>>>>>> -download-scalapack -download-parmetis -download-metis >>>>>>>>> [0]PETSC ERROR: #1 MatZeroEntries() at >>>>>>>>> /home/ksi2443/petsc/src/mat/interface/matrix.c:6024 >>>>>>>>> [0]PETSC ERROR: #2 main() at >>>>>>>>> /home/ksi2443/Downloads/coding/a1.c:339 >>>>>>>>> [0]PETSC ERROR: No PETSc Option Table entries >>>>>>>>> >>>>>>>>> How can I fix this? >>>>>>>>> >>>>>>>>> >>>>>>>>> Thanks, >>>>>>>>> Hyung Kim >>>>>>>>> >>>>>>>>> >>>>>>>>> 2022? 11? 30? (?) ?? 4:18, Jose E. Roman ?? >>>>>>>>> ??: >>>>>>>>> >>>>>>>>>> You have to call PCFactorGetMatrix() first. 
See any of the >>>>>>>>>> examples that use MatMumpsSetIcntl(), for instance >>>>>>>>>> https://petsc.org/release/src/ksp/ksp/tutorials/ex52.c.html >>>>>>>>>> >>>>>>>>>> Jose >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> > El 30 nov 2022, a las 6:52, ??? escribi?: >>>>>>>>>> > >>>>>>>>>> > Hello, >>>>>>>>>> > >>>>>>>>>> > >>>>>>>>>> > I tried to adopt METIS option in MUMPS by using >>>>>>>>>> > ' PetscCall(MatMumpsSetIcntl(Mat, 7, 5));' >>>>>>>>>> > >>>>>>>>>> > However, there is an error as follows >>>>>>>>>> > >>>>>>>>>> > [0]PETSC ERROR: Object is in wrong state >>>>>>>>>> > [0]PETSC ERROR: Only for factored matrix >>>>>>>>>> > [0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble >>>>>>>>>> shooting. >>>>>>>>>> > [0]PETSC ERROR: Petsc Release Version 3.18.1, unknown >>>>>>>>>> > [0]PETSC ERROR: ./app on a arch-linux-c-debug named ubuntu by >>>>>>>>>> ksi2443 Tue Nov 29 21:12:41 2022 >>>>>>>>>> > [0]PETSC ERROR: Configure options -download-mumps >>>>>>>>>> -download-scalapack -download-parmetis -download-metis >>>>>>>>>> > [0]PETSC ERROR: #1 MatMumpsSetIcntl() at >>>>>>>>>> /home/ksi2443/petsc/src/mat/impls/aij/mpi/mumps/mumps.c:2478 >>>>>>>>>> > [0]PETSC ERROR: #2 main() at >>>>>>>>>> /home/ksi2443/Downloads/coding/a1.c:149 >>>>>>>>>> > [0]PETSC ERROR: No PETSc Option Table entries >>>>>>>>>> > >>>>>>>>>> > How can I fix this error? >>>>>>>>>> > >>>>>>>>>> > Thank you for your help. >>>>>>>>>> > >>>>>>>>>> > >>>>>>>>>> > Hyung Kim >>>>>>>>>> >>>>>>>>>> >>>>>>>> >>>>>>>> -- >>>>>>>> What most experimenters take for granted before they begin their >>>>>>>> experiments is infinitely more interesting than any results to which their >>>>>>>> experiments lead. >>>>>>>> -- Norbert Wiener >>>>>>>> >>>>>>>> https://www.cse.buffalo.edu/~knepley/ >>>>>>>> >>>>>>>> >>>>>>> >>>>>> >>>>>> -- >>>>>> What most experimenters take for granted before they begin their >>>>>> experiments is infinitely more interesting than any results to which their >>>>>> experiments lead. 
>>>>>> -- Norbert Wiener >>>>>> >>>>>> https://www.cse.buffalo.edu/~knepley/ >>>>>> >>>>>> >>>>> >>>> >>>> -- >>>> What most experimenters take for granted before they begin their >>>> experiments is infinitely more interesting than any results to which their >>>> experiments lead. >>>> -- Norbert Wiener >>>> >>>> https://www.cse.buffalo.edu/~knepley/ >>>> >>>> >>> >> >> -- >> What most experimenters take for granted before they begin their >> experiments is infinitely more interesting than any results to which their >> experiments lead. >> -- Norbert Wiener >> >> https://www.cse.buffalo.edu/~knepley/ >> >> > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From ksi2443 at gmail.com Wed Nov 30 09:01:30 2022 From: ksi2443 at gmail.com (=?UTF-8?B?6rmA7ISx7J21?=) Date: Thu, 1 Dec 2022 00:01:30 +0900 Subject: [petsc-users] About MatMumpsSetIcntl function In-Reply-To: References: <8AA60F74-2328-4948-9A62-77E94214293A@dsic.upv.es> Message-ID: Following your comments my options are as below. mpirun -np 3 ./app -mpi_linear_solver_server -mpi_linear_solver_server_view -pc_type mpi -ksp_type preonly -mpi_ksp_monitor -mpi_ksp_converged_reason -mpi_pc_type lu -pc_mpi_always_use_server -ksp_monitor_true_residual -ksp_converged_reason -ksp_view -mat_mumps_icntl_7 5 and the outputs are as below. Residual norms for mpi_ solve. 
0 KSP Residual norm 3.941360585078e-05 1 KSP Residual norm 3.325311951792e-20 Linear mpi_ solve converged due to CONVERGED_RTOL iterations 1 0 KSP none resid norm 2.000000000000e+00 true resid norm 5.241047207555e-16 ||r(i)||/||b|| 2.620523603778e-16 1 KSP none resid norm 5.241047207555e-16 true resid norm 5.241047207555e-16 ||r(i)||/||b|| 2.620523603778e-16 KSP Object: 1 MPI process type: preonly maximum iterations=10000, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000. left preconditioning using NONE norm type for convergence test PC Object: 1 MPI process type: mpi Size of MPI communicator used for MPI parallel KSP solve 1 Desired minimum number of nonzeros per rank for MPI parallel solve 10000 *** Use -mpi_ksp_view to see the MPI KSP parameters *** *** Use -mpi_linear_solver_server_view to statistics on all the solves *** linear system matrix = precond matrix: Mat Object: 1 MPI process type: seqaij rows=24, cols=24 total: nonzeros=576, allocated nonzeros=840 total number of mallocs used during MatSetValues calls=48 using I-node routines: found 5 nodes, limit used is 5 Solving Time : 0.003143sec Assemble Time : 0.000994sec TIME : 1.000000, TIME_STEP : 1.000000, ITER : 2, RESIDUAL : 5.889432e-09 TIME0 : 1.000000 MPI linear solver server statistics: Ranks KSPSolve()s Mats KSPs Avg. Size Avg. Its 1 1 1 1 24 1 WARNING! There are options you set that were not used! WARNING! could be spelling mistake, etc! There is one unused database option. It is: Option left: name:-mat_mumps_icntl_7 value: 5 2022? 11? 30? (?) ?? 11:54, Matthew Knepley ?? ??: > On Wed, Nov 30, 2022 at 9:31 AM ??? wrote: > >> After folloing the comment, ./app -pc_type lu -ksp_type preonly >> -ksp_monitor_true_residual -ksp_converged_reason -ksp_view >> -mat_mumps_icntl_7 5 >> > > Okay, you can see that it is using METIS: > > INFOG(7) (ordering option effectively used after analysis): 5 > > It looks like the server stuff was not seeing the option. 
Put it back in > and send the output. > > Thanks, > > Matt > > The outputs are as below. >> >> 0 KSP none resid norm 2.000000000000e+00 true resid norm >> 4.241815708566e-16 ||r(i)||/||b|| 2.120907854283e-16 >> 1 KSP none resid norm 4.241815708566e-16 true resid norm >> 4.241815708566e-16 ||r(i)||/||b|| 2.120907854283e-16 >> Linear solve converged due to CONVERGED_ITS iterations 1 >> KSP Object: 1 MPI process >> type: preonly >> maximum iterations=10000, initial guess is zero >> tolerances: relative=1e-05, absolute=1e-50, divergence=10000. >> left preconditioning >> using NONE norm type for convergence test >> PC Object: 1 MPI process >> type: lu >> out-of-place factorization >> tolerance for zero pivot 2.22045e-14 >> matrix ordering: external >> factor fill ratio given 0., needed 0. >> Factored matrix follows: >> Mat Object: 1 MPI process >> type: mumps >> rows=24, cols=24 >> package used to perform factorization: mumps >> total: nonzeros=576, allocated nonzeros=576 >> MUMPS run parameters: >> Use -ksp_view ::ascii_info_detail to display information >> for all processes >> RINFOG(1) (global estimated flops for the elimination after >> analysis): 8924. >> RINFOG(2) (global estimated flops for the assembly after >> factorization): 0. >> RINFOG(3) (global estimated flops for the elimination after >> factorization): 8924. 
>> (RINFOG(12) RINFOG(13))*2^INFOG(34) (determinant): >> (0.,0.)*(2^0) >> INFOG(3) (estimated real workspace for factors on all >> processors after analysis): 576 >> INFOG(4) (estimated integer workspace for factors on all >> processors after analysis): 68 >> INFOG(5) (estimated maximum front size in the complete >> tree): 24 >> INFOG(6) (number of nodes in the complete tree): 1 >> INFOG(7) (ordering option effectively used after analysis): >> 5 >> INFOG(8) (structural symmetry in percent of the permuted >> matrix after analysis): 100 >> INFOG(9) (total real/complex workspace to store the matrix >> factors after factorization): 576 >> INFOG(10) (total integer space store the matrix factors >> after factorization): 68 >> INFOG(11) (order of largest frontal matrix after >> factorization): 24 >> INFOG(12) (number of off-diagonal pivots): 0 >> INFOG(13) (number of delayed pivots after factorization): 0 >> INFOG(14) (number of memory compress after factorization): 0 >> INFOG(15) (number of steps of iterative refinement after >> solution): 0 >> INFOG(16) (estimated size (in MB) of all MUMPS internal >> data for factorization after analysis: value on the most memory consuming >> processor): 0 >> INFOG(17) (estimated size of all MUMPS internal data for >> factorization after analysis: sum over all processors): 0 >> INFOG(18) (size of all MUMPS internal data allocated during >> factorization: value on the most memory consuming processor): 0 >> INFOG(19) (size of all MUMPS internal data allocated during >> factorization: sum over all processors): 0 >> INFOG(20) (estimated number of entries in the factors): 576 >> INFOG(21) (size in MB of memory effectively used during >> factorization - value on the most memory consuming processor): 0 >> INFOG(22) (size in MB of memory effectively used during >> factorization - sum over all processors): 0 >> INFOG(23) (after analysis: value of ICNTL(6) effectively >> used): 0 >> INFOG(24) (after analysis: value of ICNTL(12) effectively >> 
used): 1 >> INFOG(25) (after factorization: number of pivots modified >> by static pivoting): 0 >> INFOG(28) (after factorization: number of null pivots >> encountered): 0 >> INFOG(29) (after factorization: effective number of entries >> in the factors (sum over all processors)): 576 >> INFOG(30, 31) (after solution: size in Mbytes of memory >> used during solution phase): 0, 0 >> INFOG(32) (after analysis: type of analysis done): 1 >> INFOG(33) (value used for ICNTL(8)): 7 >> INFOG(34) (exponent of the determinant if determinant is >> requested): 0 >> INFOG(35) (after factorization: number of entries taking >> into account BLR factor compression - sum over all processors): 576 >> INFOG(36) (after analysis: estimated size of all MUMPS >> internal data for running BLR in-core - value on the most memory consuming >> processor): 0 >> INFOG(37) (after analysis: estimated size of all MUMPS >> internal data for running BLR in-core - sum over all processors): 0 >> INFOG(38) (after analysis: estimated size of all MUMPS >> internal data for running BLR out-of-core - value on the most memory >> consuming processor): 0 >> INFOG(39) (after analysis: estimated size of all MUMPS >> internal data for running BLR out-of-core - sum over all processors): 0 >> linear system matrix = precond matrix: >> Mat Object: 1 MPI process >> type: seqaij >> rows=24, cols=24 >> total: nonzeros=576, allocated nonzeros=840 >> total number of mallocs used during MatSetValues calls=48 >> using I-node routines: found 5 nodes, limit used is 5 >> >> >> >> 2022? 11? 30? (?) ?? 11:26, Matthew Knepley ?? ??: >> >>> On Wed, Nov 30, 2022 at 9:20 AM ??? wrote: >>> >>>> In my code there are below. 
>>>> PetscCall(KSPCreate(PETSC_COMM_WORLD, &ksp)); >>>> PetscCall(KSPSetOperators(ksp, xGK, xGK)); >>>> PetscCall(KSPGetPC(ksp, &pc)); >>>> PetscCall(PCSetType(pc, PCLU)); >>>> PetscCall(PCFactorSetMatSolverType(pc, MATSOLVERMUMPS)); >>>> PetscCall(KSPSetFromOptions(ksp)); >>>> >>>> and my runtime options are as below. >>>> mpirun -np 3 ./app -mpi_linear_solver_server >>>> -mpi_linear_solver_server_view -pc_type mpi -ksp_type preonly >>>> -mpi_ksp_monitor -mpi_ksp_converged_reason -mpi_pc_type lu >>>> -pc_mpi_always_use_server -mat_mumps_icntl_7 5 >>>> >>> >>> 1) Get rid of all the server stuff until we see what is wrong with your >>> code >>> >>> 2) Always run in serial until it works >>> >>> ./app -pc_type lu -ksp_type preonly -ksp_monitor_true_residual >>> -ksp_converged_reason -ksp_view -mat_mumps_icntl_7 5 >>> >>> Send the output so we can see what the solver is. >>> >>> Thanks, >>> >>> Matt >>> >>> On Wed, Nov 30, 2022 at 11:16 PM, Matthew Knepley wrote: >>>> >>>>> On Wed, Nov 30, 2022 at 9:10 AM Hyung Kim wrote: >>>>> >>>>>> When I set the icntl via an option, the output is as below. >>>>>> >>>>>> WARNING! There are options you set that were not used! >>>>>> WARNING! could be spelling mistake, etc! >>>>>> There is one unused database option. It is: >>>>>> Option left: name:-mat_mumps_icntl_7 value: 5 >>>>>> >>>>>> Does it work? >>>>>> >>>>> >>>>> Are you calling KSPSetFromOptions() after the PC is created? >>>>> >>>>> -pc_type lu -pc_factor_mat_solver_type mumps -mat_mumps_icntl_7 3 >>>>> >>>>> Thanks, >>>>> >>>>> Matt >>>>> >>>>> >>>>>> Thanks, >>>>>> Hyung Kim >>>>>> >>>>>> On Wed, Nov 30, 2022 at 11:04 PM, Matthew Knepley wrote: >>>>>> >>>>>>> On Wed, Nov 30, 2022 at 8:58 AM Hyung Kim wrote: >>>>>>> >>>>>>>> I'm working on FEM. >>>>>>>> When I used MUMPS alone, I found it efficient to use MUMPS with >>>>>>>> METIS. >>>>>>>> So my purpose is to use the MUMPS solver with METIS. 
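[The fix that eventually emerges in this thread can be condensed into one sketch. Assuming a PETSc 3.18 build configured with --download-mumps, the ICNTL can be set in code by asking the PC for its factor matrix after the solver type is set up, following the pattern of ksp/ksp/tutorials/ex52.c that Jose points to below. The function name is the editor's; xGK, b, and x stand for the assembled matrix and vectors from the snippet above.]

```c
/* Sketch only: assumes a PETSc build with MUMPS enabled; the function
   name and arguments are illustrative, taken from this thread's code. */
#include <petscksp.h>

PetscErrorCode SolveWithMetisOrdering(Mat xGK, Vec b, Vec x)
{
  KSP ksp;
  PC  pc;
  Mat F; /* MUMPS factor matrix, distinct from the assembled matrix xGK */

  PetscFunctionBeginUser;
  PetscCall(KSPCreate(PETSC_COMM_WORLD, &ksp));
  PetscCall(KSPSetOperators(ksp, xGK, xGK));
  PetscCall(KSPSetType(ksp, KSPPREONLY));
  PetscCall(KSPGetPC(ksp, &pc));
  PetscCall(PCSetType(pc, PCLU));
  PetscCall(PCFactorSetMatSolverType(pc, MATSOLVERMUMPS));
  /* Create the factor matrix now, so it can accept ICNTL settings
     before the numerical factorization happens inside KSPSolve(). */
  PetscCall(PCFactorSetUpMatSolverType(pc));
  PetscCall(PCFactorGetMatrix(pc, &F));
  PetscCall(MatMumpsSetIcntl(F, 7, 5)); /* ICNTL(7) = 5: METIS ordering */
  PetscCall(KSPSetFromOptions(ksp));    /* command-line options can still override */
  PetscCall(KSPSolve(ksp, b, x));
  PetscCall(KSPDestroy(&ksp));
  PetscFunctionReturn(0);
}
```

[Calling MatMumpsSetIcntl() on the assembled matrix xGK instead of on F is what produces the "Only for factored matrix" error at the start of this thread.]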
>>>>>>>> >>>>>>>> I tried to set metis (by icntl_7 : 5) after global matrix assembly >>>>>>>> and just before kspsolve. >>>>>>>> However there is error because of 'pcfactorgetmatrix' and >>>>>>>> 'matmumpsseticntl'. >>>>>>>> >>>>>>>> How can I fix this? >>>>>>>> >>>>>>> >>>>>>> Give the Icntrl as an option. >>>>>>> >>>>>>> Thanks, >>>>>>> >>>>>>> Matt >>>>>>> >>>>>>> >>>>>>>> Thanks, >>>>>>>> Hyung Kim >>>>>>>> >>>>>>>> 2022? 11? 30? (?) ?? 10:44, Matthew Knepley ?? >>>>>>>> ??: >>>>>>>> >>>>>>>>> On Wed, Nov 30, 2022 at 8:40 AM ??? wrote: >>>>>>>>> >>>>>>>>>> Following your comments, >>>>>>>>>> >>>>>>>>>> After matrix assembly end, >>>>>>>>>> PetscCall(KSPGetPC(ksp,&pc)); >>>>>>>>>> PetscCall(KSPSetFromOptions(ksp)); >>>>>>>>>> PetscCall(KSPSetUp(ksp)); >>>>>>>>>> PetscCall(PCFactorGetMatrix(pc,&xGK)); >>>>>>>>>> >>>>>>>>>> However there is another error as below. >>>>>>>>>> [0]PETSC ERROR: Object is in wrong state >>>>>>>>>> [0]PETSC ERROR: Not for factored matrix >>>>>>>>>> >>>>>>>>> >>>>>>>>> The error message is telling you that you cannot alter values in >>>>>>>>> the factored matrix. This is because >>>>>>>>> the direct solvers use their own internal storage formats which we >>>>>>>>> cannot alter, and you should probably >>>>>>>>> not alter either. >>>>>>>>> >>>>>>>>> What are you trying to do? >>>>>>>>> >>>>>>>>> Thanks, >>>>>>>>> >>>>>>>>> Matt >>>>>>>>> >>>>>>>>> >>>>>>>>>> [0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble >>>>>>>>>> shooting. 
>>>>>>>>>> [0]PETSC ERROR: Petsc Release Version 3.18.1, unknown >>>>>>>>>> [0]PETSC ERROR: ./app on a arch-linux-c-debug named ubuntu by >>>>>>>>>> ksi2443 Wed Nov 30 05:37:52 2022 >>>>>>>>>> [0]PETSC ERROR: Configure options -download-mumps >>>>>>>>>> -download-scalapack -download-parmetis -download-metis >>>>>>>>>> [0]PETSC ERROR: #1 MatZeroEntries() at >>>>>>>>>> /home/ksi2443/petsc/src/mat/interface/matrix.c:6024 >>>>>>>>>> [0]PETSC ERROR: #2 main() at >>>>>>>>>> /home/ksi2443/Downloads/coding/a1.c:339 >>>>>>>>>> [0]PETSC ERROR: No PETSc Option Table entries >>>>>>>>>> >>>>>>>>>> How can I fix this? >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> Thanks, >>>>>>>>>> Hyung Kim >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> 2022? 11? 30? (?) ?? 4:18, Jose E. Roman ?? >>>>>>>>>> ??: >>>>>>>>>> >>>>>>>>>>> You have to call PCFactorGetMatrix() first. See any of the >>>>>>>>>>> examples that use MatMumpsSetIcntl(), for instance >>>>>>>>>>> https://petsc.org/release/src/ksp/ksp/tutorials/ex52.c.html >>>>>>>>>>> >>>>>>>>>>> Jose >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> > El 30 nov 2022, a las 6:52, ??? escribi?: >>>>>>>>>>> > >>>>>>>>>>> > Hello, >>>>>>>>>>> > >>>>>>>>>>> > >>>>>>>>>>> > I tried to adopt METIS option in MUMPS by using >>>>>>>>>>> > ' PetscCall(MatMumpsSetIcntl(Mat, 7, 5));' >>>>>>>>>>> > >>>>>>>>>>> > However, there is an error as follows >>>>>>>>>>> > >>>>>>>>>>> > [0]PETSC ERROR: Object is in wrong state >>>>>>>>>>> > [0]PETSC ERROR: Only for factored matrix >>>>>>>>>>> > [0]PETSC ERROR: See https://petsc.org/release/faq/ for >>>>>>>>>>> trouble shooting. 
>>>>>>>>>>> > [0]PETSC ERROR: Petsc Release Version 3.18.1, unknown >>>>>>>>>>> > [0]PETSC ERROR: ./app on a arch-linux-c-debug named ubuntu by >>>>>>>>>>> ksi2443 Tue Nov 29 21:12:41 2022 >>>>>>>>>>> > [0]PETSC ERROR: Configure options -download-mumps >>>>>>>>>>> -download-scalapack -download-parmetis -download-metis >>>>>>>>>>> > [0]PETSC ERROR: #1 MatMumpsSetIcntl() at >>>>>>>>>>> /home/ksi2443/petsc/src/mat/impls/aij/mpi/mumps/mumps.c:2478 >>>>>>>>>>> > [0]PETSC ERROR: #2 main() at >>>>>>>>>>> /home/ksi2443/Downloads/coding/a1.c:149 >>>>>>>>>>> > [0]PETSC ERROR: No PETSc Option Table entries >>>>>>>>>>> > >>>>>>>>>>> > How can I fix this error? >>>>>>>>>>> > >>>>>>>>>>> > Thank you for your help. >>>>>>>>>>> > >>>>>>>>>>> > >>>>>>>>>>> > Hyung Kim >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>> >>>>>>>>> -- >>>>>>>>> What most experimenters take for granted before they begin their >>>>>>>>> experiments is infinitely more interesting than any results to which their >>>>>>>>> experiments lead. >>>>>>>>> -- Norbert Wiener >>>>>>>>> >>>>>>>>> https://www.cse.buffalo.edu/~knepley/ >>>>>>>>> >>>>>>>>> >>>>>>>> >>>>>>> >>>>>>> -- >>>>>>> What most experimenters take for granted before they begin their >>>>>>> experiments is infinitely more interesting than any results to which their >>>>>>> experiments lead. >>>>>>> -- Norbert Wiener >>>>>>> >>>>>>> https://www.cse.buffalo.edu/~knepley/ >>>>>>> >>>>>>> >>>>>> >>>>> >>>>> -- >>>>> What most experimenters take for granted before they begin their >>>>> experiments is infinitely more interesting than any results to which their >>>>> experiments lead. >>>>> -- Norbert Wiener >>>>> >>>>> https://www.cse.buffalo.edu/~knepley/ >>>>> >>>>> >>>> >>> >>> -- >>> What most experimenters take for granted before they begin their >>> experiments is infinitely more interesting than any results to which their >>> experiments lead. 
>>> -- Norbert Wiener >>> >>> https://www.cse.buffalo.edu/~knepley/ >>> >>> >> > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > > https://www.cse.buffalo.edu/~knepley/ > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From pierre at joliv.et Wed Nov 30 09:02:50 2022 From: pierre at joliv.et (Pierre Jolivet) Date: Wed, 30 Nov 2022 16:02:50 +0100 Subject: [petsc-users] About MatMumpsSetIcntl function In-Reply-To: References: <8AA60F74-2328-4948-9A62-77E94214293A@dsic.upv.es> Message-ID: > On 30 Nov 2022, at 3:54 PM, Matthew Knepley wrote: > > On Wed, Nov 30, 2022 at 9:31 AM ??? > wrote: >> After folloing the comment, ./app -pc_type lu -ksp_type preonly -ksp_monitor_true_residual -ksp_converged_reason -ksp_view -mat_mumps_icntl_7 5 > > Okay, you can see that it is using METIS: > > INFOG(7) (ordering option effectively used after analysis): 5 > > It looks like the server stuff was not seeing the option. Put it back in and send the output. With a small twist, the option should now read -mpi_mat_mumps_icntl_7 5, cf. https://petsc.org/release/src/ksp/pc/impls/mpi/pcmpi.c.html#line126 Thanks, Pierre > Thanks, > > Matt > >> The outputs are as below. >> >> 0 KSP none resid norm 2.000000000000e+00 true resid norm 4.241815708566e-16 ||r(i)||/||b|| 2.120907854283e-16 >> 1 KSP none resid norm 4.241815708566e-16 true resid norm 4.241815708566e-16 ||r(i)||/||b|| 2.120907854283e-16 >> Linear solve converged due to CONVERGED_ITS iterations 1 >> KSP Object: 1 MPI process >> type: preonly >> maximum iterations=10000, initial guess is zero >> tolerances: relative=1e-05, absolute=1e-50, divergence=10000. 
>> [snip -- earlier messages in this thread quoted in full] 
> -- Norbert Wiener > > https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Wed Nov 30 09:12:05 2022 From: knepley at gmail.com (Matthew Knepley) Date: Wed, 30 Nov 2022 10:12:05 -0500 Subject: [petsc-users] About MatMumpsSetIcntl function In-Reply-To: References: <8AA60F74-2328-4948-9A62-77E94214293A@dsic.upv.es> Message-ID: On Wed, Nov 30, 2022 at 10:03 AM Pierre Jolivet wrote: > > > On 30 Nov 2022, at 3:54 PM, Matthew Knepley wrote: > > On Wed, Nov 30, 2022 at 9:31 AM ??? wrote: > >> After folloing the comment, ./app -pc_type lu -ksp_type preonly >> -ksp_monitor_true_residual -ksp_converged_reason -ksp_view >> -mat_mumps_icntl_7 5 >> > > Okay, you can see that it is using METIS: > > INFOG(7) (ordering option effectively used after analysis): 5 > > It looks like the server stuff was not seeing the option. Put it back in > and send the output. > > > With a small twist, the option should now read -mpi_mat_mumps_icntl_7 5, > cf. https://petsc.org/release/src/ksp/pc/impls/mpi/pcmpi.c.html#line126 > And we need -mpi_ksp_view to see the solver. Thanks, Matt > Thanks, > Pierre > > Thanks, > > Matt > > The outputs are as below. >> >> 0 KSP none resid norm 2.000000000000e+00 true resid norm >> 4.241815708566e-16 ||r(i)||/||b|| 2.120907854283e-16 >> 1 KSP none resid norm 4.241815708566e-16 true resid norm >> 4.241815708566e-16 ||r(i)||/||b|| 2.120907854283e-16 >> Linear solve converged due to CONVERGED_ITS iterations 1 >> KSP Object: 1 MPI process >> type: preonly >> maximum iterations=10000, initial guess is zero >> tolerances: relative=1e-05, absolute=1e-50, divergence=10000. >> left preconditioning >> using NONE norm type for convergence test >> PC Object: 1 MPI process >> type: lu >> out-of-place factorization >> tolerance for zero pivot 2.22045e-14 >> matrix ordering: external >> factor fill ratio given 0., needed 0. 
>> Factored matrix follows: >> Mat Object: 1 MPI process >> type: mumps >> rows=24, cols=24 >> package used to perform factorization: mumps >> total: nonzeros=576, allocated nonzeros=576 >> MUMPS run parameters: >> Use -ksp_view ::ascii_info_detail to display information >> for all processes >> RINFOG(1) (global estimated flops for the elimination >> after analysis): 8924. >> RINFOG(2) (global estimated flops for the assembly after >> factorization): 0. >> RINFOG(3) (global estimated flops for the elimination >> after factorization): 8924. >> (RINFOG(12) RINFOG(13))*2^INFOG(34) (determinant): >> (0.,0.)*(2^0) >> INFOG(3) (estimated real workspace for factors on all >> processors after analysis): 576 >> INFOG(4) (estimated integer workspace for factors on all >> processors after analysis): 68 >> INFOG(5) (estimated maximum front size in the complete >> tree): 24 >> INFOG(6) (number of nodes in the complete tree): 1 >> INFOG(7) (ordering option effectively used after >> analysis): 5 >> INFOG(8) (structural symmetry in percent of the permuted >> matrix after analysis): 100 >> INFOG(9) (total real/complex workspace to store the matrix >> factors after factorization): 576 >> INFOG(10) (total integer space store the matrix factors >> after factorization): 68 >> INFOG(11) (order of largest frontal matrix after >> factorization): 24 >> INFOG(12) (number of off-diagonal pivots): 0 >> INFOG(13) (number of delayed pivots after factorization): 0 >> INFOG(14) (number of memory compress after factorization): >> 0 >> INFOG(15) (number of steps of iterative refinement after >> solution): 0 >> INFOG(16) (estimated size (in MB) of all MUMPS internal >> data for factorization after analysis: value on the most memory consuming >> processor): 0 >> INFOG(17) (estimated size of all MUMPS internal data for >> factorization after analysis: sum over all processors): 0 >> INFOG(18) (size of all MUMPS internal data allocated >> during factorization: value on the most memory consuming 
processor): 0 >> INFOG(19) (size of all MUMPS internal data allocated >> during factorization: sum over all processors): 0 >> INFOG(20) (estimated number of entries in the factors): 576 >> INFOG(21) (size in MB of memory effectively used during >> factorization - value on the most memory consuming processor): 0 >> INFOG(22) (size in MB of memory effectively used during >> factorization - sum over all processors): 0 >> INFOG(23) (after analysis: value of ICNTL(6) effectively >> used): 0 >> INFOG(24) (after analysis: value of ICNTL(12) effectively >> used): 1 >> INFOG(25) (after factorization: number of pivots modified >> by static pivoting): 0 >> INFOG(28) (after factorization: number of null pivots >> encountered): 0 >> INFOG(29) (after factorization: effective number of >> entries in the factors (sum over all processors)): 576 >> INFOG(30, 31) (after solution: size in Mbytes of memory >> used during solution phase): 0, 0 >> INFOG(32) (after analysis: type of analysis done): 1 >> INFOG(33) (value used for ICNTL(8)): 7 >> INFOG(34) (exponent of the determinant if determinant is >> requested): 0 >> INFOG(35) (after factorization: number of entries taking >> into account BLR factor compression - sum over all processors): 576 >> INFOG(36) (after analysis: estimated size of all MUMPS >> internal data for running BLR in-core - value on the most memory consuming >> processor): 0 >> INFOG(37) (after analysis: estimated size of all MUMPS >> internal data for running BLR in-core - sum over all processors): 0 >> INFOG(38) (after analysis: estimated size of all MUMPS >> internal data for running BLR out-of-core - value on the most memory >> consuming processor): 0 >> INFOG(39) (after analysis: estimated size of all MUMPS >> internal data for running BLR out-of-core - sum over all processors): 0 >> linear system matrix = precond matrix: >> Mat Object: 1 MPI process >> type: seqaij >> rows=24, cols=24 >> total: nonzeros=576, allocated nonzeros=840 >> total number of mallocs used 
during MatSetValues calls=48 >> using I-node routines: found 5 nodes, limit used is 5 >> >> >> >> 2022? 11? 30? (?) ?? 11:26, Matthew Knepley ?? ??: >> >>> On Wed, Nov 30, 2022 at 9:20 AM ??? wrote: >>> >>>> In my code there are below. >>>> PetscCall(KSPCreate(PETSC_COMM_WORLD, &ksp)); >>>> PetscCall(KSPSetOperators(ksp, xGK, xGK)); >>>> PetscCall(KSPGetPC(ksp, &pc)); >>>> PetscCall(PCSetType(pc, PCLU)); >>>> PetscCall(PCFactorSetMatSolverType(pc, MATSOLVERMUMPS)); >>>> PetscCall(KSPSetFromOptions(ksp)); >>>> >>>> and my runtime options are as below. >>>> mpirun -np 3 ./app -mpi_linear_solver_server >>>> -mpi_linear_solver_server_view -pc_type mpi -ksp_type preonly >>>> -mpi_ksp_monitor -mpi_ksp_converged_reason -mpi_pc_type lu >>>> -pc_mpi_always_use_server -mat_mumps_icntl_7 5 >>>> >>> >>> 1) Get rid of the all server stuff until we see what is wrong with your >>> code >>> >>> 2) Always run in serial until it works >>> >>> ./app -pc_type lu -ksp_type preonly -ksp_monitor_true_residual >>> -ksp_converged_reason -ksp_view -mat_mumps_icntl_7 5 >>> >>> Send the output so we can see what the solver is. >>> >>> Thanks, >>> >>> Matt >>> >>> 2022? 11? 30? (?) ?? 11:16, Matthew Knepley ?? ??: >>>> >>>>> On Wed, Nov 30, 2022 at 9:10 AM ??? wrote: >>>>> >>>>>> When I adopt icntl by using option, the outputs are as below. >>>>>> >>>>>> WARNING! There are options you set that were not used! >>>>>> WARNING! could be spelling mistake, etc! >>>>>> There is one unused database option. It is: >>>>>> Option left: name:-mat_mumps_icntl_7 value: 5 >>>>>> >>>>>> Is it work?? >>>>>> >>>>> >>>>> Are you calling KSPSetFromOptions() after the PC is created? >>>>> >>>>> -pc_type lu -pc_factor_mat_solver_type mumps -mat_mumps_icntl_7 3 >>>>> >>>>> Thanks, >>>>> >>>>> Matt >>>>> >>>>> >>>>>> Thanks, >>>>>> Hyung Kim >>>>>> >>>>>> 2022? 11? 30? (?) ?? 11:04, Matthew Knepley ?? ??: >>>>>> >>>>>>> On Wed, Nov 30, 2022 at 8:58 AM ??? wrote: >>>>>>> >>>>>>>> I'm working on FEM. 
>>>>>>>> When I used MUMPS alone, I found it efficient to use MUMPS with METIS.
>>>>>>>> So my purpose is to use the MUMPS solver with METIS.
>>>>>>>>
>>>>>>>> I tried to set METIS (via ICNTL(7) = 5) after global matrix assembly
>>>>>>>> and just before KSPSolve.
>>>>>>>> However, there is an error involving 'PCFactorGetMatrix' and
>>>>>>>> 'MatMumpsSetIcntl'.
>>>>>>>>
>>>>>>>> How can I fix this?
>>>>>>>
>>>>>>> Give the ICNTL as an option.
>>>>>>>
>>>>>>> Thanks,
>>>>>>>
>>>>>>> Matt
>>>>>>>
>>>>>>>> Thanks,
>>>>>>>> Hyung Kim
>>>>>>>>
>>>>>>>> On Wed, Nov 30, 2022 at 10:44 PM, Matthew Knepley wrote:
>>>>>>>>
>>>>>>>>> On Wed, Nov 30, 2022 at 8:40 AM Hyung Kim wrote:
>>>>>>>>>
>>>>>>>>>> Following your comments,
>>>>>>>>>>
>>>>>>>>>> after matrix assembly ends:
>>>>>>>>>> PetscCall(KSPGetPC(ksp, &pc));
>>>>>>>>>> PetscCall(KSPSetFromOptions(ksp));
>>>>>>>>>> PetscCall(KSPSetUp(ksp));
>>>>>>>>>> PetscCall(PCFactorGetMatrix(pc, &xGK));
>>>>>>>>>>
>>>>>>>>>> However, there is another error, as below:
>>>>>>>>>> [0]PETSC ERROR: Object is in wrong state
>>>>>>>>>> [0]PETSC ERROR: Not for factored matrix
>>>>>>>>>
>>>>>>>>> The error message is telling you that you cannot alter values in
>>>>>>>>> the factored matrix. This is because the direct solvers use their
>>>>>>>>> own internal storage formats, which we cannot alter and which you
>>>>>>>>> should probably not alter either.
>>>>>>>>>
>>>>>>>>> What are you trying to do?
>>>>>>>>>
>>>>>>>>> Thanks,
>>>>>>>>>
>>>>>>>>> Matt
>>>>>>>>>
>>>>>>>>>> [0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
>>>>>>>>>> [0]PETSC ERROR: Petsc Release Version 3.18.1, unknown
>>>>>>>>>> [0]PETSC ERROR: ./app on a arch-linux-c-debug named ubuntu by ksi2443 Wed Nov 30 05:37:52 2022
>>>>>>>>>> [0]PETSC ERROR: Configure options -download-mumps -download-scalapack -download-parmetis -download-metis
>>>>>>>>>> [0]PETSC ERROR: #1 MatZeroEntries() at /home/ksi2443/petsc/src/mat/interface/matrix.c:6024
>>>>>>>>>> [0]PETSC ERROR: #2 main() at /home/ksi2443/Downloads/coding/a1.c:339
>>>>>>>>>> [0]PETSC ERROR: No PETSc Option Table entries
>>>>>>>>>>
>>>>>>>>>> How can I fix this?
>>>>>>>>>>
>>>>>>>>>> Thanks,
>>>>>>>>>> Hyung Kim
>>>>>>>>>>
>>>>>>>>>> On Wed, Nov 30, 2022 at 4:18 PM, Jose E. Roman wrote:
>>>>>>>>>>
>>>>>>>>>>> You have to call PCFactorGetMatrix() first. See any of the
>>>>>>>>>>> examples that use MatMumpsSetIcntl(), for instance
>>>>>>>>>>> https://petsc.org/release/src/ksp/ksp/tutorials/ex52.c.html
>>>>>>>>>>>
>>>>>>>>>>> Jose
>>>>>>>>>>>
>>>>>>>>>>> > On 30 Nov 2022, at 6:52, Hyung Kim wrote:
>>>>>>>>>>> >
>>>>>>>>>>> > Hello,
>>>>>>>>>>> >
>>>>>>>>>>> > I tried to enable the METIS option in MUMPS by using
>>>>>>>>>>> > 'PetscCall(MatMumpsSetIcntl(Mat, 7, 5));'
>>>>>>>>>>> >
>>>>>>>>>>> > However, there is an error as follows:
>>>>>>>>>>> >
>>>>>>>>>>> > [0]PETSC ERROR: Object is in wrong state
>>>>>>>>>>> > [0]PETSC ERROR: Only for factored matrix
>>>>>>>>>>> > [0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
>>>>>>>>>>> > [0]PETSC ERROR: Petsc Release Version 3.18.1, unknown
>>>>>>>>>>> > [0]PETSC ERROR: ./app on a arch-linux-c-debug named ubuntu by ksi2443 Tue Nov 29 21:12:41 2022
>>>>>>>>>>> > [0]PETSC ERROR: Configure options -download-mumps -download-scalapack -download-parmetis -download-metis
>>>>>>>>>>> > [0]PETSC ERROR: #1 MatMumpsSetIcntl() at /home/ksi2443/petsc/src/mat/impls/aij/mpi/mumps/mumps.c:2478
>>>>>>>>>>> > [0]PETSC ERROR: #2 main() at /home/ksi2443/Downloads/coding/a1.c:149
>>>>>>>>>>> > [0]PETSC ERROR: No PETSc Option Table entries
>>>>>>>>>>> >
>>>>>>>>>>> > How can I fix this error?
>>>>>>>>>>> >
>>>>>>>>>>> > Thank you for your help.
>>>>>>>>>>> >
>>>>>>>>>>> > Hyung Kim
>>>>>>>>>
>>>>>>>>> --
>>>>>>>>> What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
>>>>>>>>> -- Norbert Wiener
>>>>>>>>>
>>>>>>>>> https://www.cse.buffalo.edu/~knepley/
>>>
>>> --
>>> What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
>>> -- Norbert Wiener
>>>
>>> https://www.cse.buffalo.edu/~knepley/

From bsmith at petsc.dev Wed Nov 30 10:46:23 2022
From: bsmith at petsc.dev (Barry Smith)
Date: Wed, 30 Nov 2022 11:46:23 -0500
Subject: [petsc-users] About MatMumpsSetIcntl function
In-Reply-To: References: Message-ID:

Note you can use -help to have the running code print all possible options it can currently handle. This produces a lot of output, so generally one can do things like

   ./code various options -help | grep mumps

to see what the exact option is named for mumps in your situation.

Also, if you had run with -mpi_ksp_view as suggested in your previous output you sent (*** Use -mpi_ksp_view to see the MPI KSP parameters ***), it would show the options prefix for the mumps options (which in this case is -mpi_), so you would know the option should be given as -mpi_mat_mumps_icntl_7 5.

So, in summary, since nested solvers can have complicated nesting and option names, one can determine the exact names without guessing by either 1) running with -help and searching for the appropriate full option name or 2) running with the appropriate view command and locating the options prefix for the particular solve you want to control. We do this all the time; even though we wrote the code, we cannot fully remember the options or figure out in our heads the full naming of nested solver options, so we use 1) and 2) to determine them.

  Barry

> On Nov 30, 2022, at 12:52 AM, Hyung Kim
wrote:
>
> Hello,
>
> I tried to enable the METIS option in MUMPS by using
> 'PetscCall(MatMumpsSetIcntl(Mat, 7, 5));'
>
> However, there is an error as follows:
>
> [0]PETSC ERROR: Object is in wrong state
> [0]PETSC ERROR: Only for factored matrix
> [0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
> [0]PETSC ERROR: Petsc Release Version 3.18.1, unknown
> [0]PETSC ERROR: ./app on a arch-linux-c-debug named ubuntu by ksi2443 Tue Nov 29 21:12:41 2022
> [0]PETSC ERROR: Configure options -download-mumps -download-scalapack -download-parmetis -download-metis
> [0]PETSC ERROR: #1 MatMumpsSetIcntl() at /home/ksi2443/petsc/src/mat/impls/aij/mpi/mumps/mumps.c:2478
> [0]PETSC ERROR: #2 main() at /home/ksi2443/Downloads/coding/a1.c:149
> [0]PETSC ERROR: No PETSc Option Table entries
>
> How can I fix this error?
>
> Thank you for your help.
>
> Hyung Kim
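Jose's advice (call PCFactorGetMatrix() first, following the pattern of tutorial ex52.c) can be collected into a minimal sketch. This is not code from the thread: the small tridiagonal test matrix stands in for the user's assembled global matrix xGK, and it assumes a PETSc build configured with --download-mumps --download-metis (and scalapack/parmetis for parallel runs).

```c
#include <petscksp.h>

int main(int argc, char **argv)
{
  Mat         A, F; /* system matrix and the MUMPS factored matrix */
  Vec         x, b;
  KSP         ksp;
  PC          pc;
  PetscInt    i, n = 10, col[3];
  PetscScalar v[3];

  PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));

  /* Small tridiagonal stand-in for the assembled global matrix xGK */
  PetscCall(MatCreate(PETSC_COMM_WORLD, &A));
  PetscCall(MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, n, n));
  PetscCall(MatSetFromOptions(A));
  PetscCall(MatSetUp(A));
  for (i = 0; i < n; i++) {
    PetscInt ncols = 0;
    if (i > 0)     { col[ncols] = i - 1; v[ncols++] = -1.0; }
    col[ncols] = i; v[ncols++] = 2.0;
    if (i < n - 1) { col[ncols] = i + 1; v[ncols++] = -1.0; }
    PetscCall(MatSetValues(A, 1, &i, ncols, col, v, INSERT_VALUES));
  }
  PetscCall(MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY));
  PetscCall(MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY));

  PetscCall(KSPCreate(PETSC_COMM_WORLD, &ksp));
  PetscCall(KSPSetOperators(ksp, A, A));
  PetscCall(KSPSetType(ksp, KSPPREONLY));
  PetscCall(KSPGetPC(ksp, &pc));
  PetscCall(PCSetType(pc, PCLU));
  PetscCall(PCFactorSetMatSolverType(pc, MATSOLVERMUMPS));
  /* Create the factored-matrix object BEFORE factorization so that
     ICNTLs set on it can still influence the MUMPS analysis phase.
     Use a separate handle F: do not overwrite the system matrix. */
  PetscCall(PCFactorSetUpMatSolverType(pc));
  PetscCall(PCFactorGetMatrix(pc, &F));
  PetscCall(MatMumpsSetIcntl(F, 7, 5)); /* ICNTL(7) = 5: METIS ordering */
  PetscCall(KSPSetFromOptions(ksp));

  PetscCall(MatCreateVecs(A, &x, &b));
  PetscCall(VecSet(b, 1.0));
  PetscCall(KSPSolve(ksp, b, x));

  PetscCall(VecDestroy(&x));
  PetscCall(VecDestroy(&b));
  PetscCall(MatDestroy(&A));
  PetscCall(KSPDestroy(&ksp));
  PetscCall(PetscFinalize());
  return 0;
}
```

Note the ordering: PCFactorSetUpMatSolverType() creates the MUMPS factored matrix F before the factorization runs; calling KSPSetUp() first, as in the thread, performs the factorization, after which changing ICNTL(7) no longer affects the ordering. Also note that PCFactorGetMatrix(pc, &xGK) in the thread overwrote the handle to the assembled matrix with the factored matrix, which is why the later MatZeroEntries() failed with "Not for factored matrix".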
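Barry's two discovery techniques can also be written out as concrete commands. This is a sketch based on the thread's setup: './app' is the user's own binary, and the '-mpi_' prefix applies only when the MPI linear-solver server is in use; exact option names can vary with the PETSc version, which is why the discovery steps come first.

```shell
# 1) Print every option the running code understands and filter for mumps;
#    this reveals the fully prefixed names, e.g. -mpi_mat_mumps_icntl_7
./app -mpi_linear_solver_server -help | grep mumps

# 2) View the nested (server-side) KSP to learn its options prefix
./app -mpi_linear_solver_server -mpi_ksp_view

# Then pass the ICNTL with the discovered prefix:
mpirun -np 3 ./app -mpi_linear_solver_server -mpi_pc_type lu \
    -mpi_mat_mumps_icntl_7 5
```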