February 2017 Archives by thread
Starting: Wed Feb 1 09:21:46 CST 2017
Ending: Tue Feb 28 19:45:18 CST 2017
Messages: 287
- [petsc-users] Best workflow for different systems with different block sizes
Paolo Lampitella
- [petsc-users] Usage of VecCreateMPIWithArray in fortran
Praveen C
- [petsc-users] ASM with matrix-free method
Sonia Pozzi
- [petsc-users] Segmentation violation issue when using MatSetValues in Fortran
Austin Herrema
- [petsc-users] projection methods in TS
Gideon Simpson
- [petsc-users] DMCreateGlobalVector in fortran.
Manuel Valera
- [petsc-users] Invitation to connect on LinkedIn
Soundes MARZOUGUI
- [petsc-users] Replacement for Euclid ?
Michel Kern
- [petsc-users] missing types in petscsysdef.h
Lukas van de Wiel
- [petsc-users] Where to restrict MPI Communicator
Florian Lindner
- [petsc-users] pc gamg did not converge in sinv where it used to
Denis Davydov
- [petsc-users] ksp solve error with nested matrix
Manav Bhatia
- [petsc-users] petscprint problem
Sharp Stone
- [petsc-users] configure PETSc on Cray
Sharp Stone
- [petsc-users] TSSetPostStep for projected methods
Gideon Simpson
- [petsc-users] VecScatter between Vectors with different parallel layouts
Barletta, Ivano
- [petsc-users] TS routine calling order
Gideon Simpson
- [petsc-users] TS question 1: how to stop explicit methods because they do not use SNES(VI)?
Ed Bueler
- [petsc-users] TS question 2: why not -ts_adapt_wnormtype 1 ?
Ed Bueler
- [petsc-users] lock matrices in EPS
Kong, Fande
- [petsc-users] Scientific Software Days Conference, April, 2017
Damon McDougall
- [petsc-users] PETSc set off-diagonal matrix pre-allocation
Andrew Ho
- [petsc-users] PC HYPRE BoomerAMG options for nodal
Bernardo Rocha
- [petsc-users] prefix (i.e. cumulative) sum
Gideon Simpson
- [petsc-users] Newbie question : iterative solver - algorithm and performance
lixin chu
- [petsc-users] multigrid questions
Matt Landreman
- [petsc-users] FLOPS vs FLOPS/sec from PETSc -log_summary
Ajit Desai
- [petsc-users] A way to distribute 3D arrays.
Manuel Valera
- [petsc-users] Newbie question : sequential and parallel storage, and memory estimation
lixin chu
- [petsc-users] usefulness of fieldsplit_type schur and fieldsplit_schur_type user
David Nolte
- [petsc-users] Krylov-Schur Tolerance
Christopher Pierce
- [petsc-users] PETSc user
Mahmud Hasan
- [petsc-users] PCSETUP_FAILED due to FACTOR_NUMERIC_ZEROPIVOT
lixin chu
- [petsc-users] Newbie question : sequential vs. parallel and matrix creation
lixin chu
- [petsc-users] GAMG huge hash being requested
Justin Chang
- [petsc-users] Newbie question : complex number with MatView ?
lixin chu
- [petsc-users] Non-hermitian KSP
Bikash Kanungo
- [petsc-users] understanding the LSC preconditioner
David Nolte
- [petsc-users] Multigrid with defect correction
Matt Landreman
- [petsc-users] implementing non-local scalar field
Swenson, Jennifer
- [petsc-users] make file issues
Gideon Simpson
- [petsc-users] implementing non-local scalar field -- updated
Jennifer Swenson
- [petsc-users] can I still use collectives like MPI_Bcast, MPI_Scatter on Petsc vectors?
Fangbo Wang
- [petsc-users] python3 patch
Gideon Simpson
- [petsc-users] python3 update
Gideon Simpson
- [petsc-users] consider an error message as a petscinfo??
Kong, Fande
- [petsc-users] ghost points, DMs, and vector operations
Gideon Simpson
- [petsc-users] low dimensional sub-problem
Gideon Simpson
- [petsc-users] Multi-domain meshes with DMPLEX
Santos Teixeira Frederico
- [petsc-users] Why my petsc program get stuck and hang there when assembling the matrix?
Fangbo Wang
- [petsc-users] Number of unknowns/proc for scaling tests
Casalegno Francesco
- [petsc-users] issue with ghost points
Gideon Simpson
- [petsc-users] default TS RK
Gideon Simpson
- [petsc-users] petsc4py and MPI.COMM_SELF.Spawn
Rodrigo Felicio
Last message date: Tue Feb 28 19:45:18 CST 2017
Archived on: Wed Mar 1 10:24:56 CST 2017
This archive was generated by Pipermail 0.09 (Mailman edition).