[petsc-users] High Memory Usage with PETSc BDF Solver Compared to scikits.odes CVODE
Jed Brown
jed at jedbrown.org
Tue Jun 17 09:57:07 CDT 2025
You can send email to petsc-maint at mcs.anl.gov, which is kept confidential.
Art <mac3bar at gmail.com> writes:
> Hi Barry,
>
> Thank you for your assistance. Since this is an ongoing development
> project, I would prefer not to share the full code publicly at this stage
> (public petsc-users). I’d be glad to share it with you privately via email,
> if that's an option you’re open to.
>
> Thanks again for your support.
>
> Art
>
> On Mon, Jun 16, 2025 at 20:51, Barry Smith (<bsmith at petsc.dev>) wrote:
>
>>
>> Can you please post the entire code so we can run it ourselves to
>> reproduce the problem?
>>
>> I suspect the code may be using a dense matrix to represent the
>> Jacobian, which would require a huge amount of memory. SUNDIALS is likely
>> using Jacobian-free Newton-Krylov to solve the linear system. You can also
>> do this with PETSc, but it might not be the default here. With the code I
>> can quickly check what is happening.
>>
>> Barry
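As a quick way to see what is happening before sharing the code, here is a minimal petsc4py sketch (assuming the script from the message quoted below) that turns on standard PETSc viewing options; the nested TS/SNES/KSP view reports the Jacobian matrix type and size, and -memory_view/-log_view summarize where the memory goes.

    # Sketch only: set standard PETSc viewing options before ts.solve().
    # Equivalent command line: python3 script.py -ts_view -memory_view -log_view
    from petsc4py import PETSc

    opts = PETSc.Options()
    opts["ts_view"] = ""       # print the TS/SNES/KSP configuration after TSSolve
                               # (the nested KSP view shows the Jacobian Mat type and size)
    opts["memory_view"] = ""   # print a memory-usage summary at PetscFinalize
    opts["log_view"] = ""      # performance log, including memory held by Mat/Vec objects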
>>
>>
>> > On Jun 16, 2025, at 2:05 PM, Art <mac3bar at gmail.com> wrote:
>> >
>> > Hi everyone,
>> >
>> > I’m porting a code from scikits.odes.sundials.cvode to PETSc, since it
>> > is compatible with FEniCSx and can run in parallel with MPI. First, I used
>> > PETSc with the "rk" solver, and it worked well, both serially and in
>> > parallel, for a system with 14000 nodes (42000 DOFs). However, when using
>> > an implicit solver like "bdf", the solver takes up all the memory (16 GB),
>> > even on a small system. For this, I use the following:
>> >
>> >
>> > def ifunction(self, ts, t, y, ydot, f):
>> >     # update ghost values and copy the current state into the FEniCSx function
>> >     y.ghostUpdate(PETSc.InsertMode.INSERT_VALUES, PETSc.ScatterMode.FORWARD)
>> >     y.copy(result=self.yv.x.petsc_vec)
>> >     self.yv.x.scatter_forward()
>> >
>> >     # evaluate the right-hand side dy/dt = rhs(y)
>> >     dydt = self.rhs(self.yv)
>> >     dydt.x.scatter_forward()
>> >
>> >     # implicit residual F(t, y, ydot) = ydot - rhs(y)
>> >     ydot.copy(result=f)
>> >     f.axpy(-1.0, dydt.x.petsc_vec)
>> >
>> >     return 0
>> >
>> >
>> > y = y0.petsc_vec.copy()
>> > ts.setType(ts.Type.BDF)
>> > ts.setIFunction(ifunction)
>> > ts.setTime(0.0)
>> > ts.setTimeStep(1e-14)
>> > ts.setStepLimits(1e-17,1e-12)
>> > ts.setMaxTime(1.0e-12)
>> > ts.setExactFinalTime(PETSc.TS.ExactFinalTime.STEPOVER)
>> > ts.setTolerances(rtol=1e-6, atol=1e-6)
>> > snes = ts.getSNES()
>> > ksp = snes.getKSP()
>> > ksp.setType("gmres")
>> > ksp.getPC().setType("none")
>> > ksp.setFromOptions()
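Along the lines of Barry's suggestion, a minimal sketch of switching the SNES inside this TS to a matrix-free finite-difference Jacobian, so that no explicit Jacobian matrix is assembled (roughly what CVODE's spgmr configuration does). It assumes the ts, snes, and ksp objects from the snippet above; snes.setUseMF() and -snes_mf are standard petsc4py/PETSc interfaces, but whether this resolves the memory growth for this particular code is not verified here.

    # Sketch, assuming the ts/snes/ksp objects created above:
    snes = ts.getSNES()
    snes.setUseMF(True)            # Jacobian-free Newton-Krylov: J*v approximated by finite differences
    # or equivalently via the options database:
    #   PETSc.Options()["snes_mf"] = ""
    #   snes.setFromOptions()
    ksp = snes.getKSP()
    ksp.setType("gmres")
    ksp.getPC().setType("none")    # no preconditioner, matching the CVODE/spgmr run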
>> >
>> > For the scikits.odes.sundials.cvode library, in serial mode, I have
>> > used:
>> >
>> > solver = CVODE(rhs,
>> >                old_api=False,
>> >                linsolver='spgmr',
>> >                rtol=1e-6,
>> >                atol=1e-6,
>> >                max_steps=5000,
>> >                order=2)
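For comparison, a rough sketch of how these CVODE settings could map onto the PETSc TS from the earlier snippet. The option and method names (-ts_bdf_order, setMaxSteps, setTolerances) are standard PETSc/petsc4py; the mapping itself is an interpretation, not something taken from either code.

    # Sketch, assuming the ts object configured above:
    from petsc4py import PETSc

    opts = PETSc.Options()
    opts["ts_bdf_order"] = 2                 # CVODE order=2
    ts.setMaxSteps(5000)                     # CVODE max_steps=5000
    ts.setTolerances(rtol=1e-6, atol=1e-6)   # CVODE rtol/atol
    ts.setFromOptions()                      # pick up -ts_bdf_order
    # linsolver='spgmr' corresponds to matrix-free GMRES without a
    # preconditioner: -snes_mf -ksp_type gmres -pc_type none (see above).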
>> >
>> > In this case, the solver worked perfectly and gave results similar to
>> > the rk solver in PETSc. I suspect the issue might be related to the way
>> > the Jacobian is built in PETSc, since scikits.odes.sundials.cvode works
>> > without requiring the Jacobian at all. I would greatly appreciate any
>> > suggestions or examples on how to properly set up the BDF solver with PETSc.
>> > Thanks
>> > Art
>>
>>