
Time stepping in IDA - minimise solver restarts #4312

Closed
martinjrobins opened this issue Aug 2, 2024 · 2 comments · Fixed by #4351
@martinjrobins (Contributor) commented Aug 2, 2024:

Proposal by @MarcBerliner (#3910 (comment)):

Time stepping

In PyBaMM, the solver currently stops at all values in the t_eval vector. It seems that we use t_eval and set some dt for a few distinct purposes:

  1. To enforce time-dependent discontinuities within a single step, like with a drive cycle
  2. To set the data collection frequency (as in the period experiment kwarg)
  3. To force better solver convergence by setting a small dt (i.e., taking smaller time steps)

These three reasons for setting a dt are all valid, but forcing the simulation to stop at every one of these times can drastically degrade the adaptive time stepping and overall performance. For example, consider a full C/10 discharge with

  • a. t_eval = [0, 36000] (i.e., no stops)
  • b. t_eval = np.arange(0, 36060, 60) (pybamm default)
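For concreteness, the two grids can be constructed as below (a minimal sketch; only the array shapes matter here, not any particular PyBaMM API):

```python
import numpy as np

# (a) no intermediate stops: just the start and end of the 10 h (36000 s) discharge
t_eval_a = np.array([0.0, 36000.0])

# (b) PyBaMM's default-style 60 s grid, forcing a solver stop every minute
t_eval_b = np.arange(0, 36060, 60)

print(len(t_eval_a), len(t_eval_b))  # 2 vs 601 forced stop times
```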

If we compare the solver stats,

  • Number of steps: a. 165 vs b. 715
  • Number of residual calls: a. 288 vs b. 823
  • Number of linear solver setups: a. 28 vs b. 91
  • Number of nonlinear iterations: a. 286 vs b. 821
  • DAE integration time: a. 25.5 ms vs b. 97.1 ms

Even though we solve the same system, the dense t_eval b. is nearly 4x slower! To address these issues, I propose the following changes that align the time-stepping options with Julia's differential equation solvers (see Output Control):

  • (Breaking) By default, save every t and y determined by IDA's adaptive time stepping algorithm. This eliminates issues like the one above with the C/10 discharge. We can accurately post-interpolate the solution onto specific times with IDA's Hermite interpolator. This is a huge benefit for parameter estimation because we will always obtain the full solution accuracy regardless of the data collection frequency.
  • (Non-breaking) Change the description of t_eval to match Julia's tstops: "Denotes extra times that the time-stepping algorithm must step to. This should be used to help the solver deal with discontinuities and singularities, since stepping exactly at the time of the discontinuity will improve accuracy." With this option, drive cycles in 1. still work.
  • (Non-breaking) Add a solver option that matches Julia's saveat: "Denotes specific times to save the solution at, during the solving phase. The solver will save at each of the timepoints in this array in the most efficient manner available to the solver." This addresses the data collection frequency in 2. without negatively affecting performance since it interpolates the solution with IDAGetDky().
  • (Non-breaking) Discourage modifying t_eval for performance issues and encourage modifying appropriate solver options (rtol, atol, dt_max, etc.), which addresses 3.
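To illustrate why post-interpolation loses no accuracy, here is a minimal numpy sketch of the cubic Hermite interpolation that IDA's interpolant is based on (the helper name and the toy trajectory are made up for this example). Given the solver's values and derivatives at two adaptive steps, the interpolant reproduces any cubic exactly, so saveat-style output can be filled in after the solve:

```python
import numpy as np

def hermite_interp(t, t0, t1, y0, y1, d0, d1):
    """Cubic Hermite interpolation of y between two solver steps,
    using the values (y0, y1) and derivatives (d0, d1) at (t0, t1)."""
    h = t1 - t0
    s = (t - t0) / h
    h00 = 2 * s**3 - 3 * s**2 + 1
    h10 = s**3 - 2 * s**2 + s
    h01 = -2 * s**3 + 3 * s**2
    h11 = s**3 - s**2
    return h00 * y0 + h * h10 * d0 + h01 * y1 + h * h11 * d1

# Toy trajectory y(t) = t**3 (so ydot = 3 t**2), known only at the
# step endpoints t = 0 and t = 2, then interpolated at t = 1.
y_mid = hermite_interp(1.0, 0.0, 2.0, 0.0**3, 2.0**3, 0.0, 3 * 2.0**2)
print(y_mid)  # -> 1.0, matching 1.0**3 exactly
```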
@martinjrobins (Contributor, Author) commented:

> By default, save every t and y determined by IDA's adaptive time stepping algorithm. This eliminates issues like the one above with the C/10 discharge.

So this y would be the state matrix stored in the pybamm.Solution class?

> We can accurately post-interpolate the solution onto specific times with IDA's Hermite interpolator

Are you thinking this would happen in the pybamm.ProcessedVariable class? You'd need to expose the SUNDIALS interpolator via pybind11 then.

@MarcBerliner (Member) commented Aug 2, 2024:

> So this y would be the state matrix stored in the pybamm.Solution class?

Yes.

> Are you thinking this would happen in the pybamm.ProcessedVariable class? You'd need to expose the SUNDIALS interpolator via pybind11 then.

In short, yes. I'm still looking into the details, but this is a bit complicated because we may need access to ida_mem in Python if we cannot extract IDA's interpolant. Currently, we free ida_mem at the end of each solve, so we cannot do any post-interpolation with IDAGetDky(ida_mem, ...). My thinking is to add a kwarg that lets us retain ida_mem without freeing it, and also (for general convenience) add Python bindings to all the IDAGet*(), IDASet*(), etc. functions.
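A rough Python sketch of how a retained solver-memory object might look from the ProcessedVariable side (all names here are hypothetical, and scipy's CubicHermiteSpline stands in for IDA's actual interpolant, which would be reached through IDAGetDky() via pybind11):

```python
import numpy as np
from scipy.interpolate import CubicHermiteSpline

class RetainedSolverMemory:
    """Hypothetical stand-in for a kept-alive ida_mem handle."""

    def __init__(self, t, y, ydot):
        # In the real proposal this would wrap SUNDIALS state; here we
        # rebuild a Hermite interpolant from the solver's (t, y, ydot).
        self._interp = CubicHermiteSpline(t, y, ydot)

    def get_dky(self, t, k=0):
        # Mirrors IDAGetDky's (t, k) arguments: the k-th derivative of
        # the interpolating polynomial evaluated at time t.
        return self._interp(t, nu=k)

# Adaptive steps of a toy solution y(t) = t**2 (ydot = 2 t)
t_steps = np.array([0.0, 0.5, 1.5, 3.0])
mem = RetainedSolverMemory(t_steps, t_steps**2, 2 * t_steps)
print(mem.get_dky(2.0))       # -> 4.0: quadratics are reproduced exactly
print(mem.get_dky(2.0, k=1))  # -> 4.0: the derivative 2 t at t = 2
```

The point of the design is lifetime management: as long as the handle outlives the solve, post-interpolation onto arbitrary times stays cheap and accurate.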
