Exiting solver based on timeout #1164
Replies: 2 comments
-
Hi @will-gerard, Crocoddyl solvers don't use timings to interrupt the computation. However, this is something that I could implement shortly. In the meantime, I am keen to hear more about your requirements, since there are a few design choices to be made when introducing this feature. For instance, we could check timings once per loop, twice (after the backward and forward passes), or even more often. There are also different choices for measuring time: real, user, system, or, hopefully not, wall clock, with different precisions. I am also particularly concerned about overhead, and honestly, I am not yet fully clear on this topic.
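The once-per-loop timing check discussed above could be sketched roughly like this. This is only an illustration, not Crocoddyl's actual implementation: `step` stands in for one backward+forward pass, and all names here are hypothetical.

```python
import time


def solve_with_timeout(step, budget_s, max_iters=100):
    """Run solver iterations until convergence, the iteration cap,
    or a monotonic-time budget is exhausted.

    `step` is a stand-in for one backward+forward pass; it returns
    True when the solver's own stopping criterion is satisfied.
    """
    start = time.monotonic()  # monotonic clock: immune to wall-clock jumps
    for it in range(max_iters):
        if step():
            return it + 1, "converged"
        # Check the time budget once per loop, as discussed above.
        if time.monotonic() - start > budget_s:
            return it + 1, "timeout"
    return max_iters, "max_iters"
```

Checking only once per loop keeps the overhead to a single clock read per iteration, which is negligible next to the cost of the backward and forward passes.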
-
Thank you for your response. What we would like to do is use the DDP solver within an MPC loop operating at a particular control frequency, something like 500 Hz or 1 kHz. So I'd like to run the solver for a fixed length of time, say 2 ms, and at that point take the last trajectory returned by the solver and use it for the next MPC step. I can approximate this, I think, by running a number of solves to get an average iteration time t and capping the maximum iterations at a little under 2 ms / t, but a timeout on the solver itself would be more direct. Checking the time once per loop should be sufficient, I would think, but checking in a couple of places, such as after the forward and backward passes as you say, would also work; I don't have a specific requirement here. In terms of the type of timing, I think we would want monotonic time rather than wall-clock time, and probably system time rather than user time, to account for all operations on the processor rather than just this individual program, so the simulation is as realistic as possible. Basically the sort of time returned by clock_gettime(CLOCK_MONOTONIC) on Linux, which is what we use elsewhere today anyway. You know better than I do, I'm sure, so what do you think; does this sound reasonable?
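The interim workaround described above (calibrate an average iteration time, then cap the iteration count to fit the control period) might look like the following. Function names are illustrative; `run_iteration` stands in for one representative solver iteration on your problem.

```python
import time


def calibrate_max_iters(run_iteration, budget_s, n_samples=50, safety=0.8):
    """Estimate a per-iteration time and derive an iteration cap.

    Runs `run_iteration` n_samples times, averages the elapsed
    monotonic time, and returns the largest iteration count that
    fits within `safety * budget_s` (the safety margin absorbs
    iteration-to-iteration variance).
    """
    start = time.monotonic()
    for _ in range(n_samples):
        run_iteration()
    avg = (time.monotonic() - start) / n_samples
    return max(1, int((budget_s * safety) / avg))
```

The resulting cap would then be passed as the `maxiter` argument to the solver's `solve` call. The obvious weakness, as noted above, is that iteration times vary with the problem data, so the cap is only as good as the calibration run.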
-
We are looking to limit the length of time the DDP solver can run for. I see that we can change the maximum number of iterations from the default of 100 by passing a parameter to the solve function, and, if I understand correctly, we can also change the exit tolerance via set_th_stop. There does not appear to be any setting that results in a timeout for the solver. At first I thought doing something with stoppingCriteria would do it, but I don't see any way to change that behaviour short of extending it with a new solver class. I just wanted to double-check that I am not missing anything; is this accurate?
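Absent a built-in timeout, one user-side pattern is to call solve repeatedly with maxiter=1, warm-starting each call from the previous trajectory and checking a monotonic deadline between calls. This is a sketch under that assumption; StubSolver below is a minimal stand-in, not crocoddyl.SolverDDP, and its interface is hypothetical.

```python
import time


class StubSolver:
    """Stand-in for a DDP solver; only the pieces the loop below uses."""

    def __init__(self):
        self.xs, self.us = [0.0], [0.0]

    def solve(self, init_xs, init_us, maxiter):
        # A real solver would run up to `maxiter` iterations and
        # update xs/us; here we just keep the warm start.
        self.xs, self.us = list(init_xs), list(init_us)
        return False  # not converged


def solve_with_deadline(solver, budget_s):
    """Call solve(maxiter=1) in a loop, warm-starting each call,
    until the monotonic deadline expires or the solver converges.
    Returns the last trajectory and the number of calls made."""
    deadline = time.monotonic() + budget_s
    n_calls = 0
    while time.monotonic() < deadline:
        done = solver.solve(solver.xs, solver.us, maxiter=1)
        n_calls += 1
        if done:
            break
    return solver.xs, solver.us, n_calls
```

The trade-off is that each solve call carries some fixed setup overhead, so this is slightly slower than a timeout inside the solver's own loop, but it needs no changes to the library.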