It looks as if we need roughly this amount of memory:
    2*c*length(tsave)       # for the filter and smoother
  + 2*c*max(diff(tsave))    # reconstructed steps in the smoother
  + k                       # other temporary arrays
  + c*m                     # distances of the receivers
where c and k are constants and m is the number of receivers.
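As a sanity check, the estimate above can be sketched as a small function. This is a hypothetical illustration: the names `estimate_memory`, the constants `c` and `k`, and the example values are all assumptions, not taken from the code base.

```python
import numpy as np

def estimate_memory(tsave, c, k, m):
    """Rough memory estimate, term by term:
       2*c*len(tsave)          filter and smoother snapshots
     + 2*c*max(diff(tsave))    reconstructed steps in the smoother
     + k                       other temporary arrays
     + c*m                     distances of the m receivers
    """
    tsave = np.asarray(tsave)
    return (2 * c * len(tsave)
            + 2 * c * int(np.max(np.diff(tsave)))
            + k
            + c * m)

# Example with made-up numbers: 11 save times spaced 10 steps apart.
print(estimate_memory(tsave=range(0, 101, 10), c=8, k=64, m=5))  # prints 440
```

The `max(diff(tsave))` term is what makes a poorly chosen `tsave` expensive: one large gap between save times dominates the reconstruction cost.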
what to do if this is too large?
- optimise tsave
- run in leaps
- use unified memory?
remember:
hobs < steps < jump < leap
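One way to read "run in leaps" is to process tsave in chunks so that only one leap's worth of snapshots is resident at a time. A minimal sketch, assuming a chunked driver loop; `run_in_leaps` and `process_chunk` are hypothetical names, not functions from the code base.

```python
def run_in_leaps(tsave, leap, process_chunk):
    """Partition the save times into consecutive leaps and hand each
    chunk to a callback, so memory scales with the leap size rather
    than with len(tsave)."""
    for start in range(0, len(tsave), leap):
        chunk = tsave[start:start + leap]
        process_chunk(chunk)  # filter/smoother work restricted to this leap

# Example: collect the chunks to see the partitioning.
chunks = []
run_in_leaps(list(range(10)), leap=4, process_chunk=chunks.append)
print(chunks)  # prints [[0, 1, 2, 3], [4, 5, 6, 7], [8, 9]]
```

This trades memory for recomputation at the leap boundaries, consistent with the ordering hobs < steps < jump < leap above.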