
manage memory demand #24

@edwardlavender

Description


It looks as if we need roughly this amount of memory:

2*c*length(tsave) +       # for the filter and smoother
2*c*max(diff(tsave)) +    # reconstructed steps in the smoother
k +                       # other temporary arrays
c*m                       # distances to receivers

where c and k are constants and m is the number of receivers.
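A minimal sketch of this estimate in Python (the pseudocode above is R-flavoured; `estimate_memory` and its arguments are illustrative names, and `c`/`k` are the unspecified constants from the formula):

```python
def estimate_memory(tsave, m, c, k):
    """Rough memory estimate following the formula above.

    tsave : sorted time steps at which states are saved
    m     : number of receivers
    c, k  : unspecified constants (e.g. bytes per particle-timestep
            and a fixed overhead, respectively)
    """
    # max(diff(tsave)): the largest gap between saved time steps,
    # which bounds the number of reconstructed steps in the smoother
    max_gap = max(t2 - t1 for t1, t2 in zip(tsave, tsave[1:]))
    return (2 * c * len(tsave)  # filter and smoother outputs
            + 2 * c * max_gap   # reconstructed steps in the smoother
            + k                 # other temporary arrays
            + c * m)            # distances to receivers
```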

What should we do if this is too large?

  • optimise tsave
  • run in leaps
  • use unified memory?
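For the "run in leaps" option, one possible shape (a hypothetical helper, not existing code) is to split `tsave` into overlapping chunks, so each leap can be re-initialised from the final saved states of the previous one:

```python
def split_into_leaps(tsave, leap_size):
    """Split tsave into chunks of at most leap_size intervals.

    Consecutive chunks share one boundary time step, so a filter/smoother
    run over leap i+1 can start from the states saved at the end of leap i.
    """
    leaps = []
    i = 0
    while i < len(tsave) - 1:
        j = min(i + leap_size, len(tsave) - 1)
        leaps.append(tsave[i:j + 1])  # inclusive of the shared boundary
        i = j
    return leaps
```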

Remember:
hobs < steps < jump < leap

Labels: enhancement (New feature or request)
