This repository was archived by the owner on Dec 8, 2025. It is now read-only.

Reading or editing a file causes a large memory spike #56

@neubig

Description


When reading in a large file, we see a memory spike.

This function would probably need to be generalized so that it reads only the parts of the file that are necessary for the particular command:

def read_file(self, path: Path) -> str:
    """
    Read the content of a file from a given path; raise a ToolError if an error occurs.
    """
    try:
        return path.read_text()
    except Exception as e:
        raise ToolError(f'Ran into {e} while trying to read {path}') from None
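One possible direction (a sketch only, not a decided design): a hypothetical `read_file_range` helper that streams the file line by line, so peak memory is bounded by the requested range rather than the whole file. The function name, the line-range signature, and the local `ToolError` stand-in below are all illustrative assumptions.

```python
from itertools import islice
from pathlib import Path


class ToolError(Exception):
    """Stand-in for the project's ToolError type (assumption)."""


def read_file_range(path: Path, start_line: int, end_line: int) -> str:
    """Read only lines [start_line, end_line) of a file.

    Iterating the open file object yields one line at a time, so only
    the requested slice is held in memory, unlike path.read_text(),
    which loads the entire file.
    """
    try:
        with path.open('r') as f:
            return ''.join(islice(f, start_line, end_line))
    except Exception as e:
        raise ToolError(f'Ran into {e} while trying to read {path}') from None
```

An editing command that touches a small region could then read just that region instead of materializing the whole file as one string.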

We should:

  1. write tests that measure memory usage when reading or editing large files, and confirm that they currently fail (e.g. reading a 10MB file should be seen to allocate over 10MB of memory)
  2. modify the existing implementation so that editing a small part of a larger file does not require loading the whole file into memory
  3. run the tests and confirm they pass
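Step 1 could be prototyped with the standard library's `tracemalloc`, which reports peak allocation during a call. The `peak_allocation` helper, the 10MB file layout, and the thresholds below are illustrative assumptions, not the project's actual test suite:

```python
import tempfile
import tracemalloc
from pathlib import Path


def peak_allocation(fn) -> int:
    """Peak bytes allocated (as seen by tracemalloc) while fn runs."""
    tracemalloc.start()
    try:
        fn()
        return tracemalloc.get_traced_memory()[1]
    finally:
        tracemalloc.stop()


with tempfile.TemporaryDirectory() as d:
    big = Path(d) / 'big.txt'
    # ~10MB file: 100,000 lines of 100 bytes each
    big.write_text(('x' * 99 + '\n') * 100_000)

    # Loading the whole file allocates at least the file's size.
    full_peak = peak_allocation(lambda: big.read_text())

    # Streaming line by line keeps the peak near the buffer size.
    def streamed_scan():
        with big.open() as f:
            for _ in f:
                pass

    stream_peak = peak_allocation(streamed_scan)
```

A test along these lines would fail against the current `read_text()`-based implementation (its peak matches the file size) and pass once reads are streamed or ranged.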
