I've been using Eximia and have been very pleased with its performance and simplicity.
However, I'd like to use Eximia to operate on large documents in a memory-constrained environment (AWS Lambda).
Parsing appears to eagerly process the entire XML input, which consumes a lot of memory and places a hard limit on the size of input that can be handled. For example, loading a 29MiB input document causes my Lambda to report a memory usage of 780MiB.
Would it be possible to have an option to consume the stream of XML tokens lazily, say via a lazy seq?
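To illustrate the kind of lazy consumption being asked for: a streaming parser yields tokens as it advances through the input instead of materializing the whole tree up front. This sketch uses Python's standard-library `iterparse` purely as an analogy (Eximia is a Clojure library, and the element names here are made up), not as a proposal for the actual API:

```python
import io
import xml.etree.ElementTree as ET

# Small in-memory document standing in for a large input file.
doc = "<root>" + "".join(f"<item id='{i}'/>" for i in range(1000)) + "</root>"

count = 0
# iterparse yields (event, element) pairs lazily as the parser advances,
# so the caller can process and discard elements without holding the
# whole document in memory at once.
for event, elem in ET.iterparse(io.StringIO(doc), events=("end",)):
    if elem.tag == "item":
        count += 1
        elem.clear()  # release the element's children/attributes once handled

print(count)  # 1000
```

A lazy seq of XML tokens in Eximia could expose the same pattern: the consumer pulls tokens one at a time and memory use stays proportional to what is retained, not to the input size.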