Garbage collecting processed data. #21

@M2-

Description

Hello,

Currently I'm using your software to push roughly 400,000 integers through 9 different Excerpt gateways, which works out to about 45,000 nodes per excerpt. The system works as a crossfire between read and write for two different applications: one pushes the data, while the other waits for data to arrive and extracts it from the excerpt. The problem is that, with so much data being pushed and potentially queued, the Excerpt heap gets quite large and needs to be cleared of everything already consumed by the reading side of the system (data that will never be read or used again). So once data has been read from the excerpt gateway, how does it get deleted, or garbage collected, without also clearing data that is still queued? I can't seem to find such a method.
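For reference, here is a minimal sketch of the read-and-discard behaviour I'm after, using a plain `ArrayBlockingQueue` from the JDK as a stand-in for one "excerpt gateway" lane (this is only an analogy, not your library's API): with a queue, `take()` both reads and removes an element, so consumed data becomes eligible for garbage collection immediately, while an append-only excerpt keeps read entries on the heap.

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

public class ReadDiscardDemo {
    public static void main(String[] args) throws InterruptedException {
        // Bounded queue standing in for one hypothetical "excerpt gateway" lane.
        BlockingQueue<Integer> lane = new ArrayBlockingQueue<>(1024);

        // Writer side: push some integers.
        for (int i = 0; i < 100; i++) {
            lane.put(i);
        }

        // Reader side: take() both reads AND removes each element, so
        // consumed data is immediately unreachable and GC-eligible --
        // unlike an append-only log, where already-read entries remain
        // on the heap until the whole structure is released.
        long sum = 0;
        while (!lane.isEmpty()) {
            sum += lane.take();
        }

        System.out.println("consumed sum = " + sum + ", remaining = " + lane.size());
    }
}
```

Running this prints `consumed sum = 4950, remaining = 0` (the sum of 0..99), showing nothing consumed is retained.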

Thank you for your time.
