"Out of memory" error when query compression is not used #6

@ilnamkang

Description

I encountered an "Out of memory" error when analyzing about 10,000 metagenome reads against the 'nr-20140917-cablastx' database using cablastp-xsearch.
(The program was run on an Ubuntu server with 128 GB of memory and 16 cores.)

It seems that the memory error can be avoided by enabling query compression (setting --compress-query="true" on the command line).

Should I use query compression whenever I analyze a large number of metagenome reads with cablastp-xsearch?
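For reference, a minimal invocation sketch with query compression enabled. Only the --compress-query flag, the tool name, and the database name come from the report above; the positional argument order and the query FASTA filename are assumptions (check `cablastp-xsearch --help` for actual usage):

```shell
# Hypothetical invocation sketch: enable query compression to reduce
# memory use when searching many metagenome reads.
# Argument order and the query filename are assumptions.
cablastp-xsearch --compress-query="true" \
    nr-20140917-cablastx \
    metagenome-reads.fasta
```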
