
Vector memory limit reached on Mac M1 #293

@metalichen

Description

Hi there!

I ran into a problem running sleuth on a MacBook Air with an M1 chip and 16 GB of RAM. When I try to read in kallisto data, I get an error:

ec_so <- sleuth_prep(kal_dirs,num_cores=5,extra_bootstrap_summary = F,filter_target_id = target_id,transformation_function = function(x) log2(x + 0.1))
reading in kallisto results
dropping unused factor levels
.Error: vector memory limit of 16.0 Gb reached, see mem.maxVSize()

I tried increasing the memory limit with mem.maxVSize(), but even when I set it to a ludicrous 400 GB I still get the same error: .Error: vector memory limit of 400.0 Gb reached, see mem.maxVSize(). If I set the limit to infinity, I finally get different behaviour: R crashes with caught segfault *** address 0x100000cfeedfaef, cause 'invalid permissions'.
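(For anyone hitting the same message: besides the in-session mem.maxVSize() call, R's own Memory-limits documentation says this cap can also be set at startup via the R_MAX_VSIZE environment variable, e.g. in ~/.Renviron. Whether that avoids the segfault seen here is untested; a minimal sketch, with the 100Gb value chosen arbitrarily:)

```shell
# Set the vector-memory cap in the user .Renviron so it is applied
# when R starts, instead of changing it mid-session with mem.maxVSize().
# The 100Gb figure is an example; adjust to taste.
echo 'R_MAX_VSIZE=100Gb' >> "$HOME/.Renviron"

# Show the resulting line for confirmation.
grep 'R_MAX_VSIZE' "$HOME/.Renviron"
```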

I don't think this is a genuine lack of memory: I get the same error even when I try to load just one kallisto folder, and the same script worked perfectly fine with the same data on my old MacBook Pro with an Intel chip and 8 GB of RAM (which, alas, I no longer have access to).

I tried reinstalling sleuth along with all its dependencies, but got the same error again. I'm running R 4.4.3.

I'd appreciate any help; I'm getting desperate here.

Thanks!
