Hey, I'm currently running a new script for SpatialCellChat (CellChat V3) on a large 10x Spatial Transcriptomics HD 3’ dataset and am hitting issues with filterProbability(). My chat.rds object is ~24.9 GB, and the function either hangs indefinitely or crashes, even after increasing memory limits.
Code used:
library(CellChat)  # filterProbability()
library(future)    # plan()

options(future.globals.maxSize = 220 * 1024^3)  # allow ~220 GB of exported globals
plan(multisession, workers = 8)
chat <- filterProbability(chat)
#> >>> Filter out non-significant communication with a probability quantile being 0.95 for each L-R pair...
#> ℹ Future strategy in use: parallel with 2 workers
#> ✔ Filtering is done.
(Note: the log reports only 2 workers even though I requested 8 via plan(multisession, workers = 8).)
chat <- filterCommunication(
  chat,
  min.cells = NULL,
  min.links = 10,
  min.cells.sr = 10
)
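For context, here is a rough back-of-envelope on why the multisession setup may be hitting the ceiling. This is a sketch under the assumption that future serializes a copy of the exported globals (including the chat object) for each multisession worker; I have not verified how SpatialCellChat exports its data internally.

# Assumption: each of the 8 multisession workers receives its own
# serialized copy of the ~24.9 GB chat object.
obj_gb  <- 24.9   # size of chat.rds on disk
workers <- 8
obj_gb * workers  # ~199 GB in flight, close to the 220 GB maxSize cap

If that assumption holds, plan(sequential) (or fewer workers) would avoid the repeated serialization at the cost of parallelism.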
Is there a recommended strategy for large datasets?
Thanks!