## Problem
When processing large ALMA CO data cubes (>20GB, ~4096×4096×500 channels), several functions cause memory errors or system crashes by attempting to load entire cubes into memory at once.
Running CASA 6.6.1.17 on a system with 250GB RAM.
## Affected Functions
- `casaMosaicRoutines.generate_weight_file()` - uses `myia.getchunk()` to load the entire cube
- `casaMosaicRoutines.mosaic_aligned_data()` - uses `myia.replacemaskedpixels()`; `immath()` LEL expressions also fail on large image lists
- `scConvolution.coverage_collapser()` - `cube.sum` fails on large cubes without `allow_huge_operations = True`
- `scConvolution.smooth_cube()` - `cube.convolve_to()` loads the entire cube for convolution
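As a hedged illustration of the channel-wise pattern that would patch the collapse/weight cases above: the sketch below uses plain NumPy so it is self-contained (`channelwise_sum` is a hypothetical helper, and the in-memory array stands in for per-channel `ia.getchunk()` reads; a real patch would slice one channel per call via the `blc`/`trc` arguments rather than index an array).

```python
import numpy as np

def channelwise_sum(cube):
    """Collapse a (nchan, ny, nx) cube along the spectral axis one
    channel at a time, so peak memory is one 2-D plane, not the cube."""
    total = np.zeros(cube.shape[1:], dtype=np.float64)
    for chan in range(cube.shape[0]):
        # In CASA this plane would come from ia.getchunk(blc, trc)
        # restricted to a single channel instead of the whole cube.
        total += cube[chan]
    return total

# Tiny stand-in cube; a real ALMA cube would be ~4096 x 4096 x 500.
cube = np.arange(2 * 3 * 4, dtype=float).reshape(2, 3, 4)
collapsed = channelwise_sum(cube)
assert np.allclose(collapsed, cube.sum(axis=0))
```

The same loop shape applies to the weight-file and convolution cases: anything that currently does one whole-cube `getchunk()` or reduction can iterate over the spectral axis instead.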
This list is likely non-exhaustive; it is just what broke my own processing run. A longer-term fix may be a move to SpectralCube + dask; in the meantime, all of these are patchable with channel-wise processing.
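For the `replacemaskedpixels()` case, the channel-wise patch is a read-modify-write loop. A minimal sketch, again with NumPy standing in for the CASA image tool (`replace_masked_channelwise` is a hypothetical helper; in a real patch the per-plane read and write would be `ia.getchunk()` and `ia.putchunk()` on a single channel, following CASA's convention that mask `True` means a good pixel):

```python
import numpy as np

def replace_masked_channelwise(cube, mask, fill=0.0):
    """Channel-by-channel analogue of ia.replacemaskedpixels():
    overwrite pixels whose mask is False, one 2-D plane at a time."""
    for chan in range(cube.shape[0]):
        plane = cube[chan]          # read one plane (getchunk-like)
        plane[~mask[chan]] = fill   # replace pixels the mask rejects
        cube[chan] = plane          # write it back (putchunk-like)
    return cube

cube = np.ones((2, 3, 4))
mask = np.zeros((2, 3, 4), dtype=bool)
mask[0, 0, 0] = True                # one good pixel, rest masked out
out = replace_masked_channelwise(cube, mask, fill=-1.0)
assert out[0, 0, 0] == 1.0          # good pixel preserved
assert out[0, 0, 1] == -1.0         # masked pixel replaced
```

Peak memory stays at one plane per iteration regardless of channel count, which is what makes this viable on cubes larger than RAM.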