Description
I am running an experimental test: a big-data regression held in memory on a machine with relatively little RAM (32 GB) but a large SSD swap partition (300 GB).
This may well be a bad idea, but I would like to try it and collect actual benchmarks.
The problem is that Node doesn't seem to be able to use the swap, even though swapon -s confirms that 300 GB of swap is available. I ran node as per your recommendation:
node --expose-gc --max_semi_space_size=4480 --max_old_space_size=286720 --max_executable_size=143360 src/main.js
The system has enough physical and virtual RAM:

I created a simple memtest.js script to see if node will start using swap:
const targetBytes = 500 * 1024 * 1024 * 1024; // 500 GB
const arrLength = Math.pow(2, 32) - 1; // max JS array length
const arrLength2 = 20000;
let arr = new Array(arrLength);
for (let i = 0; i < arrLength; i++) {
  const arr2 = new Array(arrLength2);
  for (let i2 = 0; i2 < arrLength2; i2++) {
    arr2[i2] = Math.random();
  }
  arr[i] = arr2;
  if (i % 100 === 0) {
    console.log(process.memoryUsage().heapUsed / 1024 / 1024);
  }
}
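For comparison (an assumption on my part: that the limit being hit is V8's managed heap rather than system memory), allocations made outside the V8 heap, such as Buffers, are bounded only by the OS, which is free to page them to swap. A minimal sketch, with an arbitrary chunk size:

```javascript
// Sketch: allocate memory outside the V8 heap via Buffers and watch
// rss grow while heapUsed stays flat. Chunk size and count are arbitrary.
const chunks = [];
const chunkBytes = 64 * 1024 * 1024; // 64 MB per allocation

for (let i = 0; i < 4; i++) {
  const buf = Buffer.allocUnsafe(chunkBytes);
  buf.fill(0); // touch the pages so they are actually committed
  chunks.push(buf);
  const usage = process.memoryUsage();
  console.log('rss (MB):', Math.round(usage.rss / 1024 / 1024),
    'heapUsed (MB):', Math.round(usage.heapUsed / 1024 / 1024));
}
```

If Buffer allocations can grow past physical RAM while the array-based script cannot, that would point at a V8 heap limit rather than the OS refusing to swap.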
The result is:
Initially it seems to go well, but once heap usage reaches the amount of physical RAM it throws an out-of-memory exception, for reasons I don't understand:
Your tests specifically exercise virtual memory. Do you have any clue why it still throws an OOM at the physical memory limit? Why doesn't it start using more swap?